Experiencing Data Latency for Many Data Types - 5/30 - Investigating

Update: Saturday, 5/30/2015 20:11 UTC
We continue to work on restoring system health. We have not seen any errors since 05/30 19:45 UTC, and latency is slowly decreasing for the affected streams. The root cause is not yet fully confirmed, and the investigation continues so that we can understand the problem and apply a fix accordingly.
• Next Update: Before 6/1 00:00 UTC
-Application Insights Service Delivery Team

Update: Saturday, 5/30/2015 17:50 UTC
Our DevOps team continues to investigate issues within Application Insights. The root cause is not fully understood at this time. Some customers continue to experience latency for telemetry data. We are working to establish the start time of the issue; initial findings indicate that the problem began at approximately 05/30 15:36 UTC. We currently have no estimate for resolution.
• Next Update: Before 20:00 UTC
-Application Insights Service Delivery Team

Posted by on 30 May 2015 | 12:54 pm

Creating a “Real World” Database in Azure for Advanced Analytics Exploration: Part 1

One of the major issues that people who want to get started with "Data Science" or "Big Data Analytics" face is finding datasets that are both compelling and diverse enough to provide a useful playground for exploration. In my previous life, I blogged about this subject and wrote instructions on how to create a large SQL Server database from data available through the Home Mortgage Disclosure Act (HMDA). You can read that entry here, although I'm not 100% positive that the steps laid out in that blog are still valid in terms of data locations and availability.

The purpose of this series of posts is to provide instructions on how to create a SQL database in Microsoft Azure that can be used as a data source for advanced analytics. I'll be posting future entries on this blog that refer back to this database. To be clear, while I'm focusing on creating the database in Microsoft Azure, there is no reason why you couldn't use the same techniques to create the database on a local instance of SQL Server. For those who want to use Azure but don't have a subscription, you can sign up for a 30-day trial here. If all you want to do is host a simple database in Azure, you can do so for around US $5 per month, up to 2GB in size. For a larger database, you'll want to look at the standard tier, which starts at US $15 per month for up to 250GB. If you follow the instructions that I lay out in this post, you'll end up with a database of about 6GB.

For this first article in the series, I will discuss setting up the database environment in Azure and downloading the initial data, as well as the initial loading of the data into a temporary table in the database. Additional articles in this series will focus on turning the temporary table into a useful data structure and on analyzing the data.

The Use Case

For this series, the use case I'll be working with is crime, based on data from the city of Chicago.
The reason I chose this particular dataset is that it's a subject most people can relate to in one way or another, and it lends itself very well to some advanced analytics capabilities. The City of Chicago maintains a data portal where it publishes crime detail on a regular basis. This is significant because the level of detail published is very granular – down to the individual police report. When combined with other data, such as NOAA weather data, a number of interesting analytics possibilities can be realized. The solution developed throughout this series will be surfaced using Microsoft Power BI and will result in a dashboard that looks like this:

Configuring the Database Environment

The first step in building the database is to ensure that you're properly set up in the Microsoft Azure portal. Follow the instructions here to sign up for an account (a trial account will work just fine for this, but remember that it's only good for 30 days). Also remember that the instructions I'm providing here will work with a local SQL Server instance as well; you'll just have to modify how and where you load your data appropriately. Once you are logged in to the Azure environment, you should have a portal that looks something like this (likely with fewer pre-configured resources, however):

To create a new database, choose SQL Databases from the left-side navigation, select New at the bottom of the page, then select Data Services / SQL Database / Quick Create and fill out the form, choosing a new SQL database server along with a region close to you. Once the form is filled out appropriately, choose Create SQL Database, which will submit everything to Azure and begin provisioning your new database. You will see a status message appear, and it will take a few minutes to complete the process.
If you receive a message about enabling auditing on the new database, you can ignore it unless you want to experiment with auditing throughout this process. Once the database is created, you can click the details link to view the status of the job:

Managing the Database

Now that the database has been created, you'll want some tools to manage it. The good news is that the same tools that manage local SQL Server instances work just fine with Azure SQL Database. If you don't already have a local instance of SQL Server Management Studio to work with, you can download a free version here (click the link to download Express Edition, then select the appropriate 32- or 64-bit Management Studio option and follow the instructions to install it on your local machine). Make sure that you install all of the management tools, as you will be using the Import/Export Wizard to populate the initial table in the database.

The first step to enable managing your new database is to add your client IP address to the firewall rules. From the Azure Management Portal, choose SQL Databases on the left side, select the Servers tab, then select the server that you just created. Select the Configure tab, and then select Add to the Allowed IP Addresses. Choose Yes next to Windows Azure Services (you will need this option later) and then choose Save at the bottom of the screen. This adds your local IP address to the firewall rules so that your local machine can connect to the Azure SQL Database server.

Once the firewall rules are saved, you'll use SQL Server Management Studio (SSMS) to manage the database and server. To connect, start SQL Server Management Studio and, when prompted, log in to the new database server that you created above, using SQL Server Authentication and the username and password that you provided when you initially provisioned the database.
(Notice that you will use the fully-qualified name of the database server, which is <servername_you_provided>.database.windows.net.) Once connected, you should see your server in the Object Explorer window. You can expand Databases to see the database that you provisioned. (Note: my personal preference is to automatically open a new query window when I start SSMS. The default is to not open a new query window. If you want to configure this option, it is available under Tools/Startup.) Once you have successfully connected to the database, you are ready to import the data into the initial staging area.

Downloading the Crime Data

Downloading the data from the City of Chicago is a very easy process. For this initial load, we will download the entire dataset and load it in a single pass. Since the data in the portal is updated on a regular basis, later entries in this series will explain how to keep your data in sync with the portal. Using your browser, connect to the Chicago Data Portal (https://data.cityofchicago.org/ ) and select the Crimes 2001-Present option from the middle pane of the page. This opens the Crimes dataset in the online explorer (which is very nicely done and allows for a wide range of analysis directly from the portal). In the upper-right corner of the portal, choose the Export option and then choose CSV for Excel. This will eventually open a Save File dialog (it can take a few minutes to generate the export file). Choose Save, and the file will begin downloading. This will take several minutes, depending on your Internet connection speed. Once the file is downloaded, you can import the data into your Azure SQL Database.

Importing the Data

Note: the process that we will use to import the data is very simplistic. There are more efficient ways to accomplish this task, but I wanted to use a simple and easy approach to load the initial data. To load the initial data, start the SQL Server 2014 Import Export Wizard.
(It was installed along with the SSMS tools above and can be found in your Start menu. Make sure you choose the appropriate version of the tool – 64-bit or 32-bit, depending on your operating system.) When the wizard starts, click Next and then choose Flat File Source. Click the Browse button and select the file that you downloaded in the previous step (in order to see the file in the window, you will need to select the CSV files option next to the File textbox) and then choose Open. In the Text qualifier textbox, enter a double quote ("). Ensure that the options are configured as in the image above. Choose the Columns option on the left side to ensure that the fields are lining up properly. Once you are sure that the columns are properly lined up, select the Advanced option on the left side, choose the Description column, and change its data type to a text stream (DT_TEXT). Then choose the Next button and select the SQL Server Native Client destination. Enter your server name and login information, then select the ChicagoCrimes database in the drop-down. Click the Next button and change the destination table name to [dbo].[Crimes_New]. Choose the Edit Mappings button, select the Edit SQL button, and then add PRIMARY KEY CLUSTERED to the [ID] column as shown below. (Azure SQL Database likes to see clustered indexes on tables.) Choose the OK button, then the Next button, and then choose Finish twice to start the import operation. This operation will run for several minutes, as there are over 5 million rows of data to import. Once the operation is complete, you can switch back to SSMS and verify that the initial table has been created and populated by executing a simple query in the ChicagoCrimes database:

Preparing for the Next Step

In this blog post, we set up a new Azure SQL Database and imported some crime data into a temporary table in the SQL Database.
You will note that we did not manipulate the data in any way, so all of the data types are currently inherited from the text file. In the next article in this series, I'll walk through how to create the permanent table and populate it with the data, converting the data types to appropriate values.
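To make the staging-table pattern concrete, here is a minimal Python sketch using the built-in sqlite3 module as a lightweight stand-in for the Azure SQL Database (the table name follows the post, but the columns here are a simplified, hypothetical subset – the real export has many more – and SQLite does not distinguish clustered indexes, so the PRIMARY KEY CLUSTERED detail is only noted in a comment):

```python
import sqlite3

# In-memory database standing in for the ChicagoCrimes Azure SQL Database.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Staging table: every column arrives as text from the CSV import,
# mirroring the wizard's default flat-file behavior. In Azure SQL the
# ID column would be declared PRIMARY KEY CLUSTERED.
cur.execute("""
    CREATE TABLE Crimes_New (
        ID TEXT PRIMARY KEY,
        Date TEXT,
        PrimaryType TEXT,
        Description TEXT
    )
""")

# A couple of made-up rows standing in for the ~5 million imported records.
rows = [
    ("10001", "05/30/2015 10:00", "THEFT", "POCKET-PICKING"),
    ("10002", "05/30/2015 11:30", "BATTERY", "SIMPLE"),
]
cur.executemany("INSERT INTO Crimes_New VALUES (?, ?, ?, ?)", rows)

# Verification step: confirm the staging table is populated.
count = cur.execute("SELECT COUNT(*) FROM Crimes_New").fetchone()[0]
print(count)
```

Against the real database, the same `SELECT COUNT(*) FROM [dbo].[Crimes_New]` check in SSMS should report the roughly 5 million imported rows.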

Posted by on 30 May 2015 | 12:14 pm

Operations Management Suite Log Search How To: Part VIII – the IN operator and subsearches

This is the eighth installment of a series that walks through the concepts of the Operations Management Suite (formerly Microsoft Azure Operational Insights) search syntax – while the full documentation and syntax reference is here, these posts are meant to guide your first steps with practical examples. I start very simple and build upon each example, so you can get an understanding of practical use cases for how to use the syntax to extract the insights you need from the data.

In my first post I introduced filtering, querying by keyword or by a field's exact value match, and some Boolean operators. In the second post I built upon the concepts of the first one and introduced some more complex flavors of filters; after those two posts you should know all you need to extract the data set you want. In the third post I introduced the pipeline symbol "|" and how to shape your results with search commands. In the fourth post I introduced our most powerful command – measure – and used it with just the simplest of the statistical functions: count(). In the fifth post I expanded on the measure command and showed the Max() statistical function. In the sixth post I continued with measure's statistical functions and showed how Avg() is useful with performance data, among other things. In the seventh post I covered another of measure's statistical functions – Sum() – and introduced the where command.

I am very excited to be writing about an important addition to our search syntax: we recently enabled the 'IN' operator (along with 'NOT IN'), which allows you to use subsearches. A subsearch is a search that includes another search as an argument. Subsearches are contained in curly brackets within another "primary" or "outer" search. The result of a subsearch (often a list of distinct results) is then used as an argument in its primary search.
You can use subsearches to match subsets of your data that you cannot describe directly in a search expression, but which can be generated from a search. For example, if you're interested in using one search to find all events from computers missing security updates, you need to design a subsearch that first identifies those computers before finding the events belonging to them. We could express "computers currently missing security updates" as follows:

Type:RequiredUpdate UpdateClassification:"Security Updates" TimeGenerated>NOW-25HOURS | measure count() by Computer

Once we have this list, we can use this search as an inner search to feed the list of computers into an outer (primary) search that will look for events for those computers. We do this by enclosing the inner search in curly brackets and feeding its results as possible values for a filter/field in the outer search using the 'IN' operator. The query would look like this:

Type=Event Computer IN {Type:RequiredUpdate UpdateClassification:"Security Updates" TimeGenerated>NOW-25HOURS | measure count() by Computer}

That's it! Also notice the time filter I used in the inner search: since I am counting on the 'System Update Assessment' solution to have a snapshot of all machines every 24 hours, I am making that inner query more lightweight and precise by having it go back in time only one day; the outer search instead still respects the time selection in the user interface, pulling up events from the last 7 days. (Also check my previous blog post about time in search.) Another thing to notice is that, since we are really only using the results of the inner search as a filter value for the outer one, you can still apply commands in the outer search, i.e.
we can still group the above events with another measure command:

Type=Event Computer IN {Type:RequiredUpdate UpdateClassification:"Security Updates" TimeGenerated>NOW-25HOURS | measure count() by Computer} | measure count() by Source

Generally speaking, you want your inner query to execute fast – we have some service-side timeouts for it – and to return a small number of results. If the inner query returns more results, we truncate the result list, which can also cause the outer search to give incorrect results. Another rule is that the inner search currently needs to provide 'aggregated' results – in other words, it must contain a 'measure' command; you cannot currently feed 'raw' results into an outer search. Also, there can be only ONE 'IN' operator (and it must be the last filter in the query), and multiple IN operators cannot be OR'd – this essentially prevents running multiple subsearches: the bottom line is that only one sub/inner search is possible for each outer search.

Even with these limits, this addition enables all kinds of new correlated searches, and allows you to define something similar to 'groups' (of computers, users, or files – whatever the fields in your data contain). Some more examples:

All updates missing from machines where the Automatic Update setting is disabled:
Type=RequiredUpdate Computer IN {Type=UpdateAgent AutomaticUpdateEnabled!=Enabled | measure count() by Computer} | measure count() by KBID

All error events from machines running SQL Server (= where SQL Assessment has run):
Type=Event EventLevelName=error Computer IN {Type=SQLAssessmentRecommendation | measure count() by Computer}

All security events from machines that are Domain Controllers (= where AD Assessment has run):
Type=SecurityEvent Computer IN { Type=ADAssessmentRecommendation | measure count() by Computer }

Which other accounts have logged on to the same computers where account BACONLAND\jochan has logged on?
Type=SecurityEvent EventID=4624 Account!="BACONLAND\\jochan" Computer IN { Type=SecurityEvent EventID=4624 Account="BACONLAND\\jochan" | measure count() by Computer } | measure count() by Account

I am sure you will come up with other interesting search scenarios now. Happy searching!
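Conceptually, the IN/subsearch evaluation can be sketched in a few lines of Python (the event records below are made up for illustration): the inner search's measure command yields a distinct set of computers, and the outer search keeps only events whose Computer value is in that set before applying its own measure command.

```python
from collections import Counter

# Hypothetical inner-search input: required-update records.
required_updates = [
    {"UpdateClassification": "Security Updates", "Computer": "SRV01"},
    {"UpdateClassification": "Security Updates", "Computer": "SRV02"},
    {"UpdateClassification": "Critical Updates", "Computer": "SRV03"},
]

# Hypothetical outer-search input: plain events.
events = [
    {"Type": "Event", "Computer": "SRV01", "Source": "Service Control Manager"},
    {"Type": "Event", "Computer": "SRV02", "Source": "Service Control Manager"},
    {"Type": "Event", "Computer": "SRV03", "Source": "Disk"},
]

# Inner search: "... | measure count() by Computer" effectively yields
# the distinct set of computers missing security updates.
missing = {r["Computer"] for r in required_updates
           if r["UpdateClassification"] == "Security Updates"}

# Outer search: Type=Event Computer IN {inner} | measure count() by Source
matched = [e for e in events if e["Computer"] in missing]
by_source = Counter(e["Source"] for e in matched)
print(dict(by_source))  # SRV03 is filtered out, it is not missing security updates
```

This is only a mental model of the semantics, not how the service executes the query, but it shows why the inner search must return an aggregated (distinct) list: its results are used purely as a membership filter for the outer search.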

Posted by on 30 May 2015 | 8:47 am

What's New with Power BI?

Like Microsoft HDInsight, Microsoft Power BI is no exception to the constant evolution of Azure cloud services, which aims to deliver an ever more relevant and capable service. In this post, we review the announcements concerning the major changes to the Power BI service, starting with those of January 27 and following. In this context we will cover Power BI's ability to connect to many other services, its new tools, and the apps that let you access the service from anywhere. Power BI brings a real transformation to business intelligence (BI) applications.

A Brief Refresher on Power BI

As we have already illustrated in numerous posts on this blog, Microsoft Power BI is a cloud service that works together with Microsoft Excel to provide a complete self-service data analysis solution (Software-as-a-Service obliges ;-)). The new Power BI site, currently in preview, is available here. Sign in with your account or, if you don't have one, simply provide your work email address and start using the environment :-) With Excel on one side for building reports and Power BI on the other for sharing them, you have powerful tooling for working with all your data; we will come back to this when discussing connectivity. You can combine, model, analyze and visualize your data in Excel like never before. In this context, Power BI lets you set up an online gallery for your users, facilitating collaboration and giving you access to your various reports and dashboards from any device. It also lets you get answers and live data visualizations.
If you want to get familiar with Power BI, we recommend following the tutorials and webinars that will help you take your first steps with the environment. In practice, you will use several components of Excel and Power BI:

Power Query for Excel: import your data from various sources, clean it up (apply filters, replace missing values, etc.) and enrich it.
Power Pivot: model and analyze your data.
Power View: explore, visualize and present your data.
Power Map for Excel: plot your data on 3D Bing maps, visualize it across time and space, capture views and share them with other users.
Power BI Sites in Office 365: turn a SharePoint Online site into a more robust and dynamic version for sharing Excel reports with a more visual, BI-oriented look.
Power BI Q&A: get insights from your data directly using an intuitive query language.
Power BI Admin Center: lets IT administrators expose on-premises data through OData feeds, and lets users refresh their Excel workbooks on SharePoint Online with on-premises data.

With these reminders out of the way, it is time to move on to the news this post is about. The new version of Power BI is available in preview, as noted above, and corresponds to the earlier screenshots. You will find an introductory video for this new version here.

A Version Offering New Possibilities

First observation: sign-up is simpler and faster, since it requires only a work email address, as we noted earlier.
You can customize your dashboards, which bring together in a single window both your on-premises data and data hosted in the cloud. You can create charts and other visualizations of your data at will, letting you continuously monitor the state of your business. On this topic, new visualization types have been introduced, such as combo charts, filled maps, gauges, tree maps and funnel charts. Power BI offers a set of built-in connectors that let you connect in just minutes to the most popular SaaS applications you subscribe to. With a simple subscription to one of these services, you can connect from Power BI to Acumatica, GitHub, Google Analytics, Marketo, Microsoft Azure SQL Database, Microsoft Azure SQL Data Warehouse, Microsoft Dynamics CRM, Microsoft Dynamics Marketing, Salesforce, SendGrid, Twilio, Visual Studio Application Insights, Visual Studio Online and Zendesk. More are expected in the coming months, including Inkling Markets, Intuit, Sage and Sumo Logic. As an illustration, we invite you to read this article, which shows how to take advantage of the data in your Marketo account – if you have one – using Power BI. At the same time, Power BI now includes a connector to SQL Server Analysis Services, letting you benefit from a cloud-based solution without having to migrate your data to the cloud. You can now create a secure connection from Power BI to an on-premises server running SQL Server Analysis Services. When you explore your dashboards and reports, Power BI uses the specified credentials to run queries against your on-premises model.
If you want to get started, the post by our colleague Franck Mercier explains how to create a real-time dashboard with Power BI and Azure Stream Analytics.

Power BI Designer, a New Tool

Continuing the drive to make the service easier to use, the Power BI preview introduces a new tool: Power BI Designer. It is a solution for building end-to-end analyses on the Power BI preview. Itself currently in preview, it gives you the ability to quickly connect to all your data (whether Excel sheets, on-premises data, Hadoop datasets or data stored in the Azure cloud), shape it to your needs, visualize it and share your results through Power BI, making them accessible from anywhere, on any device. You can download it here. Being a preview, the tool is continuously improved, with no fewer than 6 updates and improvements last February, 7 in March, 16 in April and 8 in May. Hard to list them all here! It is therefore best to rely on the various links above, which will give you an overview of all these changes.

Apps for Your Phone or Tablet

Also aiming to make the reports and dashboards created in Power BI accessible on the go, Microsoft is expanding its Power BI offering on mobile. The Power BI app for iOS is already available in the Apple App Store, as is Power BI for Windows (Windows 8.1 and Windows 10). They will soon be followed by a similar app for Android. These apps now make it possible to edit your charts in Power BI, explore new data and share with others from a phone or tablet.
Another new feature worth highlighting: the ability to create alerts on your own data so that you are notified when something interesting happens. That wraps up this quick post on Power BI. Given the steady stream of Power BI news, we invite you to read (or re-read) the posts published on the product group's Power BI Blog. Don't forget the MSDN Power BI Forum either, which is there to foster a community of discussion on the topic.

Posted by on 30 May 2015 | 4:22 am

Create your own Tower Defense XII, Adding the Score

Today I am sharing with you another video from my Twitch channel http://www.twitch.tv/hielo777/ in my GameDev Adventures series. This video is the twelfth part of a series that will help you understand the different aspects of a classic Tower Defense game and how to implement them in Construct 2. Here you can find links to the previous parts:

Create your own Tower Defense I
Create your own Tower Defense II
Create your own Tower Defense III
Create your own Tower Defense IV
GameDev Adventures: Tower defense Part 5, Creep Waves
GameDev Adventures: Tower defense Part 6, Creep Waves II
GameDev Adventures: Tower defense Part 7, Creep Waves III
GameDev Adventures: Tower defense Part 8, Creep Waves IV
GameDev Adventures: Tower defense Part 9, Waves V
Create your own Tower Defense X, Creep Animations
Create your own Tower Defense XI, Health Bars

All the videos can also be found on my YouTube channel http://bit.ly/hielotube even after they have been deleted from Twitch. Download the source file for this video tutorial! This tutorial explains how to add a score to the game.

Posted by on 29 May 2015 | 11:46 pm

[Sample of May 30] How to use HttpClient to post JSON data to a web service in Windows Store apps

May 30 Sample: https://code.msdn.microsoft.com//How-to-use-HttpClient-to-b9289836 The sample demonstrates how to use the HttpClient and DataContractJsonSerializer classes to post JSON data to a web service. This is easy to achieve in the WinJS realm, but there is no example showing how to do it using HttpClient in .NET applications. You can find more code samples that demonstrate the most typical programming scenarios by using Microsoft All-In-One Code Framework...(read more)
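The sample itself is .NET (HttpClient plus DataContractJsonSerializer); as a rough language-neutral sketch of the same pattern – serialize an object to JSON and POST it with a Content-Type: application/json header – here is the equivalent shape in Python. The endpoint URL and payload fields below are placeholders, and the actual network call is left commented out so the sketch stays self-contained:

```python
import json
import urllib.request

# Placeholder payload; the .NET sample serializes a data-contract object.
payload = {"Name": "contoso", "Value": 42}
body = json.dumps(payload).encode("utf-8")

# Build the POST request with the JSON content type, analogous to posting
# a StringContent with media type "application/json" via HttpClient.
req = urllib.request.Request(
    "http://example.com/webservice",   # placeholder endpoint
    data=body,
    headers={"Content-Type": "application/json"},
    method="POST",
)

# response = urllib.request.urlopen(req)  # not executed in this sketch

# urllib normalizes stored header names, hence "Content-type" here.
print(req.get_method(), req.get_header("Content-type"))
```

Whatever the language, the essentials are the same: serialize the object, encode it as UTF-8, and declare the JSON media type so the web service parses the body correctly.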

Posted by on 29 May 2015 | 9:21 pm

AX 2012 - Sending the CFDI by Email

Legal Framework

According to Article 29 of the Mexican Federal Tax Code (CÓDIGO FISCAL DE LA FEDERACIÓN), taxpayers must deliver or make available to their customers the electronic file of the digital tax receipt issued over the Internet (CFDI) and, when the customer requests it, its printed representation; it is therefore considered necessary that the taxpayer requesting the receipt provide identification data in order to generate it. Source: SAT, at the following link: Criterio no vinculativo sobre expedición de Factura Electrónica (non-binding criterion on issuing electronic invoices).

Electronic Invoice Configuration

In AX, to configure:
a. Sending the electronic file to the customer by email: check the 'Enviar correo' (Send email) field.
b. Sending the printed representation to the customer by email: check the 'Enviar archivo de informe: PDF:' (Send report file: PDF) field.
Both fields are found under: Administración de la Organización > Configurar > Factura electrónica > Parámetros de factura electrónica > CFDI.

Print Management Configuration

Additionally, to send the printed representation to the customer by email, the following configuration is required:
a. Cuentas por cobrar (Accounts receivable) > Configurar > Formularios > Configuración de formulario > Gestión de impresión (Print management) button
b. Select the document for which the PDF should be sent, for example: Factura de servicios (service invoice)
c. In the right pane, right-click the Destino (Destination) list > Configurar impresora (Printer setup)
d. Select Archivo (File) in the list
e. In the file name, enter the folder and the name Factura.pdf
f. In the Formato (Format) field, select PDF
g. Save and close

Result

As a result, the printed representation of the CFDI is sent to the customer by email. Note: each file is sent in a separate email.

Posted by on 29 May 2015 | 7:42 pm

The end of Version Control Guidance? A pinch of nostalgia and future plans.

A bit of nostalgia

The Version Control Guidance adventure was the first project I engaged in with the ALM Rangers, then known as the VSTS Rangers, working on the TFS Branching Guide. It has grown into a blockbuster that not only shared insightful and practical guidance around branching, merging and other source control concepts, but also inspired our German, Spanish and Japanese communities to localise the guidance over 9 years. Extracts from the Japanese edition in 2010.

Releases

The download stats on CodePlex from January 2010 show healthy growth, sustained support and spikes of excitement, especially when we introduced the crisper format.

2010 – 1.0: Upgraded and retired the TFS Branching and Merging Guide; the Version Control Guidance is born. 1.0.1: Japanese localised edition. 1.0.2: German localised edition. 1.0.3: Spanish localised edition.
2012 – 2.0: Visual Studio 2012 focused guidance.
2014 – 3.0: Introduction of a crisper style, focused on Strategies, TFVC Gems and NuGet guidance. 3.0.1: Spanish localised edition.
2015 – 3.1: Revision of the Strategies, TFVC Gems and NuGet guidance. 3.2: Introduction of the fourth guide, Git for the TFVC User, complemented by the Git for the TFVC User call-for-feedback blog series.

Future

Nothing describes the future of the Version Control Guidance better than the famous quote from Winston Churchill: "Now this is not the end. It is not even the beginning of the end. But it is, perhaps, the end of the beginning." Two of the strategies influencing the next Version Control Guidance are: Simplicity – practical, crisp guidance, reducing waste and increasing user value. Openness – open source, allowing community contributions. We are not planning a revision of the Version Control Guidance on CodePlex, and are moving the solution from active to service mode on our Library of tooling and guidance solutions (aka.ms/vsarsolutions) page.
Instead, we have been collaborating with Matt Mitrik and his team, exploring a three-pronged strategy:

Mainstream documentation – Deliver crisp, markdown-based guidance which is "merged" (excuse the pun) with the new Version Control documentation on …vs/alm/Code/overview. You will be able to peruse the professional product documentation and practical guidance "as one", in a discoverable and consistent style.
Out-of-band blog series – Deliver dogfooding, research and from-the-field experiences on our …/visualstudioalmrangers/ blog. We will continue to share our experiences and explore concepts that may migrate to the mainstream documentation over time.
OSS – Continue investigating the Git-based pull request concept on Visual Studio Online and GitHub, to enable the ALM community to contribute to the guidance.

We need your help! Which sections of the Strategies, TFVC Gems, NuGet and Git for the TFVC User guides do you want us to retain, maintain or enhance, and which do you see as mainstream or out-of-band? Please add your candid thoughts below or ping us via email.

Posted by on 29 May 2015 | 7:23 pm

Another great community day with colleagues at the Sharing Farm

We enjoyed another great day at the http://www.sharingfarm.ca/, pulling weeds, trimming grass and many other back-breaking activities. Here are a few pictures to demonstrate the beauty and phenomenal weather we enjoyed. These volunteer days are not only for a good cause, but really great for team building and always fun!

Posted by on 29 May 2015 | 7:21 pm

Issues With Visual Studio Online Cloud Load Test - 5/29 - Resolved

Final Update: Friday, 30 May 2015 01:33 UTC Our DevOps team has rolled out a fix for the issue of load test runs stuck in the queued state. Customers should not experience any further issues. We understand that customers rely on VS Online as a critical service and apologize for any impact this incident caused. Sincerely, VS Online Service Delivery Team ----------------------------------  Initial Update: Friday 5/29/2015 23:45 UTC We have identified an issue with the Cloud-based Load Testing Service where customer load test runs become stuck in the queued state. Our DevOps team is actively engaged and investigating the issue. We will post an update as we have more information. We apologize for the inconvenience. Sincerely, VS Online Service Delivery Team

Posted by on 29 May 2015 | 6:48 pm

Excel hangs while publishing/uploading data using the Excel Add-in for Dynamics AX

In some scenarios, while publishing data/records using the Excel add-in for Dynamics AX, performance issues, hangs (not responding) and time-outs have been observed. For example, after clicking 'Publish All' or 'Publish Selected', Excel goes into a 'Not Responding' state. This behavior gives the impression that the Publish action was not successful or that Excel is abruptly hung. Cause: Behind the scenes, Excel is designed to automatically perform a 'Refresh All' action after completing the 'Publish All' or 'Publish Selected' task. Once the data from Excel is published and amended in the database, Excel spins off a refresh action which fetches all the records (old and new) back into Excel for the user's visibility. The larger the amount of data, the longer it takes for Excel to come back to life. It is this 'Refresh' action which consumes the additional time and resources, and it leads to the performance, not-responding/hang and time-out issues in Excel. Resolution: Shorten the time consumed by the auto-refresh action in the Excel add-in. To do this, before running the 'Publish All' or 'Publish Selected' task, use the built-in Filter functionality in the Excel add-in. Add a filter criterion for any field with a value that does not exist, e.g. 'ZYX' for the 'Journal batch number' field. When Excel does the refresh after publishing, this filter will not return any results, as the value ZYX does not exist for this field in the database. Hence, in an environment with a lot of data, it certainly makes sense to add such a filter, as it reduces the refresh time and Excel remains responsive.

Posted by on 29 May 2015 | 5:30 pm

What's the direct URL for a Windows Update driver?

My day job is all about GPU drivers and I use all sorts of quirky hacks for various diagnostic purposes. I found the new Windows Update ETL log file process cumbersome when I just needed to know the driver download URL. So here is a quick hack to determine the exact URL of the latest available Windows Update driver. With this hack, I can share the URL of the exact driver I'm using with other folks with zero confusion.

First, stop the Windows Update service from an admin command prompt:
net stop wuauserv

Then delete all the old logs and downloaded files (back up first!):
del /s /q %windir%\SoftwareDistribution\*.*
del /s /q %windir%\Logs\WindowsUpdate\*.etl

Uninstall the current driver:
1. Start - Device Manager
2. Open Display adapters
3. Right-click on the device - for example the AMD Radeon R9
4. Click Uninstall
5. At the prompt, ensure that "Delete the driver software for this device" is checked
6. Click OK
7. Wait until the driver is uninstalled
8. On the Action menu, click "Scan for hardware changes"
The device now comes up with a yellow mark over it, indicating a driver could not be found. If an older driver was found and installed, you can uninstall that too and repeat until the only thing left is the Microsoft Basic Display Adapter (MSBDA).

Start the Windows Update service from the admin command prompt:
net start wuauserv

Install the latest available Windows Update driver:
1. Start - Device Manager
2. Right-click on the device and click "Update Driver Software"
3. Click "Search automatically for updated driver software"
4. Wait for the driver update to complete

Open the logs:
1. Open %windir%\Logs\WindowsUpdate
2. Open the .etl file in Notepad (if there is more than one log, open the larger one)
3. Search for ". c a b" and find the last match in the file (the log's encoding makes the text appear in Notepad with a space between each character)

For example, in my .etl file the last match was:
"h t t p : / / f g . d s . d o w n l o a d . w i n d o w s u p d a t e . c o m / d / m s d o w n l o a d / u p d a t e / d r i v e r / d r v s / 2 0 1 5 / 0 5 / 2 0 0 0 0 5 5 9 1 _ c d f 3 4 8 e 7 9 4 e f 7 d f c 0 c 4 a c 4 3 e a 7 e 4 5 e e f a a 2 7 2 2 4 b . c a b"

Reveal the URL:
1. Remove the spaces from the string to get the URL: http://fg.ds.download.windowsupdate.com/d/msdownload/update/driver/drvs/2015/05/200005591_cdf348e794ef7dfc0c4ac43ea7e45eefaa27224b.cab
2. Remove the regional datacenter prefix "fg.ds.": http://download.windowsupdate.com/d/msdownload/update/driver/drvs/2015/05/200005591_cdf348e794ef7dfc0c4ac43ea7e45eefaa27224b.cab

This is the direct download link to the 15.200.1023.0003 AMD WDDMv2/DirectX12 driver for Windows 10 x64 that will work for just about anybody in the world.

References: https://support.microsoft.com/en-us/kb/3036646

Back to the main blog: http://aka.ms/danchar
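The last two steps (remove the spaces, strip the regional datacenter prefix) are easy to automate. Here is a minimal sketch, not part of the original post; it assumes the regional prefix always looks like two lowercase labels (e.g. "fg.ds.") in front of download.windowsupdate.com, as in the example above:

```python
import re

def extract_driver_url(spaced_text):
    """Turn the space-separated string copied from the ETL log into a direct URL."""
    # Notepad shows one space between every character; drop them all.
    url = spaced_text.replace(" ", "")
    # Strip the assumed regional datacenter prefix (e.g. "fg.ds.") so the
    # link works from anywhere in the world.
    url = re.sub(r"//[a-z]+\.[a-z]+\.download\.windowsupdate\.com",
                 "//download.windowsupdate.com", url)
    return url

spaced = ("h t t p : / / f g . d s . d o w n l o a d . "
          "w i n d o w s u p d a t e . c o m / d / f o o . c a b")
print(extract_driver_url(spaced))  # → http://download.windowsupdate.com/d/foo.cab
```

Paste the spaced string you found in the log in place of the example value; the output is the shareable download link.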

Posted by on 29 May 2015 | 5:04 pm

MSP Buzz: Programming for Beginners

We’ve asked a couple of our MSPs to answer some questions around how to get started in programming, including developing for Windows platforms, beginner web development, beginner cloud development, and more. They’ve even shared some of their personal stories and favorite resources to help you get started. And remember, for more information about the MSP Program see our website. Our MSPs answering questions are: I want to build a Windows Phone app, how do I get started? DT: There are numerous resources out there at your disposal. One of my personal favorites is Microsoft Virtual Academy (MVA), and all you need is a Microsoft account and you are good to go! MVA has video tutorials in a variety of categories including "For Beginners", "HTML5", and "Windows Phone App Development". These tutorials have video, transcripts and exercise files so you are completely immersed in the experience. And the best part is, you can watch them in parts! You do not have to give up your entire day to learn app development; you can pace yourself. I highly recommend checking out these tutorials to get a grasp on the concepts. But in order to become a really skilled app developer, you need to create an app. First off you need an idea. If you have an idea, that's great! But like most of us, you may not know where to start. In that case, just copy an idea - create your own version of Flappy Bird, or a YouTube player for your favorite YouTube channel (this is how I learned :). Once you have your idea, you need to give yourself deadlines - "By the end of the day today, I want to have this button doing this function". These little deadlines are key to learning a new technology. It forces you to do what you may not know how to do - therefore forcing you to learn (kind of like schooling!) TC: Personally, I'm a fan of MVA. They have a lot of great videos for both beginners and more advanced individuals looking to improve their development skills and know-how.
The Windows Dev Center also has a lot to offer in terms of tutorials and guides! I want to learn more about programming in my free time, how do I get started? DT: Programming is a skill that people are going to need to know in the future. I applaud you for taking some of your valuable free time to learn the art (science?) of programming! At first it can be very daunting - especially when you just peruse the Internet and see giant blocks of code that mean less than nothing to you. Where to begin? I recommend you check out MVA's "For Beginners" section as well as Codecademy. Codecademy is one of my personal favorites; it provides a simple user interface for new users and offers tutorials on many of the most-used languages today! The best part is that you do not need to download anything! You do it all in your browser, and they even keep track of your progress through a particular course and give you achievements! Once you feel comfortable in a particular language, try challenging yourself a bit, and give yourself homework on a topic of the language that you do not understand. Having deadlines is a great way to increase productivity and to solidify what you learn. Just sitting at home and coding for 10 minutes a day might not do much for you if you do not have a goal in mind or a deadline to meet. TC: This is a tough one because there are so many ways to get into programming, but I'm going to recommend the "Introduction to Programming with Python" course on MVA. After you have a basic grasp of what programming is, you can dig into tougher languages such as C# or C++. I'm curious about developing for Azure, how do I get started? DT: For Azure development, I highly recommend checking out MVA. Azure being a Microsoft platform, you will not find a better tutorial than you will at MVA! Their step-by-step tutorials, complete with video and exercise files, are the perfect way to learn Azure development.
TC: Right now, there's a free one-month trial that gives you $200 to spend on Azure. If you know you want to develop on a cloud platform, this is a fantastic opportunity to take advantage of. If you're unsure as to whether you actually want to develop a cloud service, the overview page on their website has a great list of the various services you can provide by developing on Azure. I'm interested in the Internet of Things, how can I get started connecting devices? DT: The Internet of Things (IoT) is becoming more of a hot topic every day! With the release of smart cars, watches, houses, and toasters, the possibilities for interconnecting random devices are endless. Maybe you want to program your shower to turn on as soon as you enter your bathroom – this is entirely possible (but how do you do it?). First off, you need some sort of computer to control whatever it is you want to control. I highly recommend checking out the Arduino or the Raspberry Pi. These are two of the best-supported mini computers, and you can use them for just about anything – both run on their own derivative of Linux, and Raspberry Pi and Microsoft announced that Windows 10 will be able to run on the newest version. The best part of these computers is that they are cheap! Once you have your machine of choice, you will need to learn a little bit about networking and electrical engineering to connect it to a device. For that I direct you to the Internet – the Arduino and Raspberry Pi sites have blogs about new things you can do with their products. You can peruse these sites to get inspiration or to mimic an idea. Once you're comfortable, the possibilities are endless. TC: I'd first look into what device you want to work with - if you're really eager to get something connected to the internet, I recommend the Spark Core. It has a more powerful processor than an Arduino and it has a wireless connection on its board, so you don't need to worry about purchasing separate adapters.
Once you have a Spark Core, they have a great "Getting Started" page to help you jump right into the action! How do I build a simple mobile app that is cross-platform? DT: For building a cross-platform application, there are 5 really good tools that allow you to do this. These tools allow you to write your app in one language or two (Java, Objective-C/Swift, HTML/JavaScript, C#). Once you have created your app, the tool compiles and builds your code and creates different executables for each platform you want to release it on. Each tool differs in how it wants the initial app to be developed and which platforms you can release on. I highly recommend you check out this link for all the pros and cons of the top 5 cross-platform app development tools. Getting started developing cross-platform apps is no different from developing regular apps for a single platform; I recommend you check out my answer to the question on getting started with Windows Phone app development. The process is more or less the same for most platforms - just syntactically different. TC: Windows Universal apps. While these already exist in Windows 8.1, Windows 10 is going to make this process even better for developers. The Windows Dev Center and MVA both have in-depth walkthroughs and tutorials on the subject, but the idea behind it is this: you write a little bit of platform-specific code for each type of platform (say, phone and tablet), and the rest of your program only has to be written once. This is a huge timesaver, as well as a great way to reach a larger audience. I want to build a simple computer game, how do I get started? DT: Building a computer game is something a lot of programmers really want to do (including me), and the issue is that a lot of people want to start off by building games that rival triple-A titles. That is simply not feasible. In order to get started, you need to start small - with a clone of Flappy Bird or Doodle Jump.
There are a number of friendly libraries that help ease the user into game programming. One of my personal favorites is PyGame - a game programming library for Python. Python is a very user-friendly language - no nasty syntax or typing. PyGame gives the programmer complete control over everything - the difficulty lies in learning its syntax. Now if you want to step away from programming and want a simple user interface, I would recommend the Construct 2 game engine - no coding is required. There are plenty of tutorials to get you familiar with the layout and the functionality of the engine. And once you are comfortable, the possibilities are endless. The best part is, when you are done with your game, you can export it in a form that lets you upload it to the Windows App Store! If you want a slightly more advanced (maybe 3D) gaming engine, you can check out Unity. I don't have too much familiarity with Unity3D, but I do know that it is very robust and there are tons of wonderful tutorials out there on the Internet. TC: There are a lot of great tools out there for game development. For those with little or no coding experience, I recommend Construct 2 - it's a great suite for making 2D games that can help you get a prototype up and running in just a few hours. If you're a little more confident in your ability to write code, MonoGame is a fantastic successor to Microsoft's XNA framework! Apply to be an MSP by July 15th, 2015. For more information about the MSP Program see our website.

Posted by on 29 May 2015 | 4:26 pm

Announcing Octopus Deploy integration for Visual Studio Online

Visual Studio Online recently announced a new build system that offers a heterogeneous, capable platform that can easily build or integrate almost any existing asset and be extended to add new ones. (For more information about this announcement, please see Chris's Build session: http://channel9.msdn.com/Events/Ignite/2015/BRK3726) Damian Brady, one of the architects at Octopus (and "fellow" Queenslander), decided to take advantage of this opportunity to create an Octopus integration offering for Visual Studio Online. In his own words: "The idea was to be able to create a new Octopus Deploy Release as a separate step in your build definition. I'm pretty excited by how powerful it was. After spending some time with it, I want to say how awesome the new Build system is! It's so much easier and so much more powerful than the legacy stuff. I'm really, really impressed." I have included much of Damian's blog post on this exciting new offering below, but highly recommend you check out his blog for the whole story: http://octopusdeploy.com/blog/octopus-integration-with-tfs-build-vnext Octopus integration with TFS Build vNext The new structure of Team Build gives us a great opportunity to integrate better with your build process. To that end, we've created a new, public OctoTFS repository on GitHub. It currently contains two options for creating a Release in Octopus as an independent step in your build definition. Both of them let you separate the build and deploy phases really nicely. Note that you'll still have to package and push your NuGet package to the Octopus Server (or another NuGet repo) - these steps just create releases. You can still use OctoPack for packaging and pushing. The integration I'm most excited about is the custom Build Step. It gives you a really nice UI and even includes release notes pulled from changesets and work items - something we get asked for a lot.
Unfortunately, because you need to upload the package to TFS/VSO, it won't be available to use until the new build system hits RTM. That shouldn't be too far away. At least this way you'll be able to use it from day one of RTM instead of having to wait! The other option is a PowerShell script you can include in your project. This one you can use right now, and it works nearly as well (no release notes yet). It's not quite as nice to work with, but it does the job for now. Support We will continue to work on these integrations so they're useful and easy to use for as many people as possible. Our priority is always going to be the core product though, so we'll improve and add when we can. Of course the OctoTFS repository is open source, and we will be accepting pull requests, so if you see a bug, a potential improvement, or even a completely new integration option, we'd love your contribution! About Damian Brady Who am I? I'm a Brisbane-based developer, trainer, and author specialising in Agile process management, software craftsmanship and software development. I have a love of Octopus Deploy, Team Foundation Server, Scrum, C#, Nancy FX, ASP.NET MVC, HTML5, JavaScript, and web development in general. What do I do? I'm a Microsoft MVP in Application Lifecycle Management and I work as a Solution Architect for Octopus Deploy. I'm also a co-author of Professional Team Foundation Server 2013 from Wiley. I run the Brisbane .NET User Group and the annual DeveloperDeveloperDeveloper Brisbane conference. I spend a lot of time training teams on how to improve their software, be it through improving their agile process, devops, or code quality. I regularly speak at conferences, user groups, and other events, but most of the time you'll find me working on Octopus Deploy, or helping teams get the most out of their devops strategies.
I write software mainly for the web using Nancy or ASP.NET MVC combined with healthy amounts of JavaScript and Angular, and I usually cut code in Visual Studio 2013 and manage my work with Visual Studio Online or GitHub. However, I've gained experience in many other languages and environments. Contact Me You're welcome to contact me by email at info@damianbrady.com.au.

Posted by on 29 May 2015 | 4:11 pm

Code running against Outlook is very slow when PST or OST is on a network folder or non-physical/non-VHD drive.

There is no support for performance issues with Outlook where the PST or OST is on a network folder or non-physical/non-VHD drive. See below: Limits to using personal folders (.pst) files over LAN and WAN links https://support.microsoft.com/en-us/kb/297019 Note: Customers are responsible for both defining and maintaining adequate network and disk I/O. Microsoft will not assist in troubleshooting slow performance due to networked .pst or .ost files. Microsoft will only assist if the performance issue is reproduced while the .pst or .ost file is located on either a hard disk that is physically attached to the computer that is running Outlook, or on a virtual hard disk (VHD) that is attached to the virtual machine that is running Outlook. Important: Microsoft programs may not work as expected in a third-party application or software virtualization environment. We do not test Microsoft products that are running in third-party application or software virtualization environments. For more information about support provided by Microsoft for its software running together with non-Microsoft hardware virtualization software, click the following article number to view the article in the Microsoft Knowledge Base: 897615 Support policy for Microsoft software running in non-Microsoft hardware virtualization software. So, keep the above in mind when writing code which goes directly against Outlook - such as OOM or Extended MAPI code. If Outlook is performing poorly, then the execution of your code will also be poor. OST and PST files are for Outlook's use, and there is an immense amount of access against them - in essence they are heavily used Outlook database files. So, it's critical to follow Microsoft guidelines on where it's appropriate for them to reside.
Over more than a decade I've seen customers trying to get decent performance, and some do get performance which seems viable; however, they often run into issues - especially when they mix in VPN access and come to a grinding halt - yeah, I would expect so. Another situation I've seen a number of times is where a customer is running into perf issues on virtualized servers and wants to know why one has slower perf than the other - well, that's something to ask the virtualization vendor. Often such virtual servers are running different loads of applications with different loads of users - so yes, performance can and will vary. If you really wish to test performance with code accessing Outlook, then you should have Outlook and its OST and PSTs on a physical or .VHD drive. I've done a lot of testing with different types of code going against Outlook, and this is the only way to get quality results. Also see: What Does "Unsupported" Mean? http://blogs.msdn.com/b/pcreehan/archive/2007/05/04/what-does-unsupported-mean.aspx

Posted by on 29 May 2015 | 3:46 pm

Free Course (Video Available): Practical Performance Tips to Make Your HTML/JavaScript Faster

      Course Description: Want to maximize the performance of your modern websites and apps with JavaScript and HTML5? This course Practical Performance Tips to Make Your HTML/JavaScript Fast has the practical strategies, tips, and tricks you need, along with helpful demos and best practice guidelines. Watch this team of experts for a detailed look at how to write fast JavaScript. Explore the fundamentals of web performance, tools for monitoring and measuring JavaScript/HTML speed...(read more)

Posted by on 29 May 2015 | 3:44 pm

Yet another "newcomer" :-) Welcome to Azure Stream Analytics!

Over time, industry perceptions have changed through better use of data, and with this digital transformation Microsoft has done the same, delivering data analytics solutions that are ever simpler to grasp and more cost-effective for the investment required. Today, many processes run on the Internet, where tons of event data are generated at a sustained pace and thus become (potentially) available to businesses. Companies that can process this data and make decisions on it in real time are more flexible, and therefore better able to differentiate themselves in their market and come out ahead. Real-time analytics provides an advantage in every sector, whether it is analyzing stock prices for trading, alerting on financial risk, detecting fraud, protecting data, analyzing data emitted by sensors, or processing data streams from websites. It is therefore very important - key, even - to be able to exploit such data in real time as it flows by, rather than merely querying data at rest in a database table. Operating in a highly competitive environment, companies are looking for a solution that lets them perform these real-time analyses in a flexible, secure and low-cost way. Traditionally, if an IT department sets out to implement such a solution, it generally has to start from scratch. The typical sequence starts with purchasing dedicated hardware, installing that hardware and the software envisaged for the project, designing the necessary code, deploying it, testing it, and monitoring and maintaining the solution. Even the SQL Server StreamInsight feature imposes a similar solution lifecycle.
(As a reminder, SQL Server StreamInsight is a complex event processing (CEP) technology that lets you build event-driven applications and better derive insights by correlating event streams from multiple sources with near-zero latency.) Moreover, designing the code and the final solution is no easy matter. The solution must not only prove effective with respect to the analysis and its expected results, but also be fault-tolerant. It is easy to see how the costs of implementing and maintaining this kind of solution rise quickly. Large companies have resigned themselves to bearing these significant costs to build their own solutions, but small ones often miss out on this opportunity because they cannot keep up. By contrast, Azure Stream Analytics lets you build a solution in minutes. Yes, in minutes - and without worrying about purchasing new hardware or paying for solution development, deployment and monitoring. Stream Analytics is a fully managed real-time stream computation service in the cloud for scalable complex event processing (CEP) over streaming data, with low latency and high availability. An introductory video is available here. Announced as a public preview on 29 October 2014 at the Microsoft TechEd Europe 2014 conference, this stream processing and analytics service has been generally available since 16 April 2015. Think about how you would count all the red cars in a parking lot with a SQL query, or the Map/Reduce word count. Now consider the equivalent streaming scenario:
how would you count all the red cars passing a particular point on the road within a 1-minute window? With Azure Stream Analytics, developers can easily combine data streams with historical records or reference data to surface insights quickly and simply. Azure Stream Analytics offers a simple declarative query model to describe the processing to perform. The query language is close to SQL, and a range of operators is provided, from simple filters to complex correlations and aggregations. Defining time-based windowed operations such as windowed aggregates, correlating multiple streams to detect patterns (such as sequences), or comparing current conditions against historical values and patterns can be done in minutes using the approachable set of SQL query language operators in Azure Stream Analytics. The language intuitively extends SQL semantics while staying within its syntactic bounds; once again, the point is to make this new tool easy to use. In terms of data flow, Stream Analytics offers out-of-the-box integration with Azure Event Hubs to ingest millions of events per second. Together, Azure Event Hubs and Azure Stream Analytics let you process large volumes of data from sensors, devices, applications and more, and make decisions in real time. The range of input and output interfaces of Azure Event Hubs makes it easy to integrate Azure Stream Analytics with other data sources and processing engines without giving up the streaming nature of the computation.
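The red-car counting scenario described earlier is a windowed aggregate; in Azure Stream Analytics you would express it with the SQL-like query language's TumblingWindow operator. As a language-neutral illustration of the underlying idea - this is a sketch, not Azure Stream Analytics code - here is a tiny Python function that counts matching events per 1-minute tumbling window:

```python
from collections import Counter

def tumbling_window_counts(events, window_seconds=60):
    """Count 'Red' car sightings per tumbling window.

    events: iterable of (timestamp_in_seconds, color) tuples.
    Returns {window_index: count}, where window_index = floor(ts / window_seconds).
    """
    counts = Counter()
    for ts, color in events:
        if color == "Red":
            # Each event falls into exactly one non-overlapping window.
            counts[int(ts // window_seconds)] += 1
    return dict(counts)

cars = [(5, "Red"), (30, "Blue"), (59, "Red"), (61, "Red"), (130, "Red")]
print(tumbling_window_counts(cars))  # → {0: 2, 1: 1, 2: 1}
```

Non-overlapping, fixed-size windows are exactly the semantics a tumbling-window aggregate applies to a stream; the service does this continuously over live data rather than over a finished list.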
This gives Stream Analytics the flexibility to support many (near) real-time analytics scenarios, including those tied to the Internet of Things (IoT). Just look at the world around us, or recall the illustrations from the J3 plenary session "Vers une technologie invisible et une intelligence omniprésente ?" at Microsoft TechDays 2015, to be convinced: devices and objects are becoming ever smarter and more able to communicate their state, values describing their environment, their status and beyond. On that note, recall the announcement made last March at Microsoft Convergence 2015 of the Microsoft IoT suite, which will complement the Azure IoT suite already available today - of which Azure Stream Analytics is a part, alongside Azure Event Hubs, Azure HDInsight, DocumentDB and Microsoft Power BI, all of which we expect to cover on this blog. Azure Stream Analytics performs transformations and other processing through jobs. Azure Stream Analytics also connects directly, in terms of storage, to Azure SQL Database and Azure blobs, notably to access historical data. You can write data from Stream Analytics to Azure Storage, where it can then be reprocessed as a series of events or used in other batch analytics formats using Azure HDInsight, the subject of many posts on this blog. Azure Stream Analytics builds on years of Microsoft Research work developing highly optimized streaming engines for time-sensitive processing, as well as language integrations for such specifications.
Azure Stream Analytics also leverages the Hadoop, YARN and REEF open source communities for scale-out processing. The output of Azure Stream Analytics can be passed to another Azure Event Hub, or used for presentation purposes. The webcast of the session Gaining Real-Time IoT Insights using Azure Stream Analytics, AzureML and PowerBI, delivered at Build 2015 at the end of last April, puts Azure Stream Analytics to work to give you a sense of what is possible. For more information on Azure Stream Analytics, see the Azure Stream Analytics overview and the Azure Stream Analytics documentation on the Microsoft Azure site. Get started with Azure Stream Analytics also offers a tutorial that walks you through working with temperature data, reading it from an Azure Event Hub and then processing the results into an Azure SQL database. Don't have an Azure subscription yet? No problem: click here for a free Azure trial environment. We also invite you to read (or re-read) the posts published on the Azure Stream Analytics MSDN blog. And don't forget the Azure Stream Analytics MSDN forum, which exists to foster a community of exchange on the topic. Finally, to get you into the saddle beyond the tutorial above, we recommend looking at the open source ConnectTheDots.io project developed by Microsoft Open Technologies and available on the GitHub community forge. This project was started to help you quickly connect small devices and sensors (Raspberry Pi, Intel Galileo, etc.)
à l’environnement Microsoft Azure et à mettre en œuvre de belles solutions d'IoT tirant parti bien sûr d’Azure Stream Analytics, objet de ce billet, mais aussi, - vous vous en doutez compte tenu de ce qui précède – du connecteur d’évènements Azure, d’Azure HDInsight mais aussi d’Azure Machine Learning. Comme cela a été démontré lors des derniers Microsoft TechDays 2015, le Centre de l'accélérateur linéaire de Stanford ou SLAC (Stanford Linear Accelerator Center) met à profit ce projet ConnectTheDots.io dans un pilote des services de la suite IoT Azure mentionné précédemment. Ce pilote conduit avec Microsoft Open Technologies illustre le suivi et le contrôle prédictifs d'un des systèmes de refroidissement de l’accélérateur de particules du SLAC.

Posted by on 29 May 2015 | 2:04 pm

A fourth option for solving the problem of DMARC’s incompatibility with mailing lists – Part 3

We’ve looked at three options for solving the problem of mailing lists that have trouble delivering email for domains that publish p=reject. None of the solutions are great. What else is there?

4. Play around with the From: address, or maybe even the Sender: and Reply-To: fields, to make the message not fail DMARC

Another way to avoid failing DMARC is to fiddle with the original message so that, when it is relayed, it doesn’t fail DMARC. One way is for the mailing list to set the Reply-To header to the original From address, and replace the original From address with the mailing list’s address:

SMTP MAIL FROM: tzink@myPersonalDomain.com
DKIM-Signature: v=1; a=rsa-sha2; c=relaxed/relaxed; s=s2048;
  d=myPersonalDomain.com
  h=From:To:Subject:MIME-Version;
  bh=<body hash #1>
  b=<signature #1>
From: Terry Zink <tzink@myPersonalDomain.com>
To: Washington Magicians <washingtonMagicians@mailingList.org>
Subject: Hi, I’m new here. Any good places to perform?

And relay it like this:

SMTP MAIL FROM: washingtonmagicians@mailinglist.org
DKIM-Signature: v=1; a=rsa-sha2; c=relaxed/relaxed; s=mailer;
  d=mailinglist.org
  h=From:To:Subject:MIME-Version;
  bh=<body hash #2>
  b=<signature #2>
From: Washington Magicians <washingtonMagicians@mailingList.org>
To: Washington Magicians <washingtonMagicians@mailingList.org>
Reply-To: Terry Zink <tzink@myPersonalDomain.com>
Subject: [Washington Magicians] Hi, I’m new here. Any good places to perform?
List-Subscribe: http://washingtonmagicians.mailinglist.org
List-Post: mailto:washingtonmagicians@mailinglist.org
List-ID: Washington Magicians

The advantage of doing it this way is that the mailing list can modify the content to its heart’s content. In this example, it added headers (which is always fine), modified the Subject, appended a footer, and DKIM-signed the entire message. Because the DMARC check is done on the new From: address, and the new From: address aligns with the domain that passed SPF and DKIM, DMARC passes.
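As a sketch of how a mailing list might implement this Reply-To rewrite, using Python's standard email library (the function name and list details are illustrative, and the DKIM re-signing step a list would do afterwards is out of scope):

```python
from email.message import EmailMessage
from email.utils import formataddr, parseaddr

def rewrite_for_list(msg, list_name, list_addr):
    """Rewrite From/Reply-To so the relayed copy can pass DMARC.

    Sketch only: the list would still re-sign the message with its
    own DKIM key and set the SMTP MAIL FROM to its own domain.
    """
    # Preserve the original author in Reply-To...
    orig_name, orig_addr = parseaddr(msg["From"])
    msg["Reply-To"] = formataddr((orig_name, orig_addr))
    # ...and put the list itself in From, so DMARC alignment is
    # checked against the list's domain, which signs and passes.
    del msg["From"]
    msg["From"] = formataddr((list_name, list_addr))
    # Tag the subject the way many lists do.
    subject = msg["Subject"] or ""
    del msg["Subject"]
    msg["Subject"] = f"[{list_name}] {subject}"
    return msg
```

Applied to the example above, the rewritten message carries the list in From: and the original author in Reply-To:, matching the relayed headers shown.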
But the new From: address is the mailing list, not the original sender. When anyone else replies to the list, the reply goes to the Reply-To field and not the From address. Or does it? It depends on what email client you are using: Microsoft Outlook may handle it differently than the native iOS mail client, which may handle it differently than Outlook for Mac. This can be annoying. Another participant wants to reply to the list, but instead of the reply going to everyone on the list, it goes (get it?) directly to the original sender rather than to the original sender and the rest of the list. If you hit Reply All, you'd expect the reply to go to everyone (the original From plus the To, which is the mailing list), but your mail client may include only the Reply-To, in which case you have to add the list manually. But at least it passes DMARC.

Another way is to fiddle with the Sender: header. This is pretty much the same as the above, except that instead of putting the original sender in the Reply-To, they are put into the Sender field:

SMTP MAIL FROM: washingtonmagicians@mailinglist.org
DKIM-Signature: v=1; a=rsa-sha2; c=relaxed/relaxed; s=mailer;
  d=mailinglist.org
  h=From:To:Subject:MIME-Version;
  bh=<body hash #2>
  b=<signature #2>
From: Washington Magicians <washingtonMagicians@mailingList.org>
To: Washington Magicians <washingtonMagicians@mailingList.org>
Sender: Terry Zink <tzink@myPersonalDomain.com>
Subject: [Washington Magicians] Hi, I’m new here. Any good places to perform?
List-Subscribe: http://washingtonmagicians.mailinglist.org
List-Post: mailto:washingtonmagicians@mailinglist.org
List-ID: Washington Magicians

Once again, DMARC passes. And unlike the Reply-To trick, the reply does go to the From: address, which is the entire mailing list. So that's not bad. But most email clients other than Microsoft's Outlook desktop client don't even show you the Sender.
For the above message, here's how it looks:

Email client: Display in the reading pane
Hotmail/outlook.com web client: Washington Magicians
Gmail web client: Washington Magicians
Outlook desktop: Terry Zink on behalf of Washington Magicians

This suffers from a lack of clarity when the user reads it. In the case of Hotmail/outlook.com or Gmail, it tricks the user into thinking it was "Washington Magicians" that sent the message, but it's not; it's me (Terry Zink). The true author isn't shown anywhere in the list view or the reading pane in two common web mail providers. It looks like the message is from the discussion list (which is correct) but doesn't show the true author of the message (which is me). Outlook desktop does show the true author of the message, but it has it backwards: it says me on behalf of Washington Magicians. That's wrong; it's actually Washington Magicians who are relaying the message on my behalf. If I sent the message, then I should be in the From: address. The intermediary belongs in Sender:, not in From:.

Some mailing lists even do the following:

From: "'Terry Zink' via Washington-Magicians" <washingtonMagicians@mailingList.org>
X-Original-From: Terry Zink <tzink@myPersonalDomain.com>

This is yet another way of playing around with the From: address. It shows who the original sender is by displaying the Friendly From along with the name of the mailing list it was sent via, and it pushes the original From address into the aptly named X-Original-From. The problem with this solution is that most mail clients couldn't care less about the X-Original-From, and the formatting of the From address, at least in Outlook desktop, is incorrect: it shows only the name of the mailing list <washingtonMagicians@mailingList.org> in the Reading Pane (though the list view is correct). In other words, it confuses email clients. Sometimes it works, sometimes it doesn't.
So, while playing around with the From address works (i.e., gets around DMARC p=reject), the way the email is displayed to the end user is not all that clear. Either the reply-to-email experience is "off", or the way the message is shown to the end user is off. Furthermore, everyone in the mailing list gets the same From: email address. Is that really what we want? Email clients aren't all going to update to show the correct thing: Apple does something different from Microsoft, which does something different from Mozilla Thunderbird, which does something different from Google (in Gmail and the native Android email app). Are we going to get all of these email client providers to update their software, and get users to update their versions? Almost certainly not.

Finally, there are some people who are philosophically opposed to rewriting the From: address in any way (you know who you are): the message that was sent should have the same From: address when it is relayed to the rest of the group. If you fall into that camp, then all of this From: tweaking won't work for you. From my point of view, From: rewriting is probably good enough in some cases, but it does suffer from the email rendering shortfalls described above.

=======================

So, what else is there?

Related articles in this series:
Solving the problem of DMARC's incompatibility with mailing lists – Part 1
Three options for solving the problem of DMARC's incompatibility with mailing lists – Part 2
A fourth option for solving the problem of DMARC's incompatibility with mailing lists – Part 3

Posted by on 29 May 2015 | 1:51 pm

How to register a node with a DSC pull server

In PowerShell 4.0, each node in a DSC configuration identifies itself to the pull server with its ConfigurationId, a GUID. In pull mode (where a node downloads its configuration from a pull server), this ConfigurationId also maps to the name of the configuration document (the MOF file) stored on the pull server (you can read more about pull vs. push modes in this blog). Because the configuration document's name is a GUID rather than a friendly name, users have no way to share a single configuration across multiple nodes while still tracking status and reporting information for each node individually.

Fortunately, we introduced a new Registration feature in the WMF 5.0 April Preview that separates the configuration ID into two distinct identifiers: ConfigurationNames and AgentId. ConfigurationNames identifies the configuration for a computer and can be shared by multiple nodes, while AgentId identifies a single, unique node. This allows the creation of configurations for roles (multiple nodes that share the same configuration) rather than for individual nodes. Because ConfigurationNames are no longer GUIDs but friendly names, anyone could determine them by brute force or guessing. Therefore, we added a new property called RegistrationKey, which is used to register a node with the pull server before the node can pull configurations from it. A node registers itself with the pull server using a shared secret, and specifies the name of the configuration it will be pulling, in the metaconfiguration MOF. This shared secret does not have to be unique for each node, as it is a hard-to-guess identifier, like a GUID. We specify the shared secret in the RegistrationKey property of the metaconfig. The RegistrationKey value is already known to the server.
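As a sketch of such a metaconfiguration (the server URL and key value below are placeholders, and the syntax assumes the WMF 5.0 April Preview):

```powershell
[DSCLocalConfigurationManager()]
configuration SampleRegistrationMetaConfig
{
    Settings
    {
        RefreshMode = 'Pull'
    }
    # Placeholder URL and key; the key must match one listed in the
    # pull server's RegistrationKeys.txt file.
    ConfigurationRepositoryWeb PullServer
    {
        ServerURL          = 'https://pullserver.contoso.com:8080/PSDSCPullServer.svc'
        RegistrationKey    = '140a952b-b9d6-406b-b416-e0f759c9c0e4'
        ConfigurationNames = @('WebRole')
    }
}

# Generate the meta.mof and apply it to the local node.
SampleRegistrationMetaConfig
Set-DscLocalConfigurationManager -Path .\SampleRegistrationMetaConfig -Verbose
```

Note that ConfigurationNames now carries a friendly name ("WebRole") that any number of nodes can share.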
When setting up the pull server endpoint, the administrator needs to provision its web.config with the registration keys that will be used to authorize pull clients during registration. This is done by adding a "RegistrationKeyPath" setting that points to the location of the RegistrationKeys.txt file containing the registration keys, in the same way that you provision the pull server's configuration or module repository paths. Below is a snippet from the web.config; the RegistrationKeyPath entry shows how to provision a pull server with registration key information:

<add key="ConfigurationPath" value="C:\Program Files\WindowsPowerShell\DscService\Configuration" />
<add key="ModulePath" value="C:\Program Files\WindowsPowerShell\DscService\Modules" />
<add key="RegistrationKeyPath" value="C:\Program Files\WindowsPowerShell\DscService" />

Once the SampleRegistrationMetaConfig meta configuration is applied successfully, the node is registered with the pull server. During processing, a new AgentId is generated for that node; it can be retrieved using the Get-DscLocalConfigurationManager cmdlet. From this point on, the node uses the AgentId, rather than the ConfigurationId, to communicate with the pull server.

For more information on provisioning a pull server for a configuration with the name "WebRole", please refer to the "Push vs Pull" blog. Additionally, for more information on setting up a pull server endpoint using the xDSCWebService resource, refer to: http://blogs.msdn.com/b/powershell/archive/2013/11/21/powershell-dsc-resource-for-configuring-pull-server-environment.aspx. Note that the Registration feature is not supported when the pull server is set up as a file share; it is supported only for web-based pull servers such as ConfigurationRepositoryWeb. Also, the Registration feature is not currently supported for pulling partial configurations.
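As a quick sketch of retrieving the generated AgentId mentioned above, once registration has succeeded:

```powershell
# Read the LCM settings on the node and show the generated AgentId.
(Get-DscLocalConfigurationManager).AgentId
```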
Try out this new feature for downloading and applying configuration documents on your nodes using a friendly name, and let us know what you think!

Narine Mossikyan
Software Engineer, PowerShell and Desired State Configuration

Posted by on 29 May 2015 | 1:00 pm

Visual Studio Tip #9: You can edit directly in the Diff tool

Here is one I discovered just last week, by accident. When you are doing a diff on a file to compare what has changed from source control, the diff window can be used to edit the file directly (this works with both Git and TFS source control). I always assumed the window was just a read-only view, but on a whim I tried it, and it works.

This is nice because, in my workflow, I typically do a diff of my files before I check them in, to ensure that I don't have any unwanted edits still hanging around. Without this, I would have to do a diff, take a note of where the unwanted change is, go open the original file and make the edits, then do a new diff on the file (rinse and repeat).

For example, look at the following diff view. You can see that I added a lot of code to this file, but maybe during the process I added the namespace "Windows.UI.Xaml.Shapes", which is no longer needed now that I've finished the full edit. Traditionally I would have opened GamePage.cs, removed that line, saved it, and then done a new diff on the file. But because this diff view is live on the file, I can just remove that line in the right-hand pane, and the diff updates in real time. In the picture below, note that the green bar (indicating a new line inserted) is now gone from the red-circled area.

This is also a great way to find and eliminate all that commented-out code that people sprinkle around as they make changes but that you don't want to check in. (Of course, always rebuild and test after you do this, before you hit the commit button.)

This post is part of a series of Visual Studio tips. The first post in the series contains the whole list.

Posted by on 29 May 2015 | 11:56 am

Using the media capture API in the browser

In the latest Windows 10 preview release, Microsoft added support for media capture APIs in our Edge browser for the first time. This feature is based on the Media Capture and Streams specification, developed jointly at the W3C by the Web Real-Time Communications Working Group and the Device APIs Working Group. Some developers may know it simply as getUserMedia… The post Using the media capture API in the browser appeared first on Dave Voyles | Tech Evangelist at Microsoft. (read more)

Posted by on 29 May 2015 | 11:46 am

LiveLabs: testing the usefulness of mobile data

Researchers at a Singapore university are conducting real-time experimentation of mobile apps and services that require content-specific triggers, with real participants using their own smartphones. (read more)

Posted by on 29 May 2015 | 11:00 am

Kinect-enabled VR puts users in space with Earthlight

The following blog was guest-authored by Russell Grain, a development lead at Opaque Multimedia, a Melbourne (Australia)-based digital design studio specializing in the application of video game technologies in novel domains.

Earthlight is a first-person exploration game in which players step into the shoes of an astronaut on the International Space Station (ISS). There, some 431 kilometers (about 268 miles) above the Earth, they look down on our planet from the comfort of their own spacesuit. Featuring the most realistic depiction yet of the ISS in an interactive virtual reality (VR) setting, Earthlight demonstrates the limits of what is visually achievable in consumer-oriented VR experiences.

Opaque Multimedia's Earthlight game enables players to explore the International Space Station in an interactive VR setting, thanks to the latest Kinect sensor.

Our team at Opaque Multimedia developed Earthlight as a technical demo for our Kinect 4 Unreal plug-in, which exposes all the functionality of the latest Kinect sensor in Unreal Engine 4. Our goal was to create something visceral that demonstrated the power of Kinect as an input device, to show that Kinect could enable an experience that couldn't be achieved with anything else. Players explore the ISS from a truly first-person perspective, in which the movement of their head translates directly into the viewpoint of a spacesuit-clad astronaut. To complete this experience, players interact with the environment entirely through a Kinect 4 Unreal-powered avateering solution, pushing and pulling themselves along the surface of the ISS as they navigate a network of handles and scaffolds to reach the top of the communications array.

Everyone behaves somewhat differently when presented with the Earth hanging below them. Some race straight to the top of the ISS, wanting to propel themselves to their goal as fast as possible.
Others are taken with the details of the ISS's machinery, and some simply relax and stare at the Earth. On average, players take about four minutes to ascend to the top of the station's communications array.

By using Kinect, Earthlight enables players to explore the ISS without the disruptions to the immersive VR experience that a keyboard, mouse, or gamepad interface would create.

As well as being a fantastic tool for building immersion in a virtual game world, Kinect is uniquely positioned to help solve some user interface challenges unique to the VR experience: you can't see a keyboard, mouse, or gamepad while wearing any current-generation virtual reality device. By using Kinect, not only can we overcome these issues, we also increase the depth of the experience. The enhanced experience offers a compelling new use case for the fantastic body-tracking capabilities of the Kinect for Windows v2 sensor and SDK 2.0: providing natural and intuitive input to virtual reality games. The latest sensor's huge increase in fidelity makes it possible to track the precise movement of the arms. Moreover, the Kinect-enabled interface is so intuitive that, despite the lack of haptic feedback, users still adopt the unique gait and arm movements of a weightless astronaut. They are so immersed in the experience that they seem to forget all about the existence of gravity.

Earthlight has enjoyed a fantastic reception everywhere it's been shown: from the initial demonstrations at GDC 2015 to the Microsoft Build conference, the Silicon Valley Virtual Reality meetup, and the recent appearance at We.Speak.Code. At each event, there was barely any reprieve from the constant lines of people waiting to try out the experience. We estimate more than 2,000 people have experienced Earthlight, and we've been thrilled with their reactions.
When asked afterwards what they thought of Earthlight, the almost universal response was "amazing." We look forward to engendering further amazement as we push VR boundaries with Kinect 4 Unreal.

Russell Grain, Kinect 4 Unreal Lead, Opaque Multimedia

Key links:
Kinect for Windows news and information
Learn more about Opaque Multimedia
See the Earthlight showcase
Follow us on Facebook and Twitter

Posted by on 29 May 2015 | 11:00 am

Extending Visual Studio 2015

When we decided to expand Visual Studio 2015 in the hopes of making it *the* premier platform for cross-platform development, we acknowledged that the ability for our customers to extend, customize, and generally make it their own was absolutely, critically important. It is a win-win: make an extension, and you get a product or tailored experience for your specific scenario, and we – Microsoft – see Visual Studio's potential audience get a little bit bigger and a little bit happier.

After 7 years of growth, the Visual Studio platform has become rather bulky and packed with features, and it is hard to know where to start! We had to make it simpler. So with Visual Studio 2015, we set out on the long road of making it easier to develop extensions, giving you the functionality you need to make great integrated tools, frameworks, and languages, and connecting you with other extension authors in our ecosystem. Starting with Visual Studio 2013 Community, and continuing with Visual Studio 2015 Community, you can create and use extensions in a free version of our IDE. Get started today by directly downloading the Visual Studio Extensibility Tools, or check out the Visual Studio 2015 RC Downloads page. Now let's look at all the new ways in which you can make Visual Studio your own!

Item templates: the convenient way to create basic extensions!

In Visual Studio 2015, it is exceptionally easy to add new functionality to your extension. In earlier versions of VS, you depended on project templates and "merge projects" to get more functionality. Now, all of your favorite extensibility templates are available as item templates, which means adding a new menu command or editor feature is as easy as adding an item to an existing extensibility project. If you would like to see any other item templates, request them here.
To get started with item templates, check out Creating Templates for Projects and Items in Visual Studio, Starting to Develop Visual Studio Extensions, and Creating an Extension with an Editor Item Template.

Visual Studio Extensibility on GitHub

We are now on GitHub! Share your open source extensions with the growing community of Visual Studio extension authors on http://microsoft.github.io/extendvs. Submit a pull request to have your own extension's repo added to the Community Extensions list. We look forward to seeing your extensions on GitHub; it's a great way to add your voice to the thousands of extension authors who help shape the Visual Studio extensibility platform.

Getting the VSSDK Just Got Easier

In the past, when you opened an extensibility project in Visual Studio without having the VSSDK, you would get a cryptic error message in a big, confusing dialog box. Now, you can simply open the project directly – no fuss. If you don't have the VS SDK installed, Visual Studio 2015 asks you to install it when you want to do things such as add an item template. To get started, download the VS SDK directly, or check out the Visual Studio 2015 RC download page, under Additional Tools, to learn more about the SDK. In the final release of Visual Studio 2015, the VS SDK is part of your initial setup.

Visual Studio SDK reference assemblies via NuGet

For increased portability and sharing of extension projects, you can use the NuGet versions of the VSSDK reference assemblies. This gives you access to everything you need for an extensibility project on any connected machine: no need to spend time downloading the SDK and then adding the references into your project. When you author your extension project with the NuGet-based reference assemblies, it pulls everything you need directly into your project.
New and improved docs

We've overhauled the documentation to help you better discover the best walkthroughs and reduce the headache of finding the best APIs for the job. Get started with our up-to-date Visual Studio SDK walkthroughs.

Whether you are updating a set of in-house developer tools, have a great idea for the next great VS productivity tool, or are creating the next big Visual Studio framework, we'd love to hear your feedback! Drop us a line at Heather.Brown@microsoft.com, add suggestions on UserVoice, or visit us on Stack Overflow with the tag "visual-studio-extensions".

Heather Brown, Principal Program Manager, Visual Studio Platform Team
@Heathbr_MSFT

Heather Brown is a Principal Program Manager on the Visual Studio team, where she aspires to create a Visual Studio that maintains the familiarity of our most prized developer tool, with new innovations that empower developers to do more.

Posted by on 29 May 2015 | 11:00 am

Choisir un algorithme dans Azure Machine Learning

The visual, collaborative development environment Azure ML Studio ships with a large number of machine learning algorithms to help you build your predictive analytics solutions and associated experiments. These algorithms fall into the general categories of regression, classification, clustering, and anomaly detection, and each is designed to address a different type of machine learning problem. The question that then arises is: is there anything that can help me quickly understand how to choose a machine learning algorithm for my specific solution?

Given the variety of applications of #MachineLearning methods, it is indeed not always easy in practice to choose an algorithm. That is why, as a pragmatic entry into the world of #MachineLearning, we published a number of posts devoted to understanding how these methods work and how to apply them. As a starting point for that tour, we tried to answer the question posed in "Le Machine Learning, comment ça marche ?" (How does machine learning work?). With the basic principles in place, we then turned to "Un peu de théorie pour l'apprentissage supervisé" (A bit of theory for supervised learning), parts 1 and 2, covering two major families of tasks: classification and regression. Then came the time to look at "Un peu de théorie pour l'apprentissage non-supervisé" (A bit of theory for unsupervised learning), with three main tasks addressed by the associated methods: clustering, outlier detection, and dimensionality reduction. "L'Apprentissage non-supervisé appliqué à l'analyse de logs de proxy" (Unsupervised learning applied to proxy log analysis) gave us the opportunity to share a worked example.
Armed with this background, we were able to discuss approaches for evaluating a machine learning model ("Evaluer un modèle en apprentissage automatique"). Nothing more pragmatic than that, you say? What about ready-to-use templates? We took advantage of the new ability to share your solution with the community in a single click, via the new community-driven Azure ML gallery, to introduce ready-to-customize templates. And if you want to detect fraud, forecast your sales, or classify text, we introduced and described three templates made available by the Azure ML product group: one on online fraud detection, a second on retail sales forecasting, and a third on text classification.

Since then, to help you find what you need even more easily, new templates have become available. We can mention the predictive maintenance template described in the post New Predictive Maintenance Template in Azure ML on the product group's blog. Predictive maintenance spans a variety of areas, but the general goal is to increase the efficiency of maintenance tasks. Data-driven predictive maintenance in particular is gaining increasing attention in industry, with more and more applications in the emerging field of the Internet of Things (IoT). On that note, we take the opportunity to recall the announcement made last March at Microsoft Convergence 2015 of Microsoft's IoT suite, which will complement the Azure IoT suite already available today, of which Azure ML is a part. The template offered here focuses on answering the question "When will an in-service machine fail?" and illustrates this process of predicting future failure events in the scenario of aircraft engines!

If that scares you, we can also talk about another template, on text classification, described in the post Azure ML Text Classification Template on that same product group blog. Automatic text classification, also known as text tagging or text categorization, is part of the field of text analytics. Its goal is to assign a piece of unstructured text to one or more classes from a predefined set of categories. The piece of text itself can be of many different kinds: a document, a news article, a search query, an email, a tweet, a support ticket, customer feedback, a product review, and so on. Azure ML now offers a template to help data scientists and developers easily build and deploy their text analytics solutions. Among other things, this template can be used to: classify newspaper articles into topics; organize web pages into hierarchical categories; filter email spam; perform sentiment analysis; predict user intent expressed through search queries; route support tickets; and analyze customer feedback.

Still haven't found what you're looking for, and/or want to build your own solution? Don't have an overall view of the algorithms? The Microsoft Azure ML Algorithm Cheat Sheet aims to meet that need! The cheat sheet is designed to help you sift through the available machine learning algorithms and choose the one best suited to your predictive analytics solution.
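To make the text classification task concrete outside of Azure ML, here is a toy naive Bayes classifier in plain Python. The training examples are invented, and a real solution, like the Azure ML template, would involve proper feature engineering and evaluation; this only sketches the core idea of assigning a piece of text to one of a predefined set of categories:

```python
import math
from collections import Counter, defaultdict

def train(docs):
    """Train a multinomial naive Bayes model from (text, label) pairs."""
    class_counts = Counter()
    word_counts = defaultdict(Counter)
    vocab = set()
    for text, label in docs:
        class_counts[label] += 1
        for word in text.lower().split():
            word_counts[label][word] += 1
            vocab.add(word)
    return class_counts, word_counts, vocab

def classify(model, text):
    """Return the most likely label, using add-one (Laplace) smoothing."""
    class_counts, word_counts, vocab = model
    total_docs = sum(class_counts.values())
    best_label, best_logp = None, float("-inf")
    for label in class_counts:
        # Log prior of the class plus log likelihood of each word.
        logp = math.log(class_counts[label] / total_docs)
        denom = sum(word_counts[label].values()) + len(vocab)
        for word in text.lower().split():
            logp += math.log((word_counts[label][word] + 1) / denom)
        if logp > best_logp:
            best_label, best_logp = label, logp
    return best_label

# Invented examples: route messages to "spam" or "ham".
model = train([
    ("win money now", "spam"),
    ("cheap money offer", "spam"),
    ("meeting schedule today", "ham"),
    ("project meeting notes", "ham"),
])
```

With this tiny model, a message like "free money" lands in the spam class because "money" only ever appears in spam training examples.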
Through a simple flow chart, the cheat sheet asks you questions about the nature of your data and the problem you are working on, and then suggests an algorithm for you to try :-) The cheat sheet is available for download here: Microsoft Azure Machine Learning Algorithm Cheat Sheet.

For a deeper discussion of the different types of algorithms and how they are used in machine learning, beyond all the pointers we have just recalled on this blog, you can consult the article "Comment choisir un algorithme dans Microsoft Azure Machine Learning" (How to choose an algorithm in Microsoft Azure Machine Learning). For the list of all available machine learning algorithms, see the Initialize Model link in the module and algorithm help.

That concludes our short review of choosing an algorithm in Azure ML; we will have occasion to return to all of this soon. In the meantime, for more information on Azure ML, or to refresh your memory, we invite you to read (or reread) the posts on this blog (you have a few links available here) and those published on the product group's Machine Learning blog. And don't forget the Machine Learning MSDN forum, which is there to foster a community of exchange on the subject.

Posted by on 29 May 2015 | 10:40 am