[Sample Of Mar. 06] How to operate Azure blob storage in universal Windows apps

Mar. 06 Sample: https://code.msdn.microsoft.com//How-to-operate-Azure-blob-f0210e2e The Windows Azure storage class library now supports Windows Store/Phone apps. This sample shows you how to operate Azure blob storage in universal Windows apps, including uploading, downloading, and deleting files from blob storage. You can find more code samples that demonstrate the most typical programming scenarios by using the Microsoft All-In-One Code Framework Sample Browser or…
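For context, here is roughly what those three blob operations look like with the Azure storage client library. This is a minimal sketch rather than the sample's actual code: it assumes the WindowsAzure.Storage NuGet package and placeholder account credentials and names, and it uses the Stream-based overloads (the Windows Runtime flavor of the library exposes slightly different stream types).

using System.IO;
using System.Text;
using System.Threading.Tasks;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

public static class BlobOperationsSketch
{
    public static async Task RunAsync()
    {
        // Placeholder connection string; substitute your own account name and key.
        var account = CloudStorageAccount.Parse(
            "DefaultEndpointsProtocol=https;AccountName=myaccount;AccountKey=...");
        CloudBlobClient client = account.CreateCloudBlobClient();

        // Get (and create, if needed) a container to hold the blobs.
        CloudBlobContainer container = client.GetContainerReference("mycontainer");
        await container.CreateIfNotExistsAsync();

        CloudBlockBlob blob = container.GetBlockBlobReference("hello.txt");

        // Upload a file's contents to blob storage.
        using (var upload = new MemoryStream(Encoding.UTF8.GetBytes("Hello, blob storage!")))
        {
            await blob.UploadFromStreamAsync(upload);
        }

        // Download it back into a stream.
        using (var download = new MemoryStream())
        {
            await blob.DownloadToStreamAsync(download);
        }

        // Delete the blob.
        await blob.DeleteAsync();
    }
}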

Posted on 5 March 2015 | 8:01 pm

Using the Multilingual App Toolkit with WPF Applications

One of the toolkits available to app developers enabling them to reach new audiences is the Multilingual App Toolkit (MAT). Using this toolkit, you can easily add multiple language support to Store apps on Windows 8.1 and Windows Phone 8.1 as well as WPF applications on Windows. The toolkit can also use the Microsoft Translator Service to automatically translate the string resources in your app into other languages. I would like to walk you through the process of using the MAT with WPF applications, where the process is slightly different than with Windows Store and Phone apps.

Prerequisites

- Install Microsoft Visual Studio 2013 Update 4
- Install the Microsoft Multilingual App Toolkit

Start Using MAT with WPF Apps

1. Open the WPF project in Visual Studio.
2. In the project settings (Application… Assembly Info), set the Neutral Language to your language; I selected English (United States).
3. In the Tools menu, select Enable Multilingual App Toolkit. When you do this, a Resources.qps-ploc.xlf file is automatically added to the project Properties. This will be used to create a "Pseudo" translation. Pseudo Language is an artificial modification of the software product intended to simulate real language localization; it can be used to detect potential localizability issues or bugs early in the project cycle, before the actual localization starts.
4. In the Solution Explorer, open Properties… Resources.resx and, in the top bar of the editor, change the Access Modifier from Internal to Public.

During the build process, the MAT will automatically update the .xlf files, keeping them in sync. When you add new strings to the Resources.resx file, build the project, then right-click the .xlf files in the Solution Explorer and select Generate machine translations.

For the strings in Xaml, you should use the x:Static extension to have the strings point to the .resx file:

<Window x:Class="MultilingualApp.MainWindow"
        xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation"
        xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml"
        xmlns:properties="clr-namespace:MultilingualApp.Properties"
        Height="350" Width="525">
    <Grid>
        <StackPanel>
            <TextBlock Text="{x:Static properties:Resources.TranslateMe}"/>
            <Button Content="{x:Static properties:Resources.PressMe}"/>
        </StackPanel>
    </Grid>
</Window>

You should use the Resources.resx file for all of the localizable strings in your app. Add additional languages by right-clicking the project and selecting Add translation languages… You can use the Microsoft Translator Service to generate translations, or send the .xlf files to translation services, since XLIFF is the localization industry standard file format.

In addition to language translations, the MAT also uses the Microsoft Terminology APIs (Language Portal Provider). This enables direct access to Microsoft's product translation memories. See this article for more details: http://blogs.msdn.com/b/matdev/archive/2014/07/01/new-version-of-microsoft-terminology-api-launched.aspx

Testing Pseudo Language

WPF uses the OS language by default, so Thread.CurrentThread.CurrentUICulture and Thread.CurrentThread.CurrentCulture need to be set manually to 'qps-PLOC', since it is not a language you can log in with. Here is how you can easily do that:

namespace MultilingualApp
{
    using System.Diagnostics;
    using System.Globalization;
    using System.Threading;
    using System.Windows;

    /// <summary>
    /// Interaction logic for App.xaml
    /// </summary>
    public partial class App : Application
    {
        private CultureInfo cultureOverride = new CultureInfo("qps-PLOC");

        public App()
        {
            // Only override the culture while debugging, so release builds
            // keep using the OS language.
            if (Debugger.IsAttached && cultureOverride != null)
            {
                Thread.CurrentThread.CurrentUICulture = cultureOverride;
                Thread.CurrentThread.CurrentCulture = cultureOverride;
            }
        }
    }
}

You can now see what the Xaml above translates to with the Pseudo translation.

Summary

The Multilingual App Toolkit, along with the Microsoft Translator Service, has made the difficult and often expensive process of globalizing and localizing apps easy and straightforward. Now it is easy to localize WPF applications as well as Windows Store and Windows Phone apps.

Thanks

Thanks to Cameron Lerum from the MAT team for helping with this article.
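Since changing the Access Modifier made the generated Resources class public, the same localized strings the XAML binds to can also be read from code-behind. A minimal sketch, reusing the TranslateMe string from the example above (fully qualifying the class to avoid a clash with FrameworkElement.Resources):

using System.Windows;

namespace MultilingualApp
{
    public partial class MainWindow : Window
    {
        public MainWindow()
        {
            InitializeComponent();

            // The generated property resolves against CurrentUICulture, so the
            // pseudo (or translated) value is returned automatically at runtime.
            this.Title = MultilingualApp.Properties.Resources.TranslateMe;
        }
    }
}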

Posted on 5 March 2015 | 5:46 pm

3 new reasons to love Power Query!

Here they are:

- Performance improvements
- Microsoft Dynamics CRM Online connector
- New transformations

Get the deets: http://blogs.office.com/2015/03/05/3-updates-excel-power-query/

Performance improvements

We've made significant performance improvements to Power Query in a couple of areas:

- Query load – Performance of loading queries has improved by about 2x-3x in this release, according to our benchmarks. Queries that used to take ~10 minutes before this update now take only 3-4 minutes. We encourage you to try your own scenarios and let us know if you're hitting these performance levels.
- Excel workbook import – We've improved latency when connecting to Excel workbooks from Power Query. You should see lower times to load previews, which translates into a more responsive experience in the Query Editor.

New transformations

These transformations were already possible via custom formulas, but have now been made much more usable.

- Age and Subtract operations for Date/Time columns – When working with Date/Time columns, it's often useful to calculate the difference between two Date/Time columns (for example, order date and ship date) or to calculate the Age, i.e. the Date/Time difference between a given date and "now." These options are now available in the Date and Time menus, under the Transform and Add Column ribbon tabs.
- Aggregate Columns: option to disable column name prefix – The option to disable the column name prefix has been extended to the Aggregate Columns menu. (This option was added a few months ago to Expand Columns.) With this option, users can decide upfront whether to include a name prefix based on the original column name in the Aggregate Columns output.

Enjoy!

- Ninja Ed

Posted on 5 March 2015 | 4:54 pm

Project Webcast on Tuesday 10th March 2015

This Project webcast with Adrian and me will be recorded too – in case 9am PST isn't a good time for you. Some Project Online content as well as some on-premises news too! Be there or… listen to the recording! http://summit.office.com/Whats_New_in_Project_Online_and_Project_Server_2013

Support Corner Webcast: What's new in Project Online and Project Server 2013
March 10, 2015, 9:00 AM PST

Join Brian Smith and Adrian Jenkins for a Support Corner Webcast. This webcast will cover recent updates to Project Online as well as details of the March 2015 updates for our on-premises customers of Project and Project Server 2013. Check back soon for additional details!

Brian Smith is a Senior Escalation Engineer on the Project support team and has been supporting Project Server for as long as it has existed – and is now also supporting Project Online and Project Pro for Office 365. Adrian Jenkins has been at Microsoft for 25 years and has spent most of his time working on Microsoft Project in one capacity or another. Currently, he is a Senior Escalation Engineer and works heavily with the cumulative update release process, future directions, and so forth.

Webcast Details: 9:00 AM PST - 10:00 AM PST
Join Lync Meeting
Conference ID: 704980577

Posted on 5 March 2015 | 4:12 pm

Power BI Dynamic Date Filtering

One question I get from time to time is how to automatically filter to the last week's worth of data in Power BI. You might want the last 7 days, this week, this month, etc. You can make it fully dynamic by combining the technique Matt Masson describes here to create a date table with some Power Pivot DAX functions that classify each date into the date range you'd like. Download a sample here. OneDrive personal doesn't load the Power View sheet, so download and open the example in Excel on your desktop. HTH, -Lukasz

Posted on 5 March 2015 | 3:48 pm

How to use automatic approval with Project Budgets and Workflow

Project budgets in AX require a workflow in order to create new budgets. For some businesses, it is not necessary to have each budget reviewed before it is approved – especially if the person responsible for entering the project budget has the final say on it regardless. Since AX requires a workflow for project budgets, the best way around this is to set up a simple approval workflow on project budgets that auto-approves.

Steps to auto-approve Project Budgets:

1. Navigate to Project Management and Accounting -> Setup -> Project Management and Accounting Workflows.
2. Select the Project Budget workflow and click Workflow -> Maintain -> Edit.
3. Select the Project Original Budget Approval node.
4. Click Workflow -> Show -> Properties.
5. Select Automatic Actions.
6. Check Enable Automatic Actions.
7. Enter the condition under which auto-approval should occur. (Note: this condition can be anything you want auto-approved. To catch everything, use a condition such as Project budget.Total cost budget for all transaction types >= 0.00 in the legal entity's currency.)
8. Select Approve for the Auto Complete Action.

Note: The same steps can also be set up for Project Budget Revisions and Project Timesheets.

Enjoy! Don C.

Posted on 5 March 2015 | 2:35 pm

Playing audio to the earpiece from a Windows Phone 8.1 universal app

Some time ago I blogged about the Windows Phone AudioRoutingManager API, which allows you to put "Speakerphone" and "Bluetooth" buttons in your Windows Phone app. A common question that I get now is "I'm trying to play audio to the earpiece from my app, but AudioRoutingManager::SetAudioEndpoint(AudioRoutingEndpoint_Earpiece) is failing." It's an interesting question, because Windows Phone will automatically route your audio to the earpiece if you have everything set up right – and if you don't have everything set up just right, you can't route it to the earpiece at all!

So how do you "set things up right?" There are two things you have to do:

1. Tag the audio in question as "communications"
2. Tag your app as either a "voice over IP" app or a "voicemail" app

If you do both of these things, audio will flow to the earpiece automatically – no call to SetAudioEndpoint needed. (At this point, if you want to get fancy, you can put a "Speakerphone" button in your app and wire it up to a call to SetAudioEndpoint(AudioRoutingEndpoint_Speakerphone), but that's up to you.) Let's look at the two things in a little more detail.

Tag the audio in question as "communications"

How to do this depends on what API you're using. It could be as simple as <audio src="..." msAudioCategory="communications" msAudioDeviceType="communications" />. Or you might have to call IAudioClient2::SetClientProperties with an AudioClientProperties structure whose eCategory field is set to AudioCategory_Communications.

Tag your app as "voice over IP" or "voicemail"

You'll need to set either the ID_CAP_VOIP or ID_CAP_VOICEMAIL Phone capability on your app. (The docs reference an ID_CAP_AUDIOROUTING capability, but that doesn't exist.) If you're writing a Silverlight app, you can do that directly in the manifest. If you're writing a Windows Phone 8.1 (non-Silverlight) or Universal app, you have to create a WindowsPhoneReservedAppInfo.xml file and add it to your application package. It should look like this:

<?xml version="1.0" encoding="utf-8"?>
<WindowsPhoneReservedAppInfo xmlns="http://schemas.microsoft.com/phone/2013/windowsphonereservedappinfo">
  <SoftwareCapabilities>
    <SoftwareCapability Id="ID_CAP_VOIP" />
  </SoftwareCapabilities>
</WindowsPhoneReservedAppInfo>
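If you happen to be playing your audio with a XAML MediaElement from C#, the "communications" tag is a one-line property set, mirroring the msAudioCategory attribute above. A minimal sketch – the helper and element names here are mine, not from the original post:

using Windows.UI.Xaml.Controls;
using Windows.UI.Xaml.Media;

public static class CallAudioSetup
{
    // Tag a MediaElement's audio as "communications" so that, together with
    // the ID_CAP_VOIP capability, the phone routes it to the earpiece
    // automatically. "element" stands in for the MediaElement declared in
    // your page's XAML.
    public static void TagAsCommunications(MediaElement element)
    {
        element.AudioCategory = AudioCategory.Communications;
    }
}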

Posted on 5 March 2015 | 2:15 pm

Issues with MSDN Subscription Product downloads 03/05 - Investigating

Update: Thu 03/05/2015 10:46 PM UTC Our DevOps team is still actively investigating this issue to resolve it as quickly as possible. We apologize for the inconvenience and appreciate your patience. -MSDN Service Delivery Team

--------------------------------------------------------------------------------------------------------------

Initial Update: Thu 03/05/2015 07:22 PM UTC We are experiencing an intermittent issue with product downloads: customers might intermittently get an error when trying to download products from the Subscriptions page. DevOps is investigating the issue and actively working on mitigation. We apologize for the inconvenience and appreciate your patience. -MSDN Service Delivery Team

Posted on 5 March 2015 | 2:13 pm

Cloud Customer Relationship Management Competency – Available Now!

In January I announced the impending launch of our newest Cloud Performance Competency, Cloud Customer Relationship Management. This week, we've flipped the status switch from "Coming Soon" to "Available Now" – and in doing so, created a world of opportunity for emerging and established CRM Online partners. See Director of Microsoft Partner Network Product Management Niamh Coleman's launch post to learn everything there is to know about the new Competency:

- Partner value proposition
- CRM Online market opportunity
- Attainment requirements
- Full benefits
- Partner resources

Check out "Dynamics Partner Readiness For You!", a recent post by my friend and colleague, Sarah Arnold, for the latest in Dynamics partner news and training. Want to learn more about the Microsoft Partner Network and Competencies? See my recent four-post series "Your unfair cloud advantage – Microsoft Partner Network 101" or review the Microsoft Partner Network Quickstart Guide.

Already a partner? View your benefit statement and access your benefits through the partner portal. Join the discussion in the Microsoft Australia Small Business Reseller LinkedIn Group or the Microsoft Australia MPN Yammer Community. Need support? Contact the Regional Service Centre on 13 20 58, options 2, 4, 1 (Australian partners only) or visit the Partner Support Community.

Posted on 5 March 2015 | 2:00 pm

Product support lifetimes

If you are an IT manager or network administrator, you should keep an eye on the lifetime of the software (and hardware) on your network. Regardless of the vendor, all products (whether software or hardware, and even other consumable items) have a …

Posted on 5 March 2015 | 1:36 pm

Support for Consumer Groups, Statistical Functions, and Locale

The Stream Analytics team is pleased to announce several new features available today.

Support for Event Hub Consumer Groups

Last month's update added REST API support for specifying Consumer Groups on Event Hub inputs, and this is now surfaced in the Azure Management Portal. Consumer groups enable multiple consuming applications to each have a separate view of the event stream, and to read the stream independently, at their own pace and with their own offsets. When a consumer group is not specified, Stream Analytics jobs use the Default Consumer Group to ingest data from the Event Hub. Note that associating your Stream Analytics job with a specific Consumer Group is the way to work around the five-receiver maximum limitation that many customers have reached out to us about on the forum.

Support for Statistical Functions

Based on customer requests, we have extended the built-in functions of the Azure Stream Analytics Query Language to include the statistical functions VAR, VARP, STDEV, and STDEVP.

Locale Support

You can now specify the internationalization preference for a Stream Analytics job, which affects how the job parses, compares, and sorts incoming data. This setting is available on the Configure tab of a job.

Globalization and Localization

The Stream Analytics extension in the Azure Management Portal is now available in the following 11 languages: Simplified Chinese, Traditional Chinese, English, French, German, Italian, Japanese, Korean, Portuguese (Brazil), Russian, and Spanish.

Thanks!

Thank you for your continued feedback on Stream Analytics. We look forward to hearing from you on the Stream Analytics Forum and Azure Feedback Forum. Stay tuned for more product updates coming soon!
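For illustration, here is one way you might create such a dedicated consumer group from C# before pointing a Stream Analytics input at it. This is a hedged sketch with placeholder names, assuming the WindowsAzure.ServiceBus NuGet package; it is not part of the announcement above:

using System.Threading.Tasks;
using Microsoft.ServiceBus;

public static class ConsumerGroupSetup
{
    public static async Task EnsureGroupAsync()
    {
        // Placeholder connection string, Event Hub path, and group name.
        NamespaceManager manager = NamespaceManager.CreateFromConnectionString(
            "Endpoint=sb://mynamespace.servicebus.windows.net/;SharedAccessKeyName=...;SharedAccessKey=...");

        // A dedicated consumer group gives the Stream Analytics job its own
        // view of the stream, so it does not compete with other readers for
        // the five-receiver-per-group limit mentioned above.
        await manager.CreateConsumerGroupIfNotExistsAsync("myeventhub", "asa-consumer-group");
    }
}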

Posted on 5 March 2015 | 12:27 pm

Dynamics CRM Online CDU Information

If you are using Dynamics CRM Online, you may wonder why the last week of the month is not available in the CDU schedule for updates such as upgrading to CRM 2015. This is because that week is reserved for security patching and service updates, some of which may not be compatible with the CDU and need to be performed separately. Operational lockdowns (such as events) might also result in CDU dates not being available. There are no exceptions for those dates, as we cannot delay a site-wide update. More info on CDU: https://technet.microsoft.com/en-us/library/dn308237.aspx

Best Regards
EMEA Dynamics CRM Support Team

Posted on 5 March 2015 | 12:06 pm

Accessing the Bing Maps REST services from various JavaScript frameworks

On the Bing Maps forums I often come across developers who have difficulty accessing the Bing Maps REST services using different JavaScript frameworks such as jQuery and AngularJS. One common point of confusion is that passing a REST service request URL into the address bar of a browser works, but passing that same URL into an AJAX method in JavaScript doesn't. AJAX allows web pages to be updated asynchronously by exchanging small amounts of data with the server behind the scenes. This means that it is possible to update parts of a web page without reloading the whole page. In this blog post we are going to take a quick look at how to access the Bing Maps services using different JavaScript frameworks.

Using pure JavaScript

When using JavaScript on its own, one of the most common ways to access data using AJAX is the XMLHttpRequest object. Unfortunately, until the introduction of Cross-Origin Resource Sharing (CORS), XMLHttpRequest was limited to accessing resources and services hosted on the same domain as the webpage requesting them. CORS is now supported in most modern browsers and became a W3C standard in January 2014, but it also needs to be enabled on the server of the service. The Bing Maps REST services were released many years ago and currently do not support CORS. It is also worth knowing that many older browsers still in use today do not support CORS.

When the Bing Maps REST services were released, most web browsers did not support CORS, and a different technique was commonly used to make cross-domain calls to services: JSONP. JSONP is a technique which takes advantage of the fact that web browsers do not enforce the same-origin policy on script tags. To use JSONP with the Bing Maps REST services, you add a parameter to the request URL like this: &jsonp=myCallback. You then append a script tag to the body of the webpage using this URL. This will cause the browser to call the service and load the response like a JavaScript file, except that the JSON response will be wrapped in a call to a function called myCallback. Here is a reusable function you can use to add a script tag to the body of a webpage:

function CallRestService(request) {
    var script = document.createElement("script");
    script.setAttribute("type", "text/javascript");
    script.setAttribute("src", request);
    document.body.appendChild(script);
}

Now, let's say you want to call the REST services to geocode a location and have the response passed to a function called GeocodeCallback. This can be done using the following code:

var geocodeRequest = "http://dev.virtualearth.net/REST/v1/Locations?query=" + encodeURIComponent("[Location to geocode]") + "&jsonp=GeocodeCallback&key=YOUR_BING_MAPS_KEY";

CallRestService(geocodeRequest);

function GeocodeCallback(result) {
    // Do something with the result
}

Notice that we wrapped the location we wanted to geocode with a function called encodeURIComponent. This is a best practice, as web browsers can have difficulty with special characters, especially the ampersand symbol in a location, and encoding significantly increases the success rate of your queries. Additional tips on using the REST services can be found here. A full code sample of using the REST geocoding service in JavaScript with JSONP is documented here.

Using jQuery

jQuery (http://jquery.com) is a very popular JavaScript framework that makes it easier to develop JavaScript that works across different browsers. jQuery provides three different functions to make HTTP GET requests to services: jQuery.ajax ($.ajax), jQuery.get ($.get), and jQuery.getJSON ($.getJSON). The jQuery.get and jQuery.getJSON functions are meant to be simplified versions of jQuery.ajax and have less functionality; they do not support cross-domain requests or JSONP, whereas jQuery.ajax does. In order to make a cross-domain request using the jQuery.ajax function, you have to specify that it uses JSONP by setting the dataType property to "jsonp". For example, here is a reusable function that uses jQuery.ajax to download the results from the specified REST URL request and send them to a callback function:

function CallRestService(request, callback) {
    $.ajax({
        url: request,
        dataType: "jsonp",
        jsonp: "jsonp",
        success: function (r) {
            callback(r);
        },
        error: function (e) {
            alert(e.statusText);
        }
    });
}

Here is an example of how you can use this function to geocode a location and have the results sent to a callback function called GeocodeCallback:

var geocodeRequest = "http://dev.virtualearth.net/REST/v1/Locations?query=" + encodeURIComponent("[Location to geocode]") + "&key=YOUR_BING_MAPS_KEY";

CallRestService(geocodeRequest, GeocodeCallback);

function GeocodeCallback(result) {
    // Do something with the result
}

Using AngularJS

AngularJS is an open source JavaScript framework that lets you build well-structured, easily testable, and maintainable front-end applications using the Model-View-Controller (MVC) pattern. When making HTTP requests with AngularJS, the $http.get function is often used. This function wraps the XMLHttpRequest object and, as mentioned earlier, doesn't work when making cross-domain requests unless CORS is enabled on the server. AngularJS has another function, $http.jsonp, which allows you to easily make JSONP requests; using this instead of $http.get works well. For example, here is a reusable function that uses $http.jsonp to download the results from the specified REST URL request and send them to a callback function:

function CallRestService(request, callback) {
    $http.jsonp(request)
        .success(function (r) {
            callback(r);
        })
        .error(function (data, status, error, thing) {
            alert(error);
        });
}

Here is an example of how you can use this function to geocode a location and have the results sent to a callback function called GeocodeCallback:

var geocodeRequest = "http://dev.virtualearth.net/REST/v1/Locations?query=" + encodeURIComponent("[Location to geocode]") + "&jsonp=JSON_CALLBACK&key=YOUR_BING_MAPS_KEY";

CallRestService(geocodeRequest, GeocodeCallback);

function GeocodeCallback(result) {
    // Do something with the result
}

Notice how we have to set the jsonp parameter of the REST request URL to JSON_CALLBACK.

What about HTTP POST requests?

So far we have mainly focused on making HTTP GET requests, as the majority of the features in the Bing Maps REST services are accessed that way. GET requests are simple and fast but are limited by the length of the URL, which is capped at 2083 characters. There are, however, a few features in the REST services where you can make HTTP POST requests so that you can pass in more data than GET requests support:

- The Elevation service allows you to POST coordinate data. This is useful as this service allows you to pass up to 1024 coordinates in a single request.
- The Imagery service allows you to POST data for up to 100 pushpins when generating a static image of a map.

JSONP does not support HTTP POST requests, which limits us quite a bit. There is, however, a way around this: create a proxy service, host it on the same server as your webpage, and have it make the POST request to Bing Maps. If you are using Visual Studio, a Generic Handler can easily be turned into a proxy service. To do this, open Visual Studio and create a new ASP.NET Web Application called BingMapsPostRequests.

Screenshot: Creating ASP.NET Web Application Project

Next, add a Generic Handler by right-clicking on the project, selecting Add -> New Item, and choosing the Generic Handler option from the web template section. Call it ProxyService.ashx.

Screenshot: Adding Generic Handler

Open the ProxyService.ashx.cs file and update it with the following code. This code allows the proxy service to take in three parameters: a url to make the proxy request for, a type value which indicates whether the proxy service should make an HTTP GET or POST request, and a responseType value which indicates the content type of the response data. If the response type is an image, the response will be a Base64-encoded string, which is easier to work with in JavaScript.

using System;
using System.IO;
using System.Net;
using System.Web;

namespace BingMapsPostRequests
{
    public class ProxyService : IHttpHandler
    {
        public void ProcessRequest(HttpContext context)
        {
            //Get request URL from service.
            string source = context.Request.QueryString["url"];

            //Get the type of request to make.
            string type = context.Request.QueryString["type"];

            //Get the response type of the response data.
            string responseType = context.Request.QueryString["responseType"];

            //Default the response type to JSON if it is not specified.
            if (string.IsNullOrEmpty(responseType))
            {
                responseType = "application/json";
            }

            //Do not cache response
            context.Response.Cache.SetCacheability(HttpCacheability.NoCache);
            context.Response.ContentType = responseType;
            context.Response.ContentEncoding = System.Text.Encoding.UTF8;

            //Make required cross-domain POST request.
            var request = (HttpWebRequest)HttpWebRequest.Create(source);

            if (string.Compare(type, "POST", true) == 0)
            {
                request.Method = "POST";
                request.ContentType = "text/plain;charset=utf-8";

                var reader = new StreamReader(context.Request.InputStream);
                var encoding = new System.Text.UTF8Encoding();
                byte[] bytes = encoding.GetBytes(reader.ReadToEnd());
                request.ContentLength = bytes.Length;

                using (var requestStream = request.GetRequestStream())
                {
                    // Send the data.
                    requestStream.Write(bytes, 0, bytes.Length);
                }
            }

            using (var response = (HttpWebResponse)request.GetResponse())
            {
                //Format the response as needed and return to proxy service response stream.
                if (responseType.StartsWith("image"))
                {
                    using (var ms = new MemoryStream())
                    {
                        response.GetResponseStream().CopyTo(ms);
                        byte[] imageBytes = ms.ToArray();

                        // Convert byte[] to Base64 string.
                        string base64String = Convert.ToBase64String(imageBytes);
                        context.Response.Write("data:" + responseType + ";base64," + base64String);
                    }
                }
                else
                {
                    using (var stream = new StreamReader(response.GetResponseStream(), System.Text.Encoding.ASCII))
                    {
                        context.Response.Write(stream.ReadToEnd());
                    }
                }
            }
        }

        public bool IsReusable
        {
            get { return false; }
        }
    }
}

Next, add an HTML file by right-clicking on the project, selecting Add -> New Item, and choosing the HTML Page template. Call it index.html.

Screenshot: Creating an HTML page

Open the index.html page and update it with the following code. This code adds two buttons to the webpage which show how to use the proxy service to make requests to the Bing Maps REST Elevation and Imagery services.

<!DOCTYPE html>
<html xmlns="http://www.w3.org/1999/xhtml">
<head>
    <title></title>
    <script type="text/javascript" src="http://code.jquery.com/jquery-1.11.1.min.js"></script>
    <script type="text/javascript">
        var bingMapsKey = 'YOUR_BING_MAPS_KEY';
        var proxyService = "ProxyService.ashx?type=POST&url=";

        function GetElevationData() {
            var request = 'http://dev.virtualearth.net/REST/v1/Elevation/List?&key=' + bingMapsKey;

            //Generate mock data
            var coords = [];
            for (var i = 0; i < 100; i++) {
                coords.push(GetRandomCoordinate());
            }

            var postData = 'points=' + coords.join(',');

            //Call service
            CallPostService(request, postData, null, function (r) {
                document.getElementById('outputDiv').innerHTML = r;
            });
        }

        function GetStaticImage() {
            var request = 'http://dev.virtualearth.net/REST/v1/Imagery/Map/Road/?key=' + bingMapsKey;

            //Generate mock data
            var pushpins = [];
            for (var i = 0; i < 100; i++) {
                pushpins.push('pp=' + GetRandomCoordinate() + ';23;');
            }

            var postData = pushpins.join('&');

            //Call service
            CallPostService(request, postData, 'image/png', function (r) {
                document.getElementById('outputDiv').innerHTML = '<img src="' + r + '"/>';
            });
        }

        function CallPostService(requestUrl, postData, responseType, callback) {
            responseType = (responseType) ? responseType : '';

            $.ajax({
                type: 'POST',
                url: proxyService + encodeURIComponent(requestUrl) + '&responseType=' + responseType,
                dataType: "text",
                contentType: 'text/plain;charset=utf-8',
                data: postData,
                success: function (r) {
                    callback(r);
                },
                error: function (e) {
                    document.getElementById('outputDiv').innerHTML = e.responseText;
                }
            });
        }

        function GetRandomCoordinate() {
            //Limit coordinates to a small region in the US.
            var lat = Math.random() * 10 + 30;   //Latitude range 30 - 40 degrees
            var lon = Math.random() * 10 - 100;  //Longitude range -100 to -90 degrees
            return lat + ',' + lon;
        }
    </script>
</head>
<body>
    <input type="button" value="Get Elevation Data" onclick="GetElevationData();"/>
    <input type="button" value="Get Static Image" onclick="GetStaticImage();" />
    <br/><br />
    <div id="outputDiv" style="width:500px;"></div>
</body>
</html>

When the first button is pressed, it makes a request to get the elevations of 100 random coordinates in the USA. This results in the page looking something like this:

Screenshot: Response from POST Elevation request

When the second button is pressed, it makes a request to get a static map image with 100 random pushpins. The response from the proxy service is a Base64-encoded string, which we can pass as the source for an image tag to render the image on the page:

Screenshot: Response from POST Imagery request

Full source code for the proxy service example can be downloaded here.

- Ricky Brundritt, Bing Maps Program Manager

Posted on 5 March 2015 | 11:56 am

Enlightened MSDN - Understanding the Benefits of MSDN

Development teams that have MSDN subscriptions see immense value in the offering. Yes, the subscription gives the user assigned the MSDN license the ability to download enterprise-grade software from Microsoft for dev and test use, but there is so much more to MSDN than just software. On this episode of Breakpoint, Jonathan, Paul, and guest expert Marco go through the ins and outs of the MSDN subscription offering, including the various tiers of MSDN subscriptions and the associated benefits (which include gold nuggets like free training, Azure benefits, and partner software discounts, to name a few).

Have Questions for Paul and Jonathan? While watching the episode, post your comments below, tweet your question to @cdndevs with the hashtag #breakpointca, post your question on the Microsoft Developer Facebook wall, or start a new conversation in the Canadian Developer Connection group on LinkedIn.

On the Next Episode of Breakpoint: The Power of the Manual Tester and How Visual Studio Makes You Even Better

When you ask people who build software what some of the common tasks are, a large majority of responses include the word "testing." Testing is IMPORTANT, and every software project needs it. In fact, the sooner testing is introduced into the lifecycle, the more effective it is. In this episode of Breakpoint, we are going to discuss the art of manual testing. By the end of the episode, you will know different strategies to implement within your software project, how to integrate manual testing into your software, and when manual testers should be involved. You will also learn how the Visual Studio suite of manual testing tools makes this job very easy for testers, and how you can leverage the data from your test runs to gain insights into how healthy your project is.

REGISTER TO RECEIVE A REMINDER >>

Posted on 5 March 2015 | 11:29 am

Webinar: Hear from Consolidated Edison on how to improve capital investment governance on March 12th at 2PM EST

Faced with ever-changing demands, Consolidated Edison needed a standard yet agile approach to continuously maximize the strategic value of its capital investment portfolio all year round. Consolidated Edison delivers power to approximately 3.7 million electric and 1.2 million gas customers in the world's most dynamic and demanding marketplace – metropolitan New York. It's a challenging job and an essential service that Con Edison performs safely and reliably. Yet, more and more, priorities have been compounded by climate change, customers concerned with costs, sustainability, and changing regulations. Increasingly, it has become critical to track and evaluate where every dollar of the utility's $2 billion annual capital budget is spent, and to fund projects that best align with Con Edison's long-term strategy and the optimal balance of stakeholders' needs.

Join Microsoft partner UMT Consulting Group and Frank LaRocca, Director of Business Improvement Services at Con Edison, for a free webinar on Thursday, March 12th at 2PM Eastern Standard Time (EST), dedicated to PMO, finance, operations, and program management executives from utilities and other large enterprises:

- Learn from ConEd's experiences and insights on building governance and an enterprise project management office for multi-billion-dollar capital investment portfolios.
- Hear about the latest innovations in agile operating models that deliver and exceed historical performance across electric, gas, and steam operations.

To hear more details on the Con Edison case study go here, and to register for this free webinar please go here. – Jon C. Arnold

Posted on 5 March 2015 | 11:03 am

Successful Digital Vision Starts at the Top

Business change is tough. Just try it at Cloud speed, and you'll know what I mean. That said, digital business transformation is reshaping companies and industries around the world at a rapid rate. If you don't cross the Cloud chasm and learn how to play in the new digital economy, you might just get left behind.

Sadly, not every executive has a digital vision. That's a big deal, because the pattern here is that successful digital business transformation starts at the top of the company. And it starts with digital vision. But just having a digital vision is not enough. It has to be a shared transformative digital vision. Not a mandate, but a shared digital vision from the top that's led and made real by the people in the middle and lower levels.

In the book Leading Digital: Turning Technology into Business Transformation, George Westerman, Didier Bonnet, and Andrew McAfee share how successful companies and executives drive digital business transformation through shared transformative digital visions.

Employees Don't Always Get the WHY, WHAT, or HOW of Digital Business Transformation

You need a digital vision at the top. Otherwise, it's like pushing rocks uphill. Worse, not everybody will be in the game, or know what position they play, or even how to play the game.

Via Leading Digital: Turning Technology into Business Transformation:

"The changes being wrought through digital transformation are real. Yet, even when leaders see the digital threats and opportunity, employees may need to be convinced. Many employees feel they are paid to do a job, not to change that job. And they have lived through big initiatives in the past that failed to turn into reality. To many, digital transformation is either irrelevant or just another passing fad. Still other people may not understand how the change affects their jobs or how they might make the transition."

Only Senior Executives Can Create a Compelling Vision of the Future

Digital business transformation must be led. Senior executives are in the right position to create a compelling future all up, and communicate it across the board.

Via Leading Digital: Turning Technology into Business Transformation:

"Our research shows that successful digital transformation starts at the top of the company. Only the senior-most executives can create a compelling vision of the future and communicate it throughout the organization. Then people in the middle and lower levels can make the vision a reality. Managers can redesign processes, workers can start to work differently, and everyone can identify new ways to meet the vision. This kind of change doesn't happen through simple mandate. It must be led.

Among the companies we studied, none have created true digital transformation through a bottom-up approach. Some executives have changed their parts of the business--for example, product design and supply chain at Nike--but the executives stopped at the boundaries of their business units. Changing part of your business is not enough. Often, the real benefits of transformation come from seeing potential synergies across silos and then creating conditions through which everyone can unlock that value. Only senior executives are positioned to drive this kind of boundary-spanning change."

Digital Masters Have a Shared Digital Vision (While Others Do Not)

As the business landscape is reshaped, you are either a disruptor or the disrupted. The Digital Masters that are creating the disruption in their businesses and in their industries have shared digital visions, and re-imagine their business for a mobile-first, Cloud-first world and a new digital economy.

Via Leading Digital: Turning Technology into Business Transformation:

"So how prevalent is digital vision? In our global survey of 431 executives in 391 companies, only 42 percent said that their senior executives had a digital vision. Only 35 percent said the vision was shared among senior and middle managers. These numbers are surprisingly low, given the rapid rate at which digital transformation is reshaping companies and industries. But the low overall numbers mask an important distinction. Digital Masters have a shared digital vision, while others do not. Among the Digital Masters that we surveyed, 82 percent agreed that their senior leaders shared a common vision of digital transformation, and 71 percent said it was shared between senior and middle managers. The picture is quite different for firms outside our Digital Masters category, where less than 30 percent said their senior leaders had a shared digital vision and only 17 percent said the shared vision extended to middle management."

Digital Vision Is Not Enough (You Need a Transformative Digital Vision)

It's bad enough that many executives don't have a shared digital vision. What makes it worse is that even fewer have a transformative digital vision, which is the key to success on the digital frontier.

Via Leading Digital: Turning Technology into Business Transformation:

"But having a shared digital vision is not quite enough. Many organizations fail to capture the full potential of digital technologies because their leaders lack a truly transformative vision of the digital future. On average, only 31 percent of our respondents said that they had a vision which represented radical change, and 41 percent said their vision crossed internal organizational units. Digital Masters were far more transformative in their vision, with two-thirds agreeing they had a radical vision, and 82 percent agreeing their vision crossed organizational silos. Meanwhile, nonmasters were far less transformative in their visions."

Where there is no vision, the business perishes.

You Might Also Like

Cloud Changes the Game from Deployment to Adoption
Drive Business Transformation by Reenvisioning Operations
Drive Business Transformation by Reenvisioning Your Customer Experience
Dual-Speed IT Drives Business Transformation and Improves IT-Business Relationships
How Leaders are Building Digital Skills
How To Build a Better Business Case for Digital Initiatives
How To Improve the IT-Business Relationship
How To Build a Roadmap for Your Digital Business Transformation

Posted on 5 March 2015 | 10:49 am

Adding gulp to Visual Studio Online Builds

At one of our Code in the Cloud camps, someone asked me how they could integrate front-end build tasks in the Visual Studio Online (VSO) build process. Lately we've been adding great support for gulp/grunt/bower to the Visual Studio IDE (check out this post from Scott Hanselman to learn more), but I have to admit that I'd never included this in a VSO Team Build. In this post I'll show you how you can integrate a front-end build process in Visual Studio Online Build.

Disclaimer: the process described below is a demonstration of how you can integrate gulp into a Visual Studio Online Team Build process. This is by no means intended to be a best practice, and to keep things simple, the scripts below lack extensive error handling.

Step 1: Creating a Visual Studio Online Team Project

Visual Studio Online is Microsoft's application lifecycle management solution in the cloud. It allows you to manage all aspects of your development process, from planning what needs to be done, to source control to manage your source code files, to hosted build infrastructure and even load test infrastructure. Visual Studio Online is free for teams of up to 5 members. In the context of this post, we'll be using the source control repository and hosted build services of VSO.

Once you're signed up for Visual Studio Online, you can create your first Team Project. In this blog post we'll be using git-based source control; this will allow you to connect to the source code repository using any existing git client. Note that VSO source control can be used for any development technology and programming language. To create a new Team Project, navigate to your VSO account (http://<account>.visualstudio.com) and click the 'New…' link. Enter the details and press the 'Create Project' button.

Step 2: Setting up the Web Application project

To keep things simple, we'll create a basic Lorem Ipsum generator website using ASP.NET and Visual Studio; however, you could use any other web technology. Make sure that in Visual Studio you're connected to the Team Project (from Team Explorer, Connect to Team Projects, and then Select Team Projects…). In Team Explorer, click the Home button and press the New… link to create a new solution that is added to source control. Now create an empty ASP.NET Web Application project.

Secondly, add an HTML file 'index.html' to the project. When you press the 'Submit' button, the generateLoremText JavaScript function is invoked. This function is included in a separate JS file, which we will minify using gulp.

<!DOCTYPE html>
<html>
<head>
    <script src="js/lorem.min.js"></script>
</head>
<body>
    <h1>Lorem Ipsum Generator</h1>

    <p>How many paragraphs do you need (1-100):</p>

    <input id="countInput" type="number">

    <button type="button" onclick="generateLoremText()">Submit</button>

    <p id="validationText"></p>

    <hr />

    <p id="loremText"></p>

</body>
</html>

Now add a folder 'js' to the project, which will contain our client-side JavaScript code. Then add a new JavaScript file 'lorem.js' to this folder. The script is fairly easy and self-explanatory.

function generateLoremText() {
    var count, validationText, loremText;

    loremText = "";
    validationText = "";

    count = document.getElementById("countInput").value;

    if (isNaN(count) || count < 1 || count > 100) {
        validationText = "Invalid input";
    } else {
        loremText = createLoremText(count);
    }

    document.getElementById("validationText").innerHTML = validationText;
    document.getElementById("loremText").innerHTML = loremText;
}

function createLoremText(count)
{
    var loremText = "";

    for (i = 0; i < count; i++)
    {
        loremText = loremText + "Lorem ipsum dolor sit amet, consectetur adipiscing elit, sed do eiusmod tempor incididunt ut labore et dolore magna aliqua. Ut enim ad minim veniam, quis nostrud exercitation ullamco laboris nisi ut aliquip ex ea commodo consequat. Duis aute irure dolor in reprehenderit in voluptate velit esse cillum dolore eu fugiat nulla pariatur. Excepteur sint occaecat cupidatat non proident, sunt in culpa qui officia deserunt mollit anim id est laborum.";
    }

    return loremText;
}

The project structure should now look something like this:

Step 3: Configuring gulp for your web application

We're going to use gulp to perform our front-end build tasks; you could just as well integrate with grunt or bower. As mentioned before, Visual Studio already provides grunt/gulp/bower integration in the IDE; however, this runs the build tasks on the developer machine and not as part of our centralized build process. There are basically two approaches for customizing the Visual Studio Team Build process:

- Create a custom Team Build template that includes activities for invoking the front-end build tasks. The team at MS OpenTech has created a custom build template and corresponding activities to invoke Grunt as part of your Team Build.
- Alternatively, the default build templates in TFS and Visual Studio Online now have the ability to invoke pre- and post-build scripts, in which you can then invoke the grunt/gulp/bower commands.

In this blog post we'll take the second option, as it allows us to quickly extend the default build process without the overhead of creating a custom build template. To achieve this, we are going to add 3 files to our Web Application project:

- gulpfile.js: contains the gulp script describing all build tasks that need to be performed.
- package.json: contains the list of package dependencies for the gulp script.
- clientcompile.bat (note: you can also use PowerShell): batch script that installs the dependency packages and then invokes the gulp command.

Your project structure should now look like this:

The gulpfile.js file contains the different gulp tasks – if you require plugins, make sure to add the necessary dependencies in the package.json file. The script below will minify (uglify) our lorem.js script and rename it to lorem.min.js. You'll notice that I'm passing in a parameter (PATH) that defines the output path for the minified script – I'm using the yargs gulp plugin for this. This path will resolve to the '_PublishedWebsites/…' folder on the Team Build drop location.

'use strict';

var gulp = require('gulp'),
    rename = require('gulp-rename'),
    uglify = require('gulp-uglify'),
    argv = require('yargs').argv;

var DEST = argv.PATH;
if (!DEST) { DEST = '.'; }

gulp.task('default', function() {
    return gulp.src('**/js/lorem.js')
        // This will minify and rename to lorem.min.js
        .pipe(uglify())
        .pipe(rename({ extname: '.min.js' }))
        .pipe(gulp.dest(DEST));
});

Our package.json file will contain the necessary packages to be installed:

{
  "name": "GruntVSO",
  "version": "1.0.0",
  "description": "Grunt VSO integration sample",
  "main": "index.js",
  "author": "Nick Trogh",
  "license": "MIT",
  "repository": { "url": "http://github.com" },
  "readme": "Grunt VSO integration sample",
  "dependencies": {
    "gulp": "^3.8.11",
    "gulp-rename": "^1.2.0",
    "gulp-uglify": "^1.1.0",
    "yargs": "^3.4.5"
  }
}

Finally, the clientcompile.bat file invokes the different pieces of the front-end build process (the line numbers below are referenced in the explanation that follows):

 1: REM Script arguments
 2: REM [1] project source folder
 3: REM [2] output folder
 4:
 5: REM Create the npm folder in the user profile to fix a bug with npm on Windows
 6: MKDIR C:\Users\buildguest\AppData\Roaming\npm
 7:
 8: REM Set the working folder to the project source folder to correctly install npm packages locally
 9: CD "%TF_BUILD_SOURCESDIRECTORY%%1"
10:
11: REM Install all dependencies through npm
12: CALL npm install
13:
14: REM Invoke gulp, passing in the output folder for the minified JS scripts
15: CALL node_modules\.bin\gulp --PATH "%TF_BUILD_BINARIESDIRECTORY%\%2"

The batch script takes 2 input parameters, which we will provide in our Team Build definition. The first parameter defines the path to the sources of our Web Application project on the build server. The second parameter defines the output folder to write our minified JavaScript file to. Both input parameters are relative paths and are made absolute based on TFS Build environment variables.

The first step in the script (line 6) creates a folder for npm in the user profile – this is a workaround for a bug with the npm installation on Windows. On lines 11-12, the node package manager is invoked to install all dependencies locally (in the node_modules folder); it takes our package.json file into account for this. After installing the packages, we can invoke gulp and pass in the output path for the minified files (using --PATH).

Make sure to commit and push your changes into source control once ready.

Step 4: Setting up a team build

The final step is to set up our Team Build for our Web Application – this is achieved by creating a new build definition. The build definition describes the different steps that make up our build process: getting our sources from source control, compilation, invoking our front-end build tasks, and finally copying the output to the drop folder.

To create a new build definition, go to Visual Studio, navigate to the Team Explorer, and open the Builds section. This view shows the existing build definitions and also shows past build execution results. Create a new build definition by clicking the 'New Build Definition' link. In the Trigger tab you can choose to set up a manual build or a continuous integration build.

To invoke our script for front-end build tasks (clientcompile.bat), we need to configure the build process. Open the 'Process' tab and expand section '5. Advanced' inside the Build section. Specifically, we're going to provide the 'Post-build script' information. The script will be executed on the build infrastructure after the source code has been compiled – hence, post-build. For the script, navigate to the clientcompile.bat file. For the script arguments, we need to provide 2 values, as described above:

- The relative source code folder for our project: \WebApplication1\WebApplication1
- The relative output folder: _PublishedWebsites\WebApplication1 (if you choose 'SingleFolder' for the output location, Team Build copies all output for a Web Application to the _PublishedWebsites folder on the drop location)

This is the only configuration we need to perform on our build definition to invoke our JavaScript build tasks. All we need to do now is queue a new build, or perform a source code modification in the case of continuous integration. You can queue a build by right-clicking on the build definition.

Once our build has completed, we can check the output of the build by downloading the contents of the drop location. You do so by right-clicking the build result in Visual Studio and choosing 'Open in Browser'. On the build output details page you can then download the output of the build. In addition, you can view a detailed output of the build process in the 'Diagnostics' tab.

When you download the build output zip file, you will find the lorem.min.js file inside the js folder. Opening the file in Visual Studio or a text editor shows the minified contents of the original lorem.js file.

Summary

As you can see, using the pre- and post-build scripts you can very easily integrate custom logic into a Visual Studio Online Team Build process. By adding your front-end build process into your central build/continuous integration process, you can have the same repeatable and controlled process for both server-side and client-side builds. In a next post I'll explain how you can include front-end build tasks in your Azure Websites continuous deployment process.

Additional Resources

Come join us for a Code in the Cloud developer camp to learn more about Visual Studio, or attend one of our other developer camps. Check out our free online trainings in Microsoft Virtual Academy:

- Fundamentals of Visual Studio Online
- Using Git with Visual Studio 2013

Get your free Visual Studio Online account and the free Visual Studio Community IDE.

Posted on 5 March 2015 | 10:39 am

Top News from February 2015

The Visual Studio team takes great effort to ensure that our developer community is engaged and informed through various social networks, including channels on Twitter, Facebook, Google+, etc., and as part of the process we gain insights into what you enjoy through your retweets, Facebook likes and shares, and other public opinions of our features and content. Your participation and feedback is essential, as it lets us see what stories you care about. To share our own understanding of your interests, I post a daily "Top 10 Most Active Stories" article over on my blog at lyalin.com. This month we've decided to take it one step further and go beyond just the daily data. I've reviewed a full month of activity for February 2015 and am posting here the top eleven most active blog posts that were trending. I hope you find this list useful and interesting, and of course we welcome your feedback.

Top Trending Blogs for February 2015

1. .NET CoreCLR is now Open Source. Furthering the .NET open source story, our .NET team blogs that CoreCLR is now Open Source, where they announce the availability of CoreCLR on GitHub. The post also includes a Channel9 chat with the .NET Core team, where they discuss CoreCLR and the CoreCLR repo, and provides a console app walkthrough that developers can follow to build .NET console apps.

2. Scott Hanselman analyzes the .NET CoreCLR repo on GitHub using Power BI. Scott Hanselman covers .NET Core open source in his blog post The .NET CoreCLR is now open source, so I ran the GitHub repo through Azure Power BI, where he shares some interesting data. He shows how anyone can use Microsoft Power BI to pull in a GitHub repo, perform BI analytics, and gain insights into several aspects of a project, such as who contributes how much, and issues found and closed.

3. .NET Open Source Update. Immo Landwerth from the .NET team blogs on .NET Core Open Source Update, where he shares his view on just how great an experience open source projects have been for him and the team. He talks about why we have open sourced .NET Core, and all the collaborative goodness you can take part in with code and API reviews.

4. Scott Guthrie introduces ASP.NET 5. A cloud-optimized, lean, and cross-platform open source web framework, "ASP.NET 5", was made available in February 2015. Scott Guthrie blogs on Introducing ASP.NET 5, where he goes into the details of each change to ASP.NET. He provides screenshots and code snippets for the various architectural improvements we have made with this release; for example, to provide a streamlined development experience we make use of dynamic compilation, so you no longer have to compile your application every time you make a change. You simply (1) edit the code, (2) save your changes, (3) refresh the browser, and (4) see your changes appear automatically.

5. Understanding .NET 2015. As part of a blog series focused on line-of-business applications, Beth Massi posted Understanding .NET 2015, in which she walks readers through a high-level overview of recent .NET innovations and the continued voyage into open source. A lot has happened since we released .NET 2015 at the Connect(); event, and this blog post will get you caught up.

6. Free Azure ebook on fundamentals. The MS Press blog posted on a recently released free ebook: Microsoft Azure Essentials - Fundamentals of Azure. This book covers the fundamentals of Microsoft Azure and shares walkthroughs and examples to help you get started right away. It also discusses common tools useful when creating or managing Azure-based solutions, so it's a great asset to check out.

7. Free O'Reilly report on data science in the cloud. The Microsoft Machine Learning blog posts on a free O'Reilly report on Data Science in the Cloud, which shows how developers can use cloud-based tools with existing techniques like R to design machine learning models. The report uses practical data science examples, with relevant data sets and R scripts available on GitHub to make it easy to consume.

8. AngularJS in Visual Studio. The Visual Studio team blogs on Using AngularJS in Visual Studio. In this post they talk about how an extension to Visual Studio built by John Bledsoe (in collaboration with Jordan Matthiesen) helps with existing issues for developers working with AngularJS. This post also shares code snippets and details on the various features you can use to make your life easier.

9. Babylon.js 2.0. David Catuhe blogs on What's new in Babylon.js v2.0. He showcases some amazing audio-visual demos, special effects, and performance improvements in this release. For those that don't know what Babylon.js is, it's a 3D engine based on WebGL and JavaScript, which gives developers some interesting opportunities to build 3D renderings or games that work in many modern browsers – certainly worth checking out.

10. Scott Hanselman on how JavaScript has won. Scott Hanselman blogs on JavaScript Has Won: Run Flash with Mozilla Shumway and Develop Silverlight in JS with Fayde. It's an interesting post that highlights how JavaScript can be used to run Flash and Silverlight apps without Flash and Silverlight. Don't take my word for it! Check out the post for the how and why of it.

11. Visual Studio 2015 CTP 6 & Team Foundation Server 2015 CTP. Last but definitely not least, many of you were excited to see our blog post on the release of Visual Studio 2015 CTP 6 and the Team Foundation Server 2015 CTP. In this blog post, John Montgomery covers highlights from both of these releases, with additional links to posts, articles, and docs that talk about each topic in detail. Check out this post and download the bits to get started testing the newest release.

Feedback

I hope you enjoy this list. I will bring you a new one each month, so stay tuned, and please don't hesitate to send me feedback using Twitter or by commenting below. Thank you for reading.

Dmitry Lyalin, Sr. Product Manager for Visual Studio (@lyalindotcom)

Dmitry has been at Microsoft for over 6 years, working first in Microsoft Consulting and Premier Support out of NYC before joining the Visual Studio team and moving to Redmond, WA. In his spare time he loves to tinker with code, create apps, and is an avid PC gamer.

Posted by on 5 March 2015 | 10:30 am

TypeScript <3 Angular

Three years ago, we introduced TypeScript, a typed superset of JavaScript for application development at scale, offering compile-time type checking and richer tooling integration. Since then, we've seen great adoption and usage of TypeScript across a wide range of projects and applications, from Adobe's Digital Publishing Suite to Mozilla's Shumway project and the great Asana web application. At the same time, the ecosystem of framework and tool partners around TypeScript has also grown quickly, from tools for Eclipse developed by Palantir to TypeScript support in JetBrains' WebStorm and the over 700 developers who have contributed to the DefinitelyTyped project.

TypeScript + Angular 2

Today, we're excited to talk about another great framework partner we are working with. For the last several months, the Microsoft TypeScript and Google Angular teams have been working closely together, and today at ng-conf in Salt Lake City they are unveiling the first fruits of that collaboration. We're excited to announce that we have converged the TypeScript and AtScript languages, and that Angular 2, the next version of the popular JavaScript library for building web sites and web apps, will be developed with TypeScript. Working closely with a rich library like Angular has helped TypeScript evolve additional language features that simplify end-to-end application development, including annotations, a way to add metadata to class declarations for use by dependency injection or compilation directives. Even more than the language innovations and the library the two teams have built, I'm proud of the productive relationship and partnership we've established between the TypeScript and Angular teams. Both teams look forward to continuing to move TypeScript and JavaScript forward together, including working with the ECMAScript standards body on the future of types in JavaScript.

Next Steps for TypeScript

Since the release of TypeScript 1.0 last year, the TypeScript team has been hard at work making further improvements to the language and tools. As the ECMAScript 6 standard solidifies, we've been adding ES6 syntax and features to TypeScript. We've also been building user-requested features, such as ECMAScript 7 async/await, to make writing asynchronous code dramatically easier. And we've continued to invest in our architecture, making common build tasks up to 4x faster than they were in TypeScript 1.0. You'll see the results of all this work in the upcoming TypeScript 1.5 and future versions. In addition to the work on the language, we've continued to improve Visual Studio's powerful environment for building TypeScript apps, with type-supplemented IntelliSense, go to definition, refactor/rename, project templates to get you started, and integrated build support. If you have Visual Studio 2013 Update 2 or later, you already have TypeScript.

Conclusion

It's great to see the continued growth in the TypeScript ecosystem, and I'm particularly excited to be partnering with Google's Angular team to align our work on TypeScript and Angular 2. You can learn more about today's Angular announcements and keep up with TypeScript on GitHub. Namaste!
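To give a flavor of what such annotations can look like, here is a minimal, hypothetical sketch in TypeScript. The Component decorator and its selector option are made up for illustration (this is not the announced Angular 2 API); the pattern simply shows metadata being attached to a class declaration for a framework to pick up later, using the experimental decorator support planned for TypeScript 1.5:

// Hypothetical decorator factory: records metadata on the class constructor.
function Component(options: { selector: string }) {
    return function (target: Function) {
        // Stash the metadata where a framework or DI container can find it.
        (<any>target).annotations = [options];
    };
}

@Component({ selector: "todo-app" })
class TodoApp {
    // The class body stays plain; everything the framework needs to know
    // about it lives in the declarative metadata above.
}

The appeal of the design is exactly that separation: the code describes behavior, while the annotation carries configuration that tooling, dependency injection, or a compiler can consume.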

Posted by on 5 March 2015 | 10:30 am

Halftime in Amsterdam – Time to Listen

"How time flies!" is something most people tend to hear from their own parents or grandparents. Yet a feeling of nostalgia somehow came over many FY14 MACHs as we arrived in Amsterdam on 13-14 January for the second big international workshop: exactly one week before that date in 2014, the MACH program had started for us. Within that year we experienced a tremendous amount, be it the brilliant social project, MGX with 15,000 employees from all over the world, the many different trade fairs and events (to get you excited about Microsoft), or the 346,387 visits to beer gardens and restaurants. In that time, some colleagues became friends.

So MACH FY14 came full circle in Amsterdam. Many of those friends had already traveled in together the previous weekend to experience the city. Alongside the classic city tours and shopping trips, evening pub crawls and restaurant visits were not neglected either. A beautiful, very livable city, where you can buy burgers and fries from a vending machine around the clock.

On the first day of the "Accelerate Results" training, the breaks were often filled with conversations like "Hi! I know you from MGX. How have you been?" With MACHs from 60 different nations, plenty of familiar faces crossed paths. Beyond the fun of the reunion, we were able to relate much of the training content to our daily work, because learning about emotional intelligence and listening is relevant to every job role. Alongside various discussion rounds and a psychological assessment, we were asked to actively present our group work. The different types of listeners were discussed, and the trainer offered guidance within the group. Discussions in pairs and threes gave us the chance to get to know individual colleagues from other MACH programs even better.

The first day closed with a small cocktail reception, after which everyone headed out in the evening to explore the magnificent city of Amsterdam on their own.

Of course, at our trainings we didn't just make contacts with colleagues from the other subsidiaries; we also worked on our soft skills. As mentioned, the focus of the entire Accelerate Results training was emotional intelligence. Using a self-assessment method, we identified concrete traits that characterize our behavior and that we can choose to apply more strongly or dial back. On the second day the coaches opted for a more hands-on approach and split us into teams to build Rube Goldberg machines. "A Rube Goldberg machine is a nonsense machine that deliberately performs a given task in numerous unnecessary and complicated individual steps. It serves no practical purpose; it is meant to amuse those watching it." At least, that's the description you'll find on Wikipedia. So we spent most of the second day catapulting ping-pong balls and marbles across the room to knock over rows of dominoes, which in turn set toy cars rolling, which were hauled up ramps by pulleys, where the whole thing started over again, until houses of cards finally came tumbling down... that, or something entirely different, is roughly what the results of our Rube Goldberg teamwork looked like.

We evaluated our emotional intelligence in five blocks (self-awareness, self-regulation, motivation, empathy, and social skills) and then drew personal conclusions to take back into our everyday work. Networking was also a major theme in Amsterdam: together we worked out a structure for our networks that distinguishes between a close network, an extended network, and strategic contacts. Building up your own network is only the first step, though; keeping those contacts alive afterward is much harder, especially in a company that spans the entire globe. How can we manage that? By planning shared experiences outside our everyday work that we will still remember years later. MGX and our second meeting in Amsterdam certainly created the right setting for that.

A surprising insight for all of us was that personal performance on the job, meaning goals or targets achieved, makes up only about 10% of one's success. Reputation, the image colleagues and managers have of you, counts for 30%, while "exposure", the visibility we create for our projects within our own and other teams and across the company, contributes the most at 60% toward our work being recognized as an outstanding achievement.

On the topic of listening, one thing in particular stuck with us: "Aim to be interested, not interesting." How often, while being told something, do we start thinking, before the other person has even finished their sentence, about the story we want to add or the argument we'll make next? A good tip from our coach Troy was to simply switch off for once and concentrate on what the other person wants to tell us, instead of immediately thinking about our own experiences.

Bottom line: in a short break from the daily routine, we as MACH FY14 learned a great deal about emotional intelligence, and the fun was not neglected either. Stay tuned for more to come!

Lena & Moritz

Posted by on 5 March 2015 | 10:03 am

Angular 2: Built on TypeScript

We're excited to unveil the result of a months-long partnership with the Angular team. This partnership has been a very productive and rewarding experience for us, and as part of this collaboration, we're happy to announce that Angular 2 will be built with TypeScript. We're looking forward to seeing what people build with these new tools, and to continuing to work with the Angular team to improve the experience for Angular developers.

The first fruits of this collaboration will appear in the upcoming TypeScript 1.5 release. We have worked with the Angular team to design a set of new features that help you write cleaner code when working with dynamic libraries like Angular 2, including a new way to annotate class declarations with metadata. Library and application developers can use these metadata annotations to cleanly separate code from information about the code, such as configuration information or conditional compilation checks. We've also added a way to retrieve type information at runtime. When enabled, this lets developers perform simple type introspection to verify code correctness with additional runtime checks, and it enables libraries like Angular to use type information to set up dependency injection based on the types themselves.

TodoMVC for Angular 2 in TypeScript

At ng-conf, we are previewing this work by showing a TodoMVC example based on David East's Angular 2 TodoMVC. You can try this example out for yourself. If you're new to TypeScript, you can also learn TypeScript through our interactive playground. We'd love to hear your feedback.

TypeScript autocomplete in Sublime 3 for Angular 2

We're looking forward to releasing a beta of TypeScript 1.5 in the coming weeks and, along with it, growing TypeScript's tooling support to include more development styles and environments. We'd also like to give a huge thanks to Brad, Igor, and Miško on the Angular team for being great partners. Special shout-out to Yehuda Katz, who helped us design the annotation+decorator proposal that made this work possible.
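As a rough sketch of what runtime type information makes possible, consider dependency injection driven by constructor parameter types. Everything here is hypothetical and assumes the experimental decorator metadata coming with TypeScript 1.5 (compiled with decorator-metadata emit enabled) plus the reflect-metadata polyfill; Injectable and TodoService are made-up names:

import "reflect-metadata";

class HttpClient { }

// An empty class decorator is enough for the compiler to emit
// "design:paramtypes" metadata describing constructor parameter types.
function Injectable(target: Function) { }

@Injectable
class TodoService {
    constructor(private http: HttpClient) { }
}

// A DI container can now discover the dependency at runtime:
var deps = Reflect.getMetadata("design:paramtypes", TodoService);
// deps is [HttpClient], so the container knows to build an HttpClient
// before it can build a TodoService; no registration code was needed.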

Posted by on 5 March 2015 | 10:00 am

From Data to Insights to Action at the PASS Business Analytics Conference

Are you a key member of the team delivering business analytics and intelligence for your organization? Finding BA/BI to be a more significant part of your day-to-day work life? Not your role yet, but you have a friend or colleague working in business and data analytics? For anyone trying to stay ahead of the analytics curve and position your data career for success, we invite you to join us and the Microsoft BI, Azure, and Office teams at the PASS Business Analytics Conference in Santa Clara, April 20-22, and we also want you to spread the word!

Featuring 60+ in-depth sessions, hands-on labs, and expert panels, the conference will cover the essential skills and latest best practices for integrating, analyzing, visualizing, and reporting on company data, as well as unlocking the power and promise of Big Data. Power BI will be front and center at the conference. Enjoy deep dives into Power BI's dashboard capabilities, out-of-the-box solutions for services such as Salesforce, Marketo, and CRM Dynamics, the Designer, SSAS connectivity, mobile apps, Q&A, and much more. Our top experts and other industry leaders will be going inside Power BI, Excel, data visualization, predictive analytics, and using SQL and R, and sharing how to get the most from your business data and make better data-driven decisions.

Here are some of our sessions you won't want to miss:

Best Practice Mobile Dashboard Design - Jen Underwood
Pour Some Data on Me: Discover, Manage, Analyze, and Visualize Your Data with Power BI - Miguel Martinez
Advanced Modelling and Calculation Using the Power BI Designer - Kasper de Jonge
Data Hunters & Gatherers: Discover, Acquire, and Transform Your Data with Power Query - Miguel Llopis
Instant Dashboards for Your Critical Business Applications: Dynamics, Salesforce, Marketo and more! - Theresa Palmer-Boroski & Faisal Mohamood
=Excel(): Your Most Personal Data Analysis Tool - Ashvini Sharma & David Gainer
Find the Information You Need from Your Data with Natural Language - Adam Wilson
Data-Driven Healthcare: Improving the Quality and Effectiveness of Care Through Analytics - Tom Lawry

Plus, watch for new sessions coming soon from SQL Server BI & Data Warehousing Senior Technical Product Marketing Manager Sanjay Soni, Azure HDInsight's Asad Khan, Power Map's Igor Peev, and Microsoft Finance Director Marc Reguera. We're also excited about community sessions by BI experts Chris Webb, Jen Stirrup, and Marco Russo; Big Data and analytics leaders David Smith, Lynn Langit, and Andrew Brust; and the Excel Dream Team of MrExcel Bill Jelen, Chandoo, PowerPivotPro's Rob Collie and Avi Singh, Ken Puls, Matt Allington, Excel TV's Jordan Goldmeier, Rick Grantham, and Oz du Soleil, and more.

Come join us, and tell all your analytics colleagues about this incredible time of learning and networking for people who work every day to make sense of and gain insights from data. See you there!

PS: You can save $200 by March 16 and an extra $150 by contacting your PASS Local or Virtual Chapter for a discount code to use when registering.

Try what's next for Power BI
Sign up for the Power BI Newsletter

Posted by on 5 March 2015 | 10:00 am

ReplTip – Bulk Updates with Transactional Replication

by Taiyeb Zakir

We're seeing several customers performing bulk updates on the Publisher. Consider, for example, a table with 10 rows and two columns, C1 and C2, and suppose we run this update:

UPDATE MyTable SET C1 = 'Something'

This updates all 10 rows as one set, but when the Log Reader writes the change to the distribution database, it is written as 10 individual updates. Now let's say you update a million rows: the Log Reader again writes 1 million entries to the distribution database, and the Distribution Agent will take a long time to apply them on the Subscriber. To avoid writing that many entries to the distribution database, you can perform the update inside a stored procedure and replicate the execution of that stored procedure. The Log Reader will then replicate just the "EXEC MyUpdateProc" statement to run on the Subscriber instead of the individual update statements. This greatly reduces the size of the distribution database and also reduces latency. Check this article for more details: http://msdn.microsoft.com/en-us/library/ms152754.aspx
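For a concrete picture, here is a minimal sketch of the approach. The object and publication names are illustrative; the article type comes from the stored procedure execution options described in the linked article:

-- Wrap the bulk update in a stored procedure (names are illustrative).
CREATE PROCEDURE dbo.MyUpdateProc
AS
    UPDATE dbo.MyTable SET C1 = 'Something';
GO

-- Publish the procedure's execution rather than the row changes it makes.
-- 'serializable proc exec' replicates the EXEC only when it runs inside a
-- serializable transaction, which keeps the Subscriber consistent.
EXEC sp_addarticle
    @publication   = N'MyPublication',
    @article       = N'MyUpdateProc',
    @source_object = N'MyUpdateProc',
    @type          = N'serializable proc exec';
GO

With this in place, running EXEC dbo.MyUpdateProc on the Publisher sends a single statement through the distribution database instead of one entry per updated row.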

Posted by on 5 March 2015 | 9:35 am

Open ALM: Integrating Solutions with TFS and VSO

Open ALM: the goal is to create a platform that can easily be extended through industry standards, enabling the integration of other tools and platforms. To that end, we, Ulli Stirnweiß (lead of the .NET User Group Franken) and Karsten Kempe (MVP for Visual Studio ALM), took a closer look at the REST APIs, OAuth 2.0, and Service Hooks in VSO and TFS 2015. There is already a whole range of VSO integrations built on these industry standards; an overview is available here. At the following link you will find an introductory video as well as blog posts on the VSO REST APIs, OAuth 2.0, and Service Hooks. Chris
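To give a feel for the REST APIs the video and posts cover, here is a minimal, hypothetical sketch that lists the team projects of a VSO account (TypeScript on Node.js; the account name and credentials are placeholders, and basic authentication assumes alternate credentials are enabled on the account):

import https = require("https");

// Placeholders: substitute your own account and alternate credentials.
var account = "fabrikam";
var user = "someuser";
var password = "somepassword";
var auth = new Buffer(user + ":" + password).toString("base64");

// GET the list of team projects; the _apis/projects endpoint and the
// api-version query parameter follow the public VSO REST API documentation.
https.get({
    host: account + ".visualstudio.com",
    path: "/DefaultCollection/_apis/projects?api-version=1.0",
    headers: { "Authorization": "Basic " + auth }
}, function (res) {
    var body = "";
    res.on("data", function (chunk) { body += chunk; });
    res.on("end", function () { console.log(JSON.parse(body)); });
});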

Posted by on 5 March 2015 | 9:35 am

Modifying the CS_NOCLOSE style does affect all windows of the class, just not necessarily in an immediately noticeable way

In a discussion of how not to disable the Close button, Rick C claims that changing the style does not affect windows that are already created. Actually, it does. You can't see it, but the effect is there. Take our scratch program and make these changes:

DWORD CALLBACK NewThread(void *)
{
  // Create a second window of the same class, with its own message loop.
  CreateWindow(TEXT("Scratch"), TEXT("Scratch 2"),
               WS_VISIBLE | WS_OVERLAPPEDWINDOW,
               CW_USEDEFAULT, CW_USEDEFAULT,
               CW_USEDEFAULT, CW_USEDEFAULT,
               NULL, NULL, g_hinst, 0);
  MSG msg;
  while (GetMessage(&msg, NULL, 0, 0)) {
    TranslateMessage(&msg);
    DispatchMessage(&msg);
  }
  return 0;
}

void OnChar(HWND hwnd, TCHAR ch, int cRepeat)
{
  DWORD id;
  switch (ch) {
  case ' ':
    // Toggle the CS_NOCLOSE class style.
    SetClassLong(hwnd, GCL_STYLE,
                 GetClassLong(hwnd, GCL_STYLE) ^ CS_NOCLOSE);
    break;
  case '+':
    // Open another window of the same class on a new thread.
    CloseHandle(CreateThread(0, 0, NewThread, 0, 0, &id));
    break;
  }
}

    HANDLE_MSG(hwnd, WM_CHAR, OnChar);

Run this program, hit + to open another window, then hit the space bar to set the CS_NOCLOSE style. The window that is passed to SetClassLong updates its close button, but the other window does not. This is purely a visual artifact, though: if you try to click the close button of either window, it will not work.

So don't change the CS_NOCLOSE style thinking that it affects just your window. It actually affects all windows of the class; it just may not look that way at a casual glance.
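For contrast, a per-window way to disable the Close button (rather than a per-class one) is to gray the SC_CLOSE item on that window's own system menu. A minimal sketch; the helper names are mine:

// Disable the Close button for one window only, by graying SC_CLOSE
// on that window's system menu.
void DisableCloseButton(HWND hwnd)
{
  EnableMenuItem(GetSystemMenu(hwnd, FALSE), SC_CLOSE,
                 MF_BYCOMMAND | MF_GRAYED | MF_DISABLED);
}

// And the symmetric re-enable.
void EnableCloseButton(HWND hwnd)
{
  EnableMenuItem(GetSystemMenu(hwnd, FALSE), SC_CLOSE,
                 MF_BYCOMMAND | MF_ENABLED);
}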

Posted by on 5 March 2015 | 9:00 am