The Microsoft Media Platform Player Framework Brings You a Seamless Streaming Media Experience

If you took in any of the Olympic Games in Sochi this year, it's more than likely that you watched them online. In fact, as we talked about at Build, more than 2.1 million people watched online coverage of the Sochi Olympics, making it the single largest commercial broadcast ever. At NAB 2014, Microsoft Azure Media Services announced a limited public preview of the tools and services that made it possible to stream coverage of the Olympics and other sporting events, such as NASCAR, on such a large scale.

As you might imagine, streaming high-quality content like the Olympics to millions of PCs and mobile devices is a challenge on multiple levels. I wanted to take a behind-the-scenes look at how they served up streaming content to such a large number of people, so I spoke with a couple of people on my team, Mike Downey and Jit Ghosh.

Since the 2002 games in Salt Lake City, the amount of Web-based content for the games has dramatically increased. With the 2008 Olympic Games in Beijing, Microsoft started working with the NBC Sports Group to power its online viewing experience, using Media Services. The partnership has since expanded to include programming like the Super Bowl and NBC Sunday Night Football. For this year's games in Sochi, NBC decided to stream live coverage of all 98 events. If you missed Rick's comments on stage at Build, check this out.

That brings me back to Mike and Jit, both of whom have been leading the charge to develop the Player Framework and related SDKs that NBC and other companies are using to stream content. We just released a public beta of the latest version of the Player Framework, which has been updated with support for JavaScript and XAML-based Windows Phone 8.1 and universal apps.

When you boil it down, on the back end there are really three main challenges to overcome: funneling all of these feeds of high-quality live video and audio through iStreamPlanet's Aventus private cloud, which is hosted on Azure; encoding them into as many as a dozen formats or bit rates for a variety of different devices; and then streaming them out to different content providers around the world. On the front end you have to create a viewing experience that, while appearing simple, has functionality built into it that keeps the audience engaged and helps content providers pay the bills: things like ad insertion and the telemetry needed to display and track the effectiveness of ads, the ability to surface related videos or ancillary content like interactive timelines or game scores, and closed captioning. In addition, you have to be able to dynamically change the bitrate a player is receiving to account for fluctuations in the quality of the Internet connection.

Thanks to a lot of hard work from Jit, Mike and others, we now have a Player Framework that does all of that, works on multiple platforms and is available under open source. Jit put together a demo that shows how easy it can be, so be sure to check that out. You can get more information on the Player Framework homepage.

Posted by on 24 April 2014 | 2:30 pm

How to create and/or use the Settings charm in WinRT | C# | XAML

Intermediate. The Settings charm is one of those things that, in my opinion, still needs fine-tuning in WinRT. Its use is so necessary and so common that I always wonder why the problem doesn't come already solved. Many of you, myself included, surely ...(read more)
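Since the post is truncated here, a minimal sketch of the usual pattern may help. It assumes a Windows 8.1 XAML app and a hypothetical SettingsFlyout page named OptionsSettingsFlyout; the command id and label are illustrative, not from the original post:

using Windows.ApplicationModel.Activation;
using Windows.UI.ApplicationSettings;
using Windows.UI.Xaml;

sealed partial class App : Application
{
    protected override void OnLaunched(LaunchActivatedEventArgs args)
    {
        // Hook the Settings charm so our command is offered whenever the pane opens.
        SettingsPane.GetForCurrentView().CommandsRequested += OnCommandsRequested;

        // ... the usual Frame creation / navigation code goes here ...
    }

    private void OnCommandsRequested(SettingsPane sender, SettingsPaneCommandsRequestedEventArgs args)
    {
        // Add an "Options" entry; invoking it shows a SettingsFlyout defined in XAML.
        args.Request.ApplicationCommands.Add(
            new SettingsCommand("options", "Options", command => new OptionsSettingsFlyout().Show()));
    }
}

SettingsPane only raises CommandsRequested while the charm is open, so registering the handler once at launch is enough.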

Posted by on 24 April 2014 | 2:27 pm

Get the latest version of Internet Explorer

Microsoft released an updated version of Internet Explorer this month, and it’s available as a free download on Windows 8.1, Windows 7, and Windows Phone 8.1. To increase your security and privacy, it’s important that you use the latest version of any software, but especially your web browser. This new version of Internet Explorer also includes new features that make it easier to browse the web on a variety of devices. Learn more at the Internet Explorer blog. If you have automatic updating turned on, you already have the latest version of Internet Explorer. Learn how to get updates like this one, as well as security updates for all your Microsoft software automatically.

Posted by on 24 April 2014 | 2:12 pm

Automating VM Customization tasks using Custom Script Extension

One of the VM extensions that launched during the //build conference is the 'Custom Script Extension', which was released with PowerShell support. If this is the first time you are visiting these blogs, you might want to check the previous blogs here and here to get an overview of the VM Agent and extensions. So, what does this extension do and how is it different from Remote PowerShell or any other existing remote execution tools? In a nutshell, the Custom Script Extension can automatically download scripts and files from Azure Storage and launch a PowerShell script on the VM, which in turn can install additional software components. And just like any other VM extension, it can be added during VM creation or after the VM has been running. Sounds simple, yet very useful!

Prerequisites for running PowerShell scripts using this extension:
1. Install Azure PowerShell Cmdlets V0.8.0 or above from here.
2. If the scripts will be run on an existing VM, make sure the VM Agent is enabled on the VM; if not, follow this blog post to install one.
3. Upload the scripts that you want to run on the VM to Azure Storage. The scripts can come from a single storage container or from multiple containers. The scripts should be authored so that the entry script, which is launched by the extension, in turn launches the other scripts.

Now that you have the necessary prerequisites, let's walk through a few use cases to show how these cmdlets can be used. The current version of this extension targets PowerShell scripts only, but in the future this can be expanded to other script types.

Use Case 1: Uploading files to a container in the default account. If you have your scripts in a storage container of the default account of your subscription, then the cmdlet snippet below shows how you can run them on the VM. The ContainerName in the sample below is where you upload the scripts to. The default storage account can be verified with the cmdlet 'Get-AzureSubscription –Default'. Note: this use case creates a new VM, but the same operations can be done on an existing VM as well.

Use Case 2: Using non-default storage accounts. This use case shows how to use a non-default storage account, either within the same subscription or in a different subscription, for uploading scripts/files. Here we'll use an existing VM, but the same operations can be done while creating a new VM.

Use Case 3: Uploading scripts to multiple containers across different storage accounts. If the script files are stored across multiple containers, then currently, to run those scripts, you have to provide the full SAS URLs of these files. SAS URLs can be generated with a tool like Azure Storage Explorer.

Hopefully these cmdlets will help you automate some of your VM customization tasks. We'd love to hear from you about what other capabilities you'd like to see in this extension in its upcoming releases. Please post your comments and questions here.

Posted by on 24 April 2014 | 2:00 pm

Logging with the .NET backend for Azure Mobile Services

On the node.js version of mobile services, logging was done mostly via the node’s console object. In any of the server scripts you could call console.[log|warn|error] and when that script was executed you could go to the “logs” tab in the portal to see them. In the .NET backend you still have the same capabilities, but there are some new features and quirks which one should be aware to get the best logging experience in that new backend. Let’s go over them here. Logging when running locally This is a very simple example of logging in the .NET backend. If we write this controller class, we can use the Log property in the Services member of the TableController<T> class to write to the Azure logs. publicclassTodoItemController : TableController<TodoItem> {     protectedoverridevoid Initialize(HttpControllerContext controllerContext)     {         base.Initialize(controllerContext);         var context = newMyContext(this.Services.Settings.Schema);         this.DomainManager = newEntityDomainManager<TodoItem>(context, this.Request, this.Services);     }       publicTask<TodoItem> PostItem(TodoItem item)     {         this.Services.Log.Info("[PostItem] Inserting new item: " + item.Id ?? "<<no id>>");         returnbase.InsertAsync(item);     } } When we publish the application to Azure and insert an item, we can see the log in the “logs” tab in the azure portal. However, when we’re running the backend locally, there is no portal where we can see the logs, so where can we find them. As one user in Stack Overflow had this question, I imagine that others may have the same problem. And there’s nothing really fancy here – the logging is implemented (by default, see more advanced topics later) using the standard System.Diagnostics tracing system, so we can add a new listener in the web.config and send the log output to some file where you can read: <system.diagnostics>   <traceautoflush="true">     <listeners>       <addname="default"            initializeData="c:\temp\mylog.txt"            type="System.Diagnostics.TextWriterTraceListener" />     </listeners>   </trace> </system.diagnostics> If we send the same request to the local service, we can now find it out the traces. iisexpress.exe Information: 0 : Message='[PostItem] Inserting new item: 01234567-89ab-cdef-0123-456789abcdef-1', Id=00000000-0000-0000-0000-000000000000, Category='PostItem' Another alternative, if you’re running the local .NET backend under the debugger, you can look at the “Output” window, selecting “Show output from: Debug”, and the traces will be shown in the window. If that’s the case you don’t even need to add a trace listener, as when running under the debugger VS will add one listener by default that writes to its output/debug window. There will be some noise on other information which is traced to the debug window, but that may be an acceptable solution as well. The traces does not have the same format as we find in the portal, but that’s a quick way to get the information you want. Customizing traces But what if you really want a better formatted traces? Not a problem. The .NET backend uses the same system as its underlying framework, ASP.NET Web API. All traces are written via the ITraceWriter interface, and you can supply your own implementation of that interface, like in the code below, which uses a format very similar to the one displayed in the portal. 
The “category” parameter by default has the name of the member (typically method or property) where the log method was called, but it can be customized for anything that makes sense for your application. publicclassMyTraceWriter : ITraceWriter {     publicvoid Trace(HttpRequestMessage request, string category, TraceLevel level, Action<TraceRecord> traceAction)     {         var record = newTraceRecord(request, category, level);         traceAction(record);         System.Diagnostics.Trace.WriteLine(string.Format("{0} - {1} - {2} - {3}",             record.Level, record.Message, record.Category, record.Timestamp.ToString("ddd MMM dd yyyy, HH:mm:ss tt")));     } } And to use it we can replace the trace writer service with our own implementation in the HttpConfiguration object returned by the ServiceConfig.Initialize method. publicstaticclassWebApiConfig {     publicstaticvoid Register()     {         var configBuilder = newConfigBuilder(configOptions);         var config = ServiceConfig.Initialize(newConfigBuilder());         config.Services.Replace(typeof(ITraceWriter), newMyTraceWriter());     } } The document Tracing in ASP.NET Web API has a good introduction on the tracing system used by Web API (and the .NET backend of the mobile service). Now, everything works fine when we run locally, but if we publish the service to Azure and invoke the operations in the controller again, we won’t see the traces in the portal anymore. The problem is that while we’re tracing to the System.Diagnostic subsystem, that is not hooked up to the tracing in the service running in Azure. We can fix that by passing the original trace writer (which can trace within Azure) to our trace writer implementation: publicclassMyTraceWriter : ITraceWriter {     ITraceWriter original;       public MyTraceWriter(ITraceWriter original)     {         this.original = original;     }       publicvoid Trace(HttpRequestMessage request, string category, TraceLevel level, Action<TraceRecord> traceAction)     {         if (this.original != null)         {             this.original.Trace(request, category, level, traceAction);         }           var record = newTraceRecord(request, category, level);         traceAction(record);         System.Diagnostics.Trace.WriteLine(string.Format("{0} - {1} - {2} - {3}",             record.Level, record.Message, record.Category, record.Timestamp.ToString("ddd MMM dd yyyy, HH:mm:ss tt")));     } } And when creating our trace writer we can pass the original writer if the service is running in Azure (which you can access via the extension method GetIsHosted – you’ll need to add a "using System.Web.Http" statement if you don’t have already to see that method). publicstaticclassWebApiConfig {     publicstaticvoid Register()     {         var config = ServiceConfig.Initialize(newConfigBuilder());         ITraceWriter originalTraceWriter = null;         if (config.GetIsHosted())         {             originalTraceWriter = (ITraceWriter)config.Services.GetService(typeof(ITraceWriter));         }           config.Services.Replace(typeof(ITraceWriter), newMyTraceWriter(originalTraceWriter));     } } Notice that for the case of a better local logging you could also only replace the trace writer if the service was running locally (i.e., if it was not hosted). But that organization will be useful in the next section. 
Changing logs destination Looking at the traces in the portal is fine during development, but as the service goes into production and if your mobile application is successful, you can expect thousands of traces being generated, and going through them via the portal isn’t the ideal way to analyze your logs. But since we can own completely the logging implementation, nothing prevents us from redirecting the logs to some external source – for example, table storage, where you can fetch and analyze them easier with several of the existing tools. Let’s change our trace listener implementation to send the logs to a table in Azure Storage if the service is running in the cloud. publicclassMyTraceWriter : ITraceWriter {     ITraceWriter original;     string mobileServiceName;     string storageConnectionString;       public MyTraceWriter(ITraceWriter original, string mobileServiceName, string storageConnectionString)     {         this.original = original;         this.mobileServiceName = mobileServiceName;         this.storageConnectionString = storageConnectionString;     }       publicvoid Trace(HttpRequestMessage request, string category, TraceLevel level, Action<TraceRecord> traceAction)     {         var record = newTraceRecord(request, category, level);         traceAction(record);         if (this.original == null)         {             // Running locally             System.Diagnostics.Trace.WriteLine(string.Format("{0} - {1} - {2} - {3}",                 record.Level, record.Message, record.Category, record.Timestamp.ToString("ddd MMM dd yyyy, HH:mm:ss tt")));             return;         }           this.original.Trace(request, category, level, traceAction);         TraceToTableStorage(record);     }       privatevoid TraceToTableStorage(TraceRecord record)     {         CloudStorageAccount storageAccount = CloudStorageAccount.Parse(this.storageConnectionString);         CloudTableClient tableClient = storageAccount.CreateCloudTableClient();         CloudTable table = tableClient.GetTableReference("mylogs");         table.CreateIfNotExists();         TableOperation operation = TableOperation.Insert(newTableTraceRecord(mobileServiceName, record));         table.Execute(operation);     } } You’ll need to add a reference to the “Windows Azure Storage” package if you haven’t done it yet to be able to use the classes shown in the code above. The TableTraceRecord class is defined as a subclass of TableEntity, which is required to insert items into the tables. In this example I’m using the timestamp as the row key to make it easier to sort based on the trace order (and adding a random set of characters to disambiguate in case multiple trace requests come at the exact same instant). 
publicclassTableTraceRecord : TableEntity {     conststring DateTimeFormat = "yyyy-MM-dd'T'HH:mm:ss.fffffff'Z'";       publicstring Message { get; set; }       publicstring Category { get; set; }       publicstring Level { get; set; }       public TableTraceRecord(string mobileServiceName, TraceRecord traceRecord)         : base(partitionKey: mobileServiceName, rowKey: CreateRowKey(traceRecord.Timestamp))     {         this.Category = traceRecord.Category;         this.Level = traceRecord.Level.ToString();         this.Message = traceRecord.Message;     }       privatestaticstring CreateRowKey(DateTime dateTime)     {         string disambiguator = Guid.NewGuid().ToString("N").Substring(0, 8); // in case two entries have same timestamp         returnstring.Format("{0}-{1}",             dateTime.ToUniversalTime().ToString(DateTimeFormat, CultureInfo.InvariantCulture),             disambiguator);     } } We now need to pass the connection string and mobile service name to the trace writer class. The recommended way to store the connection string is via the app settings, so I’ll add the connection string fro my account there: And on the Register method we retrieve the values we need from the application settings.   publicstaticclassWebApiConfig {     publicstaticvoid Register()     {         var config = ServiceConfig.Initialize(newConfigBuilder());         ITraceWriter originalTraceWriter = null;         if (config.GetIsHosted())         {             originalTraceWriter = (ITraceWriter)config.Services.GetService(typeof(ITraceWriter));         }           var mobileServiceName = ConfigurationManager.AppSettings[ServiceSettingsKeys.ServiceName];         var storageConnectionString = ConfigurationManager.AppSettings["MyTableStorageConnString"];         var myTraceWriter = newMyTraceWriter(originalTraceWriter, mobileServiceName, storageConnectionString);         config.Services.Replace(typeof(ITraceWriter), myTraceWriter);         config.SetIsHosted(true);     } } Now when the trace is executed, it will write to the table storage. More logging please! So far we’re looking at how to look at the logs which our application writes out. But the .NET backend also exposes other kinds of logs which you can see as well. When we’re initializing the service configuration, we can pass a set of configuration options. One of them is the list of trace categories which are excluded from the traces. By default, internal traces are excluded, but we can remove those by clearing the ExcludedTraceCategories list in the config options passed to the initializer, as shown below. publicstaticclassWebApiConfig {     publicstaticvoid Register()     {         var configOptions = newConfigOptions();         configOptions.ExcludedTraceCategories.Clear();         var configBuilder = newConfigBuilder(configOptions);         var config = ServiceConfig.Initialize(configBuilder);         // ...     } } Try it out – you’ll see more information about the inner workings of the backend. Definitely some information that the majority of the users won’t need, but a good way to get a better view of what’s happening behind the scenes.   Wrapping up I hope this post will help you understand how to better use the logging subsystems in the .NET backend for mobile services. As usual, feel free to submit questions or comments to this post, the MSDN forums or contact us via Twitter @AzureMobile.

Posted by on 24 April 2014 | 2:00 pm

Windows Phone 8.1 for Developers–Using the Credential Locker

This blog post is part of a series about how Windows Phone 8.1 affects developers. This blog post talks about how to use the Credential Locker to store sensitive data and is written by Johan Silfversparre at Jayway and was originally posted here.

Intro
With the release of Windows Phone 8.1, the Credential Locker service is now available for Windows Phone app developers. This is great news both for first-time users and for those of us already using the service in our Windows 8.1 Store apps. The Credential Locker service simplifies the task of handling user credentials and storing them securely encrypted on the device your app is running on. It also roams the credentials between devices along with the user's Microsoft account. Information is stored in the Credential Locker per user and cannot be shared between apps.

When
A common scenario for using the Credential Locker is when your app connects to services like social networks. Asking the user for login information only once and storing it in the Credential Locker between sessions provides a better user experience.

How to
Create a reference: using Windows.Security.Credentials;
// public PasswordVault()
var vault = new PasswordVault();
Store a user credential:
// public void Add(PasswordCredential credential)
vault.Add(new PasswordCredential("resourceName", "username", "password"));
Retrieve a list of all user credentials:
// public IReadOnlyList RetrieveAll()
var list = vault.RetrieveAll();
Retrieve a list of user credentials by user name:
// public IReadOnlyList FindAllByUserName(string userName)
var list = vault.FindAllByUserName("username");
Retrieve a list of user credentials by resource name:
// public IReadOnlyList FindAllByResource(string resource)
var list = vault.FindAllByResource("resourceName");
Retrieve a specific user credential by user name and resource:
// public PasswordCredential Retrieve(string resource, string username)
var credential = vault.Retrieve("resourceName", "username");
Delete a user credential:
// public void Remove(PasswordCredential credential)
vault.Remove(new PasswordCredential("resourceName", "username", "password"));

Best practices
Use the Credential Locker to store passwords, not large data blocks. Make sure the user has successfully signed in and opted to save passwords before storing them in the Credential Locker.

Summary
Another option for handling user authentication is now available for you as a Windows Phone app developer. The Credential Locker service has been introduced, and you use it in the same way as in your Windows 8.1 apps.
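To see those calls working together, here is a minimal, self-contained sketch; the resource name "ExampleService" and the wrapper class itself are illustrative, not part of the original post:

using System;
using System.Linq;
using Windows.Security.Credentials;

public static class CredentialStore
{
    // Illustrative resource name; use whatever identifies your remote service.
    private const string Resource = "ExampleService";

    // Save (or overwrite) the credential after a successful sign-in.
    public static void Save(string userName, string password)
    {
        var vault = new PasswordVault();
        vault.Add(new PasswordCredential(Resource, userName, password));
    }

    // Return the stored credential, or null if nothing has been saved yet.
    public static PasswordCredential TryGet()
    {
        var vault = new PasswordVault();
        try
        {
            var credential = vault.FindAllByResource(Resource).FirstOrDefault();
            if (credential != null)
            {
                credential.RetrievePassword(); // the password is only filled in on request
            }
            return credential;
        }
        catch (Exception)
        {
            // FindAllByResource throws when no credential exists for the resource.
            return null;
        }
    }

    // Remove the credential, for example when the user signs out.
    public static void Delete(string userName)
    {
        var vault = new PasswordVault();
        vault.Remove(vault.Retrieve(Resource, userName));
    }
}

Note that FindAllByResource throws when nothing is stored, and that a retrieved credential only exposes its password after RetrievePassword() has been called.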

Posted by on 24 April 2014 | 1:56 pm

Windows Phone 8.1 for Developers–The Background Media Player

Posted by on 24 April 2014 | 1:51 pm

Visio: Using Visio to do design for the Arduino

I am putting together a design template for the Arduino in Visio. But I did notice that the stencils are kind of a mess. Seriously, take a look; it is pretty funny… Semiconductors and Electron Tubes? Ok, I admit that I did do some design with vacuum tubes in my teens, but mostly for high-output (for that time) transmitters and a few radio receivers just for class labs, and then we moved into transistor design, never looking back. And yes, that was in the LATE 1960s. ...(read more)

Posted by on 24 April 2014 | 1:51 pm

Windows Phone 8.1 for Developers–Application Data

This blog post is part of a series about how Windows Phone 8.1 affects developers. This blog post talks about how to store application data and is written by Robert Hedgate at Jayway and was originally posted here.

Introduction
In this blog post I will give a quick overview of what is new with application data in Windows Phone 8.1. This is a first impression of the functionality, so I recommend you take a look at MSDN if you want a deeper understanding of how it works.

What is the same as before
Saving data to local storage works as before. Here is a code example of what it could look like:
var localFolder = ApplicationData.Current.LocalFolder;
var localFile = await localFolder.CreateFileAsync("localFile.txt", CreationCollisionOption.ReplaceExisting);
var fileBytes = System.Text.Encoding.UTF8.GetBytes("some text string");
using (var s = await localFile.OpenStreamForWriteAsync())
{
    s.Write(fileBytes, 0, fileBytes.Length);
}

What is new
Even before, you could target RoamingFolder or TemporaryFolder instead of LocalFolder. That code compiled, but it threw a not-implemented exception when you ran it. In Windows Phone 8.1 it is now implemented. The code structure is the same as before; just change which folder you're targeting. This is exactly how it works on Windows 8.1. Another new function which is very useful is FileIO.WriteTextAsync. This makes it very easy to write text to files:
var tempFolder = ApplicationData.Current.TemporaryFolder;
var tempFile = await tempFolder.CreateFileAsync("tempFile.txt", CreationCollisionOption.ReplaceExisting);
await FileIO.WriteTextAsync(tempFile, "some text string");

TemporaryFolder
Why use the TemporaryFolder? This is the place where you can save data without having to think about removing it later. The data is kept between sessions but is removed when Windows Phone deems it necessary, e.g. when running low on memory. Saving data between sessions here is perhaps not the best idea, but the TemporaryFolder can be used, for example, for sharing data between several views when you for some reason don't want to keep the data in memory. You can save images to the temporary folder for use in your views if you don't want to persist them in your local folder. From XAML you can target the TemporaryFolder by using ms-appdata:///temp/. Example of image XAML:

RoamingFolder
This is a very useful function. If you save data to the RoamingFolder, the data is available on every device the app is installed on, provided you log in with the same ID. It is perhaps more useful for Windows Store apps, because there you often have more than one device. But now that the stores are merging, you can roam data between a Windows 8.1 app and a Windows Phone 8.1 app; just give both apps the same app ID in the store. If you want to find out when roaming data has been changed by someone else, you need to listen to DataChanged. This fires if roaming data is changed by another app:
applicationData.DataChanged += DataChangedHandler;
private async void DataChangedHandler(ApplicationData appData, object o)
{
    // Add code
}

RoamingStorageQuota
How much data can you roam? By reading RoamingStorageQuota you get the amount of data possible to roam. If you try to roam more data than RoamingStorageQuota specifies, the system stops replicating the data until it is below the limit again. The normal amount of data possible to roam is 100 KB.
var quota = applicationData.RoamingStorageQuota;

Settings
Saving settings also works the same as in Windows 8.1.
Instead of targeting LocalFolder, it is possible to target LocalSettings; there is also RoamingSettings. This works the same way as the folders do: local saves locally and roaming saves to the cloud. How do settings work, then? Quite easily actually; just fetch the settings and save:
var roamingSettings = ApplicationData.Current.RoamingSettings;
roamingSettings.Values["MySetting"] = "Hello World";
In addition to saving text to a setting, one can save a composite value. This is a setting which contains several settings:
var composite = new ApplicationDataCompositeValue();
composite[settingName1] = 1;
composite[settingName2] = "world";
roamingSettings.Values["MyCompositeSetting"] = composite;
It is also possible to create containers in the settings. This is for making it easier to structure the settings. Example of a container:
var container = localSettings.CreateContainer("exampleContainer", ApplicationDataCreateDisposition.Always);
if (localSettings.Containers.ContainsKey("exampleContainer"))
{
    localSettings.Containers["exampleContainer"].Values["MySetting"] = "Hello Windows";
}
If you use RoamingSettings, changes can also be detected using applicationData.DataChanged += DataChangedHandler;

Version
If you're saving data and update your app, you might need to use versioned data. Versioning enables you to change the application data format used in a future release of your app without causing compatibility problems with previous releases of your app. The app checks the version of the data in the data store, and if it is less than the version the app expects, the app should update the application data to the new format and update the version. The version starts at zero and can be obtained by checking var version = ApplicationData.Current.Version; If the version is lower than expected, do the necessary conversion and set the version to the correct one with ApplicationData.Current.SetVersionAsync(1, SetVersionHandler); (a minimal sketch of such a migration follows after the summary).

Summary
It is very nice to have the same data handling as in Windows 8.1. It is now possible to roam data, save settings, etc. This makes it easier to develop and share code between Windows Phone 8.1 and Windows 8.1 projects.
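As a rough illustration of the versioning flow described above, here is a minimal sketch; the setting keys ("UserName", "Profile.UserName") and the migration itself are hypothetical:

using System;
using System.Threading.Tasks;
using Windows.Storage;

public static class AppDataMigration
{
    // Call this early in app startup, before the rest of the app reads its data.
    public static async Task RunAsync()
    {
        var appData = ApplicationData.Current;
        if (appData.Version < 1)
        {
            await appData.SetVersionAsync(1, request =>
            {
                // The data store is locked while this handler runs.
                var deferral = request.GetDeferral();

                if (request.CurrentVersion == 0)
                {
                    // Hypothetical conversion: move a value to its new key name.
                    var values = appData.LocalSettings.Values;
                    if (values.ContainsKey("UserName"))
                    {
                        values["Profile.UserName"] = values["UserName"];
                        values.Remove("UserName");
                    }
                }

                deferral.Complete();
            });
        }
    }
}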

Posted by on 24 April 2014 | 1:44 pm

Windows Phone 8.1 for Developers–Multitasking & Background Agents

This blog post is part of a series about how Windows Phone 8.1 affects developers. This blog post talks about how to multitask with background agents and is written by Robert Hedgate at Jayway and was originally posted here.

Windows Phone 8
In Windows Phone 8 we have something called background agents and scheduled tasks. Working with these APIs is a bit difficult since there are constraints such as expiration times, etc.

Windows Phone 8.1
With Windows Phone 8.1 we now have the same way of working with multitasking as in Windows 8.1. If, however, you have invested a lot of time in the old way but still want to use the new 8.1 APIs, upgrade to Silverlight 8.1 and continue to use the background agents. You can use the new multitasking in Silverlight 8.1 as well, but do not use both versions at the same time; it will function poorly because the operating system will be using the same API at the same time.

What's new then
We now have triggers. This is what will trigger the background task to start. There are a lot of triggers, e.g. TimeZoneChange, UserAway, SmsReceived and more. Some of these require the app to be put on the lock screen. Below I show an example of how to use a trigger: Create a Windows Runtime Component. Add a class which implements IBackgroundTask. This requires a Run method in your class. This is the method which will be run when the trigger is activated.
public sealed class Bg : IBackgroundTask
{
    public void Run(IBackgroundTaskInstance taskInstance)
    {
    }
}
Add this class to the appxmanifest declarations. Then in your code create a builder, add a trigger and register it.
const string name = "MyExampleTrigger";
if (BackgroundTaskRegistration.AllTasks.Any(task => task.Value.Name == name))
{
    // Only register it once
    return;
}
var builder = new BackgroundTaskBuilder();
var trigger = new SystemTrigger(SystemTriggerType.TimeZoneChange, false);
builder.Name = name;
builder.TaskEntryPoint = typeof(BackgroundTasks.Bg).FullName;
builder.SetTrigger(trigger);
var registration = builder.Register();
registration.Completed += RegistrationOnCompleted;
RegistrationOnCompleted will be called when the background task has completed. There is also a Progress event to listen to if you want. There are limitations on how much memory, CPU time, etc. you are allowed to use. To maximize the amount, call:
var result = await BackgroundExecutionManager.RequestAccessAsync();
if (result == BackgroundAccessStatus.Denied)
{
    // Handle this if it is important for your app.
}
If the result is denied, the phone thinks it has too many background tasks active. In that case you can prompt your users to go to the Battery Saver application and force-allow your app to run in the background even if the phone doesn't want to. Just ask nicely and I'm sure the user will do this for your super app :)

What's completely new
In Windows Phone 8.1 there are some new triggers that just make sense to have on the phone: GattCharacteristicNotificationTrigger (Bluetooth), DeviceChangeTrigger, DeviceUpdateTrigger, RfcommConnectionTrigger. There are, however, some triggers removed as well compared to Windows 8.1: OnlineIdConnectedStateChange, LockScreenApplicationAdded, LockScreenApplicationRemoved, ControlChannelTrigger. And then again there are some things which are available in Silverlight 8.1 but not in Windows Phone 8.1: Continuous Background Location, Runs Under Lock, VoIP Agents, Wallet Agents.

Background transfer
The old way, using the Microsoft.Phone.BackgroundTransfer namespace, had a lot of limitations: size, number of requests, etc.
Now the phone uses Windows.Networking.BackgroundTransfer, the same as Windows 8.1, with no size limitation, in-progress stream access, etc. It does, however, respect Battery Saver and will halt or stop your download/upload if you are near your data limit, etc. It is quite easy to set up a download:
var downloader = new BackgroundDownloader();
var download = downloader.CreateDownload(source, destinationFile);
await download.StartAsync();
It is also possible to add progress and cancellation tokens to the download operation if it is a long operation (a consolidated sketch follows after the summary).
await download.StartAsync().AsTask(cts.Token, progressCallback);
I recommend you look at this very good example http://code.msdn.microsoft.com/windowsapps/Background-Transfer-Sample-d7833f61/ which shows how background transfer works. It is also made as a Universal app, which is nice for seeing how easy it can be. As always, there is also an MSDN link: http://msdn.microsoft.com/en-us/library/windows/apps/windows.networking.backgroundtransfer.aspx.

Summary: Windows Phone 8.1 now has the same background functions as Windows 8.1. This is great and makes it super easy to share code in a Universal app.
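Here is a minimal consolidated sketch of a cancellable download with progress reporting, pulling the fragments above together; the source URI, file name and class layout are illustrative, not from the original post:

using System;
using System.Threading;
using System.Threading.Tasks;
using Windows.Networking.BackgroundTransfer;
using Windows.Storage;

public class DownloadExample
{
    private readonly CancellationTokenSource cts = new CancellationTokenSource();

    public async Task StartDownloadAsync()
    {
        // Illustrative source URI and destination file name.
        var source = new Uri("http://example.com/bigfile.zip");
        StorageFile destinationFile = await ApplicationData.Current.LocalFolder
            .CreateFileAsync("bigfile.zip", CreationCollisionOption.ReplaceExisting);

        var downloader = new BackgroundDownloader();
        DownloadOperation download = downloader.CreateDownload(source, destinationFile);

        // Progress callback: the transfer service keeps moving bytes even if the app is suspended.
        var progressCallback = new Progress<DownloadOperation>(op =>
            System.Diagnostics.Debug.WriteLine("{0}/{1} bytes",
                op.Progress.BytesReceived, op.Progress.TotalBytesToReceive));

        await download.StartAsync().AsTask(cts.Token, progressCallback);
    }

    // Wire this to a Cancel button to stop the transfer.
    public void Cancel()
    {
        cts.Cancel();
    }
}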

Posted by on 24 April 2014 | 1:39 pm

DigiGirlz in AZ

View the story "DigiGirlz AZ" on Storify: http://blogs.msdn.com//storify.com/palermo4/digigirlz-az

Posted by on 24 April 2014 | 1:16 pm

Company Account in the Windows/Windows Phone Store

Here I summarize some information that companies wanting to get an account to publish Windows Store or Windows Phone apps regularly ask me about. In the case of companies it takes a few days, because an authentication process is run by Symantec:
1. For example, to publish Windows Phone apps, you must do it at http://dev.windowsphone.com using a Microsoft account (formerly known as a "Live ID"). It can be an existing corporate account that you convert into a Microsoft account, or an impersonal one such as apps_MI_EMPRESA@outlook.com. The annual Store subscription costs USD 19 for individuals and USD 99 for companies, and is paid with VISA or MasterCard credit cards. If you have an MSDN subscription, you can use a code that exempts you from payment, which you can obtain here.
2. Once this is done, you will receive an email with the order number and the authentication process begins, during which Symantec will most likely call you by phone within a couple of days at the number they find for you in the online white pages. In the same email there is a link to chat with the Symantec people, www.geotrust.com/microsoftmarketplace/chatsupport, which I suggest you use ahead of time to check with Symantec every few days what the status of the authentication is, which number they are going to call, etc. During that period, you will see that the account is "in validation".

Posted by on 24 April 2014 | 1:07 pm

A small update to OneNote is available

For those of you running OneNote from the Windows Store, we released a small update yesterday. It mostly contains stability fixes and a few performance tweaks to lessen the memory we use, which are some reasons you would want to update. You can get it here: http://apps.microsoft.com/windows/en-us/app/onenote/f022389f-f3a6-417e-ad23-704fbdf57117 if you have an up-to-date Windows 8.1 machine. I don't think we are making a big push around this, but I did want to give a little insight into what the update has in it. That last phrase about having an up-to-date Windows machine actually played a role in the testing for this release. About two weeks before we were done, the Windows folks updated the certification test to version 3.3. Our automation system was still using the 3.2 test, so we had to update it to the new version. It took a day or two to get all our images updated to use the new tool, and in the meantime I ran the 3.3 version locally. It is a little bit simpler than the old version. For instance, the boot performance test has been removed (in my opinion, it was a pretty simple test and easy to "game", so I can understand why it was removed). Still, it was kind of interesting to watch it start and stop OneNote several times during the testing. Now we are all up to date with automating this test and I should not have to run it manually again. Go try out our update and let us know what you think! Questions, comments, concerns and criticisms always welcome, John

Posted by on 24 April 2014 | 12:59 pm

RMS sharing apps are now updated and localized!

This is a cross-post from our TechNet RMS blog. Hi folks, We’re excited to announce an updated release to the RMS sharing apps on Windows, iOS, Android, and Windows Phone. Shubha Pratiwadibhayankar is a Program Manager on the team and she'll talk about the updates in more detail. Hello, this is Shubha, Program Manager on the RMS team. We have several RMS sharing app updates for you with this release. Below is a summary:  RMS sharing app for Windows We’ve made some great new updates to the RMS sharing app on Windows, which will improve both the user and admin experience. This is a direct result of your comments and feedback. A cool new Outlook add-in that lets you protect and share your attachments right from within Outlook. Updates to the end-user installer – it is now a one-click exe that downloads required files and runs setup. No more zip folders. IT administrators now have a great control on their RMS sharing app deployments: IT admins can now configure levels of protection for different file types for their entire organization. IT admins can also choose to disable automatic update notifications for their organization. Many fit and finish bug fixes coming directly from the bugs you filed.   RMS sharing app for iOS, Android, and Windows Phone These apps now join RMS sharing app for Windows in being fully localized. You'll find the list of languages that we support for these apps below. We also made a few bug fixes to enhance your experience with the applications   Localized language list – RMS sharing app for iOS – Czech, Danish, German, Greek, Spanish, Finnish, French, Croatian, Hungarian, Indonesian, Italian, Japanese, Korean, Malay, Dutch, Norwegian, Polish, Portuguese – Brazil, Portuguese – Portugal, Romanian, Russian, Slovak, Swedish, Thai, Turkish, Vietnamese, Ukrainian, Chinese – Traditional, Chinese – Simplified RMS sharing app for Android – Bulgarian, Czech, Danish, German, Greek, Spanish, Estonian, Finnish, French, Hindi, Croatian, Hungarian, Indonesian, Italian, Japanese, Kazakh, Korean, Lithuanian, Latvian, Malay, Dutch, Norwegian, Polish, Portuguese – Brazil, Portuguese – Portugal, Romanian, Russian, Slovak, Slovenian, Swedish, Thai, Turkish, Vietnamese, Ukrainian, Chinese – Traditional, Chinese – Simplified  RMS sharing app for Windows Phone – Bulgarian, Czech, Danish, German, Greek, Spanish, Estonian, Finnish, French, Hindi, Croatian, Hungarian, Indonesian, Italian, Japanese, Kazakh, Korean, Lithuanian, Latvian, Malay, Dutch, Norwegian, Polish, Portuguese – Brazil, Portuguese – Portugal, Romanian, Russian, Slovak, Slovenian, Serbian, Swedish, Thai, Turkish, Vietnamese, Ukrainian, Chinese – Traditional, Chinese – Simplified   Download You can download the apps here.   Feedback As always, your feedback is welcome, and critical to the success of our apps. Please send us your comments at askipteam@microsoft.com   Thanks, Shubha

Posted by on 24 April 2014 | 12:44 pm

Delivering high quality healthcare—measure, improve, measure again

I continue to be amazed by the magical thinking that is swirling around electronic medical records. On one level, I completely understand it. A hospital or clinic invests tens of thousands, sometimes hundreds of millions, of dollars in electronic medical record software or a hospital information system. With that kind of money on the line, and all of the pain that comes with the implementation and training required, one would want to believe that as a result healthcare delivery has been transformed...(read more)

Posted by on 24 April 2014 | 12:34 pm

Microsoft Exchange Conference (MEC) 2014 - Recordings

As I updated earlier, MEC 2014 concluded in Austin, Texas -  MEC2014 delivered the latest content for Office 365 Exchange Online and Exchange on-premises customers. Content is delivered across tracks including: Architecture; Deployment & Migration; eDiscovery and Compliance; Exchange Extensibility; Manageability and Support; Outlook, OWA, and Mobility; and Security and Protection. Now you can get the recordings of the sessions @ http://channel9.msdn.com/events/mec/2014 . Visit the sessions...(read more)

Posted by on 24 April 2014 | 12:01 pm

Weekly Roundup: Tips and Tricks from Around the World

Join Us Monday April 28th for Live Webcast with Special Guest Chris Finlan from Data Zen. Join us Monday at 12:00 pm Eastern for a special edition of Mid-Day Cafe hosted by the MTC Philadelphia. We will feature special guest Chris Finlan who will be covering/demoing the Business Intelligence capabilities of Data Zen. Dr. Device will also present as well as our weekly coverage of Microsoft tech news and more. The agenda for this week: Broadcast Opening Microsoft Weekly Tech News Device of the Week - Featuring Dr. Device Bits & Bytes - Special Guest Chris Finlan. Datazen is a long time Microsoft partner that provides a powerful data visualization platform that integrates seamlessly with your Microsoft stack to provide the following benefits: Datazen has a mobile first approach and was designed from the ground up to provide cross-platform support and native applications for all mobile devices. The touch based dashboard designer and rich UI experience for the end-user that is centered around the Windows 8 platform The deep integration with all the parts of the Microsoft stack you own and are licensed for - SQL Server, SSAS, SharePoint, Excel, Azure, Windows 8, and Windows Phone The ability to leverage custom map shapes and other visualizations of any type to provide a truly unique end-user experience customized to their industry needs The unique ability to white-label the product and provide customers a fully-branded solution in any app store you choose (iOS, Android, Windows Phone, Windows 8) The freemium licensing model that allows any user of the organization to start using the product with Excel spreadsheets at no cost, with the premium step-up offering against live data sources with several additional features available at a cost savings of hundreds of thousands of dollars a year over competitor solutions like Tableau App of the Week Mailbag Open Q&A Upcoming Events Close To learn more, please visit this site.  

Posted by on 24 April 2014 | 12:00 pm

Event: Openness Fest at Microsoft Mexico

Registration: http://bit.ly/OpennessFest

Posted by on 24 April 2014 | 11:34 am

We've moved

You may have noticed that the previous post on this blog is from several years ago.  This doesn’t mean the Visual Studio debugger team has stopped blogging about the work we’re doing on your behalf, but we moved our blogging to the Diagnostics channel on the Visual Studio ALM blog. Additionally for a great overview of features we’ve added in Visual Studio 2013 see the following posts on the Visual Studio blog that will point you to the corresponding blog posts: Visual Studio 2013 Diagnostics Investments Enhancements to Debugging and Profiling in Visual Studio 2013 Update 2

Posted by on 24 April 2014 | 11:09 am

Enhancing learning through the cloud

MyCloud, an e-learning platform developed by Microsoft Research Asia, helps primary students optimize the time they spend studying by providing an interactive space in which students and teachers can collaborate, explore, and learn.  ...(read more)

Posted by on 24 April 2014 | 11:00 am

The Internet of Things is here right now

Posted By Barb EdsonGeneral Manager, Marketing and Business Development People are talking more and more about the Internet of Things, but it surprises me how much of the discussion is still coming from a future-focused perspective. While it’s always fun to imagine the future, the Internet of Things is already here today, and we don’t have to rely on our imaginations to see what it can do. It’s not some futuristic trend, but a real-world technology paradigm that is making a difference to businesses right now. As the tech world continues to focus on the Internet of Things, you’ll keep seeing discussions of billions of devices and vast volumes of data. That can feel overwhelming, but in practical terms, most of us really aren’t looking at billions of devices. You should be focused on the devices within your organization: at a few hundred point-of-sale terminals across your large retail operation, or dozens of digital scales in one store or factory that you choose to start with, or possibly just at a small handful of smartphones in your biggest department. What matters is what you’re working with right now; this is about the Internet of Your Things. Of course, not every business has figured out a strategy to take advantage of the Internet of Things yet, but that’s a right-now thing too: It’s easy to get started. You can just make a few key improvements within your organization and start seeing big impact. You don’t have to make overwhelming changes or overhaul your technology systems. The Internet of Things can be implemented one device at a time; to make it even easier, you probably already have some of those devices in place in the form of barcode scanners, point-of-sale devices, employee access badges or security cameras. What can one piece of data do for your business? How can it help you to have a clearer understanding of how many people are in your store each hour, or what the average temperature is in your largest machine room, or how long it takes to get prescriptions from your hospital pharmacy to the patients? Once you’ve identified the questions you want to answer, and the data that can help you answer them, you start to see which things can get you those answers. They may be sensors, barcode scanners, handheld point-of-sale devices, RFID tags, digital scales, or any number of other connected devices that can collect and relay data to the analytics tools that will help you find insights to move your business forward. You can start small, building on what you already have, adding new devices and tapping into new data streams as the Internet of Your Things grows into an ever-evolving feedback loop of insight and action. Here at Microsoft, we’ve spent years creating the platforms, services, business intelligence tools and partner ecosystem that can help you harness data and transform your business. What you do next is up to you; the Internet of Your Things is as big — or small — as you want it to be, and as unique as your business. Read more about the Internet of Your Things here.

Posted by on 24 April 2014 | 11:00 am

Modelo 303: Is it necessary to differentiate between invoices and credit memos?

Good afternoon everyone! We are receiving several questions about the need to differentiate between invoices and credit memos in the Modelo 303. We thought it would be useful to create a new blog article to share our view after analyzing the published documentation, and how we think it could be handled in Microsoft Dynamics NAV. From our point of view, it was not clear that invoices and credit memos had to be differentiated, although a modification of the Modelo 303 has indeed been published with reference to corrective invoices. After analyzing it and reviewing the VAT Law, the solution would in fact not be to make a modification in NAV differentiating between invoices and credit memos in the statement lines, since, as the VAT Law itself indicates (Art. 80), corrective invoices can be either positive or negative. It is clear that the most common case may be the handling of negative corrective invoices (known as credit memos), but positive ones also exist and must be taken into account. If we differentiated between invoices and credit memos in the VAT statement, not all scenarios would be covered, and that would not be correct. Therefore, the most correct solution would be to create new VAT posting groups to be used on all corrective invoices (whether positive or negative), in the same way as is done with unrealized VAT, and in this way include them correctly in the corresponding box of the VAT statement and thus comply with the law. In the VAT Statement lines in NAV, the information for all corrective invoices could be obtained by configuring lines of type VAT entries or G/L accounts with the new VAT posting groups created for this purpose, both for amount-type and tax-amount-type lines. With this we could differentiate without problems all the corrective invoices generated in the system. Here is the link to the BOE publication page where the published legal text can be seen: Impuesto sobre el Valor Añadido. Actividades empresariales y profesionales. Declaración censal. Información tributaria http://www.boe.es/boe/dias/2013/11/29/pdfs/BOE-A-2013-12489.PDF (pages 95065/95077). We hope this helps clarify any doubts you may have had regarding this new regulation. Regards and see you soon, the Dynamics NAV team.

Posted by on 24 April 2014 | 10:52 am

Universal Apps: Starting an exploration

The Template samples: http://code.msdn.microsoft.com/wpapps/Universal-Windows-app-cb3248c3 To use the templates you must be running Windows 8.1 and  upgrade your Visual Studio 2013 to: Now your file new project dialog will look like the following, you will see that there is a new category called Universal Apps, and four templates. End of the beginning of the universal app exploration....(read more)

Posted by on 24 April 2014 | 10:48 am

//build/, //learn/ and //publish/

Editor's note: The following post was written by Windows Phone Development MVP Peter Nowak

If you were one of the lucky ones to attend //build, or followed the //build announcements from home, a ton of new announcements might have caught your eye. Especially looking at Windows Phone 8.1, not only consumers but also developers get a lot of new functionality to enhance their apps, or to create new ones that span Windows and Windows Phone simultaneously. For Windows Phone, //build was just the starting point to get new bits to play with. As the dust settles a bit, it is time to take a look at what really has been made available in this package. Do you know everything about it already? Did you know that the new emulator features enhanced functionality to test geofencing properly, or that you can roam credentials among devices in a secure way? These were topics that were covered only briefly at //build, but there is another event coming up for you: //learn.

//learn is an online event by Microsoft, powered by the MVP community, to deliver the content you might need for developing great apps for Windows Phone and beyond. If you know the famous Jumpstart series by Microsoft for several products, then you already know how a lot of the sessions will be structured. But there is more: the sessions will be delivered in 8 different languages: Chinese, English, German, French, Portuguese, Spanish, Italian and Russian. These events also include local start times so that the information gets delivered as smoothly as possible. The exciting thing is that the base for this event was created by Windows Phone Development MVPs, who wanted to deliver content independent of a venue. Having the chance to test-drive this new concept with the WPDev Fusion - New Year Edition back in January led to //learn! And even the WPDev Fusion event was not the root: it all started last year in October with Windows Phone Week, a worldwide initiative of Windows Phone Development MVPs who ran 17 events worldwide with over 1500 attendees, bringing them closer to developing apps for Windows Phone.

Register yourself for April 24th to be a part of //learn here:
Chinese Simplified: https://vts.inxpo.com/Launch/QReg.htm?ShowKey=18934&LangLocaleID=2052&GroupID=ChineseS
Chinese Traditional: https://vts.inxpo.com/Launch/QReg.htm?ShowKey=18934&LangLocaleID=1028&GroupID=ChineseT
English: https://vts.inxpo.com/Launch/QReg.htm?ShowKey=18934&LangLocaleID=1033&GroupID=english
French: https://vts.inxpo.com/Launch/QReg.htm?ShowKey=18934&LangLocaleID=1036&GroupID=french
German: https://vts.inxpo.com/Launch/QReg.htm?ShowKey=18934&LangLocaleID=1031&GroupID=german
Italian: https://vts.inxpo.com/Launch/QReg.htm?ShowKey=18934&LangLocaleID=1040&GroupID=Italian
Russian: https://vts.inxpo.com/Launch/QReg.htm?ShowKey=18934&LangLocaleID=1049&GroupID=russian
Portuguese: https://vts.inxpo.com/Launch/QReg.htm?ShowKey=18934&LangLocaleID=1046&GroupID=portuguese
Spanish: https://vts.inxpo.com/Launch/QReg.htm?ShowKey=18934&LangLocaleID=1034&GroupID=spanish

Getting the tools and getting the education are only two parts of a trinity. This is where //publish comes in. //publish is a hackathon with a shifted focus. Instead of starting a new app from scratch (which you still can do), the idea here is to finish an already started app and get it published to the Store. And the base idea here is great.
How often have you started creating an app that never got published because you hit a problem you could not solve, or where you needed professional advice on best practices to get back on track? This is the strength of //publish. If you are interested in attending a //publish event on May 16th and 17th, you might want to check the website publishwindows.com for further information and to find a venue close to you. The event is also supported by MVPs and Nokia Developer Champions. With this I hope you will have a lot of fun creating apps and learn how easy it is.

Posted by on 24 April 2014 | 10:40 am

Workaround for FileLoadException with Microsoft.WindowsAzure.Storage Version 3.1.0.1

Wer aktuell Probleme bei der Ausführung der Windows Azure Storage Bibliothek Version 3.1.0.1 hat und die Fehlermeldung bekommt, dass die Assembly Microsoft.Data.Services.Client, Version 5.6.0.0 nicht gefunden werden kann, wird der folgende Workaround interessieren. Problem Wenn ich stand heute (24.04.2014) das NuGet-Paket für die Windows Azure Storage Bibliothek installieren möchte, erhalte ich die folgende Fehlermeldung, sobald ich Blobs, Tables oder Queues erstellen möchte. An unhandled exception of type 'System.IO.FileLoadException' occurred in Microsoft.WindowsAzure.Storage.dll Additional information: Die Datei oder Assembly "Microsoft.Data.Services.Client, Version=5.6.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35" oder eine Abhängigkeit davon wurde nicht gefunden. Die gefundene Manifestdefinition der Assembly stimmt nicht mit dem Assemblyverweis überein. (Ausnahme von HRESULT: 0x80131040) Lösung Das Problem liegt anscheinend darin begründet, dass bei der Installation des NuGet-Pakets WindowsAzure.Storage die Assembly Microsoft.Data.Services.Client in der Version 5.6.1 installiert wird, womit die Azure Storage Bibliothek Version 3.1.0.1 nicht kompatibel ist. Also deinstalliere ich alle relevanten Assemblies und installiere sie mit der korrekten Version erneut. Für die folgenden Schritte kann es hilfreich sein, die Datei packages.config zu öffnen, damit man weiß, wie die NuGet-Pakete heißen. <?xml version="1.0" encoding="utf-8"?> <packages>   <package id="Microsoft.Data.Edm" version="5.6.1" targetFramework="net45" />   <package id="Microsoft.Data.OData" version="5.6.1" targetFramework="net45" />   <package id="Microsoft.Data.Services.Client" version="5.6.1" targetFramework="net45" />   <package id="Microsoft.WindowsAzure.ConfigurationManager" version="1.8.0.0" targetFramework="net45" />   <package id="Newtonsoft.Json" version="5.0.8" targetFramework="net45" />   <package id="System.Spatial" version="5.6.1" targetFramework="net45" />   <package id="WindowsAzure.Storage" version="3.1.0.1" targetFramework="net45" /> </packages> Über das Menü Tools, Library Package Manager die Package Manager Console aufrufen. Über die Package Manager Console deinstallieren wir der Reihe nach folgende Pakete: WindowsAzure.Storage Microsoft.Data.Services.Client Microsoft.Data.OData Microsoft.Data.Edm System.Spatial Die jeweils erste und fettgedruckte Zeile enthält den einzugebenden Befehl. PM> UnInstall-Package WindowsAzure.Storage Removing 'WindowsAzure.Storage 3.1.0.1' from ConsoleQueues. Successfully removed 'WindowsAzure.Storage 3.1.0.1' from ConsoleQueues. Uninstalling 'WindowsAzure.Storage 3.1.0.1'. Successfully uninstalled 'WindowsAzure.Storage 3.1.0.1'. PM> UnInstall-Package Microsoft.Data.Services.Client Removing 'Microsoft.Data.Services.Client 5.6.1' from ConsoleQueues. Successfully removed 'Microsoft.Data.Services.Client 5.6.1' from ConsoleQueues. Uninstalling 'Microsoft.Data.Services.Client 5.6.1'. Successfully uninstalled 'Microsoft.Data.Services.Client 5.6.1'. PM> Uninstall-Package Microsoft.Data.OData Removing 'Microsoft.Data.OData 5.6.1' from ConsoleQueues. Successfully removed 'Microsoft.Data.OData 5.6.1' from ConsoleQueues. Uninstalling 'Microsoft.Data.OData 5.6.1'. Successfully uninstalled 'Microsoft.Data.OData 5.6.1'. PM> UnInstall-Package Microsoft.Data.Edm Removing 'Microsoft.Data.Edm 5.6.1' from ConsoleQueues. Successfully removed 'Microsoft.Data.Edm 5.6.1' from ConsoleQueues. Uninstalling 'Microsoft.Data.Edm 5.6.1'. Successfully uninstalled 'Microsoft.Data.Edm 5.6.1'. 
Solution

The problem apparently stems from the fact that installing the NuGet package WindowsAzure.Storage pulls in the assembly Microsoft.Data.Services.Client in version 5.6.1, with which the Azure Storage library version 3.1.0.1 is not compatible. So I uninstall all the relevant assemblies and reinstall them with the correct version. For the following steps it can help to open the file packages.config so you know the names of the NuGet packages.

<?xml version="1.0" encoding="utf-8"?>
<packages>
  <package id="Microsoft.Data.Edm" version="5.6.1" targetFramework="net45" />
  <package id="Microsoft.Data.OData" version="5.6.1" targetFramework="net45" />
  <package id="Microsoft.Data.Services.Client" version="5.6.1" targetFramework="net45" />
  <package id="Microsoft.WindowsAzure.ConfigurationManager" version="1.8.0.0" targetFramework="net45" />
  <package id="Newtonsoft.Json" version="5.0.8" targetFramework="net45" />
  <package id="System.Spatial" version="5.6.1" targetFramework="net45" />
  <package id="WindowsAzure.Storage" version="3.1.0.1" targetFramework="net45" />
</packages>

Open the Package Manager Console via the Tools menu, Library Package Manager. In the Package Manager Console we uninstall the following packages, in this order:

WindowsAzure.Storage
Microsoft.Data.Services.Client
Microsoft.Data.OData
Microsoft.Data.Edm
System.Spatial

The first line of each of the following blocks (the one starting with PM>) is the command to enter.

PM> UnInstall-Package WindowsAzure.Storage
Removing 'WindowsAzure.Storage 3.1.0.1' from ConsoleQueues.
Successfully removed 'WindowsAzure.Storage 3.1.0.1' from ConsoleQueues.
Uninstalling 'WindowsAzure.Storage 3.1.0.1'.
Successfully uninstalled 'WindowsAzure.Storage 3.1.0.1'.

PM> UnInstall-Package Microsoft.Data.Services.Client
Removing 'Microsoft.Data.Services.Client 5.6.1' from ConsoleQueues.
Successfully removed 'Microsoft.Data.Services.Client 5.6.1' from ConsoleQueues.
Uninstalling 'Microsoft.Data.Services.Client 5.6.1'.
Successfully uninstalled 'Microsoft.Data.Services.Client 5.6.1'.

PM> Uninstall-Package Microsoft.Data.OData
Removing 'Microsoft.Data.OData 5.6.1' from ConsoleQueues.
Successfully removed 'Microsoft.Data.OData 5.6.1' from ConsoleQueues.
Uninstalling 'Microsoft.Data.OData 5.6.1'.
Successfully uninstalled 'Microsoft.Data.OData 5.6.1'.

PM> UnInstall-Package Microsoft.Data.Edm
Removing 'Microsoft.Data.Edm 5.6.1' from ConsoleQueues.
Successfully removed 'Microsoft.Data.Edm 5.6.1' from ConsoleQueues.
Uninstalling 'Microsoft.Data.Edm 5.6.1'.
Successfully uninstalled 'Microsoft.Data.Edm 5.6.1'.

PM> UnInstall-Package System.Spatial
Removing 'System.Spatial 5.6.1' from ConsoleQueues.
Successfully removed 'System.Spatial 5.6.1' from ConsoleQueues.
Uninstalling 'System.Spatial 5.6.1'.
Successfully uninstalled 'System.Spatial 5.6.1'.

Now install version 5.6.0 of Microsoft.Data.Services.Client, and then WindowsAzure.Storage again.

PM> Install-Package Microsoft.Data.Services.Client -Version 5.6.0
Attempting to resolve dependency 'Microsoft.Data.OData (= 5.6.0)'.
Attempting to resolve dependency 'System.Spatial (= 5.6.0)'.
Attempting to resolve dependency 'Microsoft.Data.Edm (= 5.6.0)'.
Installing 'System.Spatial 5.6.0'.
You are downloading System.Spatial from Microsoft Corporation, the license agreement to which is available at http://go.microsoft.com/?linkid=9809688. Check the package for additional dependencies, which may come with their own license agreement(s). Your use of the package and dependencies constitutes your acceptance of their license agreements. If you do not accept the license agreement(s), then delete the relevant components from your device.
Successfully installed 'System.Spatial 5.6.0'.
Installing 'Microsoft.Data.Edm 5.6.0'.
You are downloading Microsoft.Data.Edm from Microsoft Corporation, the license agreement to which is available at http://go.microsoft.com/?linkid=9809688. Check the package for additional dependencies, which may come with their own license agreement(s). Your use of the package and dependencies constitutes your acceptance of their license agreements. If you do not accept the license agreement(s), then delete the relevant components from your device.
Successfully installed 'Microsoft.Data.Edm 5.6.0'.
Installing 'Microsoft.Data.OData 5.6.0'.
You are downloading Microsoft.Data.OData from Microsoft Corporation, the license agreement to which is available at http://go.microsoft.com/?linkid=9809688. Check the package for additional dependencies, which may come with their own license agreement(s). Your use of the package and dependencies constitutes your acceptance of their license agreements. If you do not accept the license agreement(s), then delete the relevant components from your device.
Successfully installed 'Microsoft.Data.OData 5.6.0'.
Installing 'Microsoft.Data.Services.Client 5.6.0'.
You are downloading Microsoft.Data.Services.Client from Microsoft Corporation, the license agreement to which is available at http://go.microsoft.com/?linkid=9809688. Check the package for additional dependencies, which may come with their own license agreement(s). Your use of the package and dependencies constitutes your acceptance of their license agreements. If you do not accept the license agreement(s), then delete the relevant components from your device.
Successfully installed 'Microsoft.Data.Services.Client 5.6.0'.
Adding 'System.Spatial 5.6.0' to ConsoleQueues.
Successfully added 'System.Spatial 5.6.0' to ConsoleQueues.
Adding 'Microsoft.Data.Edm 5.6.0' to ConsoleQueues.
Successfully added 'Microsoft.Data.Edm 5.6.0' to ConsoleQueues.
Adding 'Microsoft.Data.OData 5.6.0' to ConsoleQueues.
Successfully added 'Microsoft.Data.OData 5.6.0' to ConsoleQueues.
Adding 'Microsoft.Data.Services.Client 5.6.0' to ConsoleQueues.
Successfully added 'Microsoft.Data.Services.Client 5.6.0' to ConsoleQueues.

PM> Install-Package WindowsAzure.Storage
Attempting to resolve dependency 'Microsoft.Data.OData (≥ 5.6.0)'.
Attempting to resolve dependency 'System.Spatial (= 5.6.0)'.
Attempting to resolve dependency 'Microsoft.Data.Edm (= 5.6.0)'.
Attempting to resolve dependency 'Newtonsoft.Json (≥ 5.0.6)'.
Attempting to resolve dependency 'Microsoft.Data.Services.Client (≥ 5.6.0)'.
Attempting to resolve dependency 'Microsoft.WindowsAzure.ConfigurationManager (≥ 1.8.0.0)'.
Installing 'WindowsAzure.Storage 3.1.0.1'.
You are downloading WindowsAzure.Storage from Microsoft, the license agreement to which is available at http://go.microsoft.com/fwlink/?LinkId=331471. Check the package for additional dependencies, which may come with their own license agreement(s). Your use of the package and dependencies constitutes your acceptance of their license agreements. If you do not accept the license agreement(s), then delete the relevant components from your device.
Successfully installed 'WindowsAzure.Storage 3.1.0.1'.
Adding 'WindowsAzure.Storage 3.1.0.1' to ConsoleQueues.
Successfully added 'WindowsAzure.Storage 3.1.0.1' to ConsoleQueues.

This solved the problem described above for me. I came across the fix thanks to the post "Missing Microsoft.Data.Services.Client version 5.6 on Azure Websites" on Stack Overflow; many thanks for that! If that was too much text, here are all the commands to enter, summarized once more:

UnInstall-Package WindowsAzure.Storage
UnInstall-Package Microsoft.Data.Services.Client
Uninstall-Package Microsoft.Data.OData
UnInstall-Package Microsoft.Data.Edm
UnInstall-Package System.Spatial
Install-Package Microsoft.Data.Services.Client -Version 5.6.0
Install-Package WindowsAzure.Storage

For comparison, this is what packages.config looks like afterwards. You can see that version 5.6.0 is now installed.

<?xml version="1.0" encoding="utf-8"?>
<packages>
  <package id="Microsoft.Data.Edm" version="5.6.0" targetFramework="net45" />
  <package id="Microsoft.Data.OData" version="5.6.0" targetFramework="net45" />
  <package id="Microsoft.Data.Services.Client" version="5.6.0" targetFramework="net45" />
  <package id="Microsoft.WindowsAzure.ConfigurationManager" version="1.8.0.0" targetFramework="net45" />
  <package id="Newtonsoft.Json" version="5.0.8" targetFramework="net45" />
  <package id="System.Spatial" version="5.6.0" targetFramework="net45" />
  <package id="WindowsAzure.Storage" version="3.1.0.1" targetFramework="net45" />
</packages>
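As an aside that is not part of the original workaround: manifest-mismatch FileLoadExceptions like this one can sometimes also be handled without downgrading, by adding an assembly binding redirect to the project's app.config or web.config. Whether that actually works with WindowsAzure.Storage 3.1.0.1 and Microsoft.Data.Services.Client 5.6.1 would need to be verified, so treat the following only as a sketch of the general technique.

<configuration>
  <runtime>
    <assemblyBinding xmlns="urn:schemas-microsoft-com:asm.v1">
      <dependentAssembly>
        <!-- Sketch only: redirect the 5.6.0.0 reference inside
             Microsoft.WindowsAzure.Storage.dll to the 5.6.1.0 assembly
             that NuGet installed. -->
        <assemblyIdentity name="Microsoft.Data.Services.Client"
                          publicKeyToken="31bf3856ad364e35" culture="neutral" />
        <bindingRedirect oldVersion="0.0.0.0-5.6.1.0" newVersion="5.6.1.0" />
      </dependentAssembly>
    </assemblyBinding>
  </runtime>
</configuration>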

Posted by on 24 April 2014 | 10:25 am