Microsoft Azure Web Sites: Deploying WordPress to a virtual directory within the Azure web site

Microsoft Azure Web Sites allows you to create a virtual directory within a site, and there are many advantages to this. Consider a scenario where your org's site is deployed to the root, http://<sitename>.azurewebsites.net, and you now want separate branches for different departments within your org, for example:

http://<sitename>.azurewebsites.net/marketing
http://<sitename>.azurewebsites.net/sales
http://<sitename>.azurewebsites.net/hr

Another example is where you want to set up a blog within your site. In this article I will demonstrate how to deploy WordPress to a virtual directory called BLOG within my site. Here is my current set-up:

SiteName: Kaushal
HostName: kaushal.azurewebsites.net
Application: ASP.NET MVC
No databases are currently linked to my site.

I will host WordPress under my site so that it is accessible at http://kaushal.azurewebsites.net/blog. Here is what we need to do: create a virtual directory called BLOG within my site via the Azure portal and link a MySQL database to the site; then, on my local machine, download and install WordPress via WebMatrix and deploy it to the virtual directory we created above. Sounds easy, right? Let's go ahead and deploy it.

Microsoft Azure Portal

Log on to the Azure portal. Go to the CONFIGURE page for the site and scroll to the "virtual applications and directories" section at the bottom of the page. Add an entry as seen in the below image. Click on SAVE.

Now go to the LINKED RESOURCES page and link a MySQL database to your site. NOTE: Choose an existing MySQL DB or create a new one. If you already have a free MySQL DB associated with your subscription but want a separate MySQL database for this application, you will have to purchase a plan from ClearDB.

Once linked, go to the DASHBOARD page. Under the "quick glance" section a hyperlink called "View connection strings" will be created. NOTE: You could retrieve the connection string parameters from the LINKED RESOURCES page too. Click on MANAGE in the bottom pane for the site. This will redirect you to the ClearDB site, which will provide you with the following: Database, Data Source, User Id, and Password.

Download and save the publishsettings file for the website by clicking the hyperlink "Download the publish profile" under the "quick glance" section of the DASHBOARD page.

Local Machine

Launch Microsoft WebMatrix and click on New -> App Gallery. Select WordPress from the App Gallery and click on Next. This will take you through the WordPress setup. Accept the EULA by clicking on "I ACCEPT". Once done, it will start downloading the contents to your local machine (C:\Users\<username>\Documents\My Web Sites\WordPress). During this process it allows you to configure certain application parameters, as shown below. Once you specify the parameters and click on Next, it proceeds with the installation.

Once installed, click on "Copy user names and passwords". This will copy the details to the clipboard, which you could save in a text file. Click on OK. Now click on Publish. This will prompt you with another window. Click on "Import publish profile" and point it to the location where we saved the publishsettings file we downloaded earlier. Once selected, it will auto-populate the parameters from the publishsettings file.
We need to modify the following sections as shown below:

Site name: kaushal\wordpress (physical path location relative to the root site)
Destination URL: http://kaushal.azurewebsites.net/blog (virtual directory name)

NOTE: Don't choose FTP as the protocol, as it doesn't allow you to publish databases.

Click on Validate Connection. Once validated, you will see the confirmation. Click on Save. This will take you to the Publish Compatibility page. Click on Continue. Once the compatibility check has been performed, click on Continue again. It will display the list of files that will be deployed to the server. Click on Continue to start the deployment. Once publishing is complete you can open the log file and analyze it. Click on the hyperlink as shown below to browse to the site.

HTH,
Kaushal

Posted by on 19 April 2014 | 5:34 am

[Sample Of Apr 18th] How to Import vCard Files in Office 365 Exchange Online

Sample download: http://code.msdn.microsoft.com/How-to-Import-vCard-Files-ffa0ff50

The vCard file format is supported by many email clients and email services, but Outlook Web App currently supports importing only a single .CSV file. In this application, we demonstrate how to import multiple vCard files into Office 365 Exchange Online:

1. Get a single file, or all the vCard files in the folder;
2. Read the contact information from the vCard file;
3. Create a new contact and set its properties;
4. Save the contact;
5. Repeat steps 2-4 for all the vCard files.

You can find more code samples that demonstrate the most typical programming scenarios by using the Microsoft All-In-One Code Framework Sample Browser or the Sample Browser Visual Studio extension. They give you the flexibility to search for samples, download samples on demand, manage the downloaded samples in a centralized place, and automatically be notified about sample updates. If this is the first time you have heard about the Microsoft All-In-One Code Framework, please watch the introduction video on Microsoft Showcase, or read the introduction on our homepage http://1code.codeplex.com/.
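The downloadable sample contains the full implementation. As a rough, hedged sketch of the same steps (this is not the sample's actual code), a simplified importer using the EWS Managed API might look like the following; it assumes vCards that only carry FN and EMAIL lines, and the mailbox address, password, and folder path are placeholders.

using System;
using System.IO;
using Microsoft.Exchange.WebServices.Data;

class VCardImporter
{
    static void Main()
    {
        ExchangeService service = new ExchangeService(ExchangeVersion.Exchange2010_SP2);
        service.Credentials = new WebCredentials("user@contoso.onmicrosoft.com", "password"); // placeholder credentials
        // Autodiscover resolves the EWS endpoint for the Office 365 mailbox.
        service.AutodiscoverUrl("user@contoso.onmicrosoft.com", url => url.StartsWith("https://"));

        // Step 1: get all the vCard files in the folder.
        foreach (string file in Directory.GetFiles(@"C:\vCards", "*.vcf"))
        {
            Contact contact = new Contact(service);

            // Step 2: read the contact information (only FN and EMAIL are handled in this sketch;
            // the full name is stored as the given name purely for brevity).
            foreach (string line in File.ReadAllLines(file))
            {
                if (line.StartsWith("FN:"))
                    contact.GivenName = line.Substring(3).Trim();
                else if (line.StartsWith("EMAIL"))
                    contact.EmailAddresses[EmailAddressKey.EmailAddress1] =
                        new EmailAddress(line.Substring(line.IndexOf(':') + 1).Trim());
            }

            // Steps 3 and 4: create the contact and save it to the default Contacts folder.
            contact.Save(WellKnownFolderName.Contacts);
        }
    }
}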

Posted by on 19 April 2014 | 3:11 am

Windows Phone Dev Center Changes + Credit card validation no longer required

After Build we have had many announcements regarding the Windows Store and the Store Dev Centers. The objective of this post is not to talk about the improvements to the Windows and Windows Phone Store themselves, so before going to the dev side let's just say that you now have the opportunity to build once and deploy to both stores thanks to Universal Apps, and not only that: customers buy ONCE and have the app on both operating systems at no extra cost ^_^

Let's go through the major changes to the Store Dev accounts:

1) Credit card validation no longer required for the registration process

This is something that students especially have been asking about for years: being able to create their own developer account without needing a credit card for account verification (remember that for students the account is FREE thanks to the BizSpark program, but it used to require verifying your identity with a credit card). Also, when your account reaches its yearly renewal, you will have the same options, and we are now enabling PayPal as a payment method for renewal or even registration in the Windows Store (in markets where PayPal is currently supported).

2) New feedback features

Microsoft is slowly rolling out a program whereby developers can respond to user reviews of their apps and games. Here is a funny demonstration. As a developer you will receive these kinds of notifications, so you can keep track of what's going on with your "open cases". And it is not only for debating personal opinions about the app: Windows Phone users are encouraged to report any questionable developer response via the reporting link in the "details" section of the app's description.

As a user, remember that your feedback can make the applications you own better, which in the end is what you want when you purchase a game or an app. As a developer, remember that your users own your app because they think it is cool; they like it and they use it, so don't disappoint them: provide the best quality, the best performance, and regular updates.

3) Linking Windows Store and Windows Phone apps to create a universal Windows app

Tired of paying twice for the same app? Universal Apps deliver a 'get once and download for all compatible Windows devices' customer experience, which we expect to increase both paid and free app downloads across device types. Also, if you are integrating in-app purchases in your apps, this linked app experience extends your durables and consumables to be used in both stores using the same identifier.

4) App name reservation

Developers can now reserve names for new Windows Phone apps up to 12 months in advance of release.

5) Consolidated price tiers

We have simplified the pricing that applies to paid apps and in-app purchasing, and expanded the Windows developer opportunity with the addition of US$0.99 and US$1.39 price tiers to the Windows Store.

6) Consistent certification policies

7) Reduced certification times: 10x faster!

We have reduced the app certification workflow time; in most cases certification now takes a few hours instead of a few days (previously).

COMING SOON:
Promotional pricing
Pre-submission validation checks
Touch-enabled device targeting

Summary: As you can see there are GREAT improvements and changes in both the Windows and Windows Phone Store, everything pointing in the same direction: build once, deploy everywhere.
You have no excuse not to start deploying for Windows Phone and to build your own revenue model and success on the Windows Store! Happy submission and - May the code be with you - Sources: http://www.engadget.com/2014/04/18/microsoft-app-store-developer-responses-roll-out/ http://blogs.windows.com/windows_phone/b/windowsphone/archive/2014/04/17/you-may-soon-get-a-response-to-your-windows-phone-app-review.aspx http://blogs.windows.com/windows/b/buildingapps/archive/2014/04/14/dev-center-now-open-for-windows-phone-8-1-and-universal-windows-app-submissions.aspx

Posted by on 19 April 2014 | 12:50 am

Blog updated to v1.2 [19/04/2014]

Informational: If you follow my blog, this mini-post will be of interest to you; I have carried out a maintenance update. Framework update: upgrade to version 0.4.2 of Ghost, and upgrade to the most recent version of the theme (gamma ... (read more)

Posted by on 19 April 2014 | 12:40 am

Registration starts - Redmond Interoperability Protocols Plugfest 2014!!

Microsoft-hosted protocol plugfests provide software developers with the opportunity to learn more about the Microsoft protocols and to test their implementations of the Microsoft Open Specifications. Hosted on the Microsoft Redmond campus, each plugfest focuses on a specific task or technology area. Presentations are conducted by Microsoft engineers, who are also available for one-on-one and group discussions and to provide necessary assistance with configuration and running of the interoperability...(read more)

Posted by on 19 April 2014 | 12:39 am

Microsoft's IoT (Internet of Things) = Internet of Your Things

This week at an event in San Francisco, the Azure Intelligent Systems Service was announced: a service on Azure for managing M2M, sensor clouds, and the like, so that connected things can work together and the data they generate can be collected and put to use. http://www.InternetOfYourThings.com The keyword is Internet of Your Things: the idea of starting by connecting your own things. A preview of the Azure service for managing various devices, and of the device-side SDK, is now available. http://blogs.msdn.com/b/windows-embedded/archive/2014/04/15/microsoft-azure-intelligent-systems-service-limited-public-preview-now-available.aspx Please give it a try. At de:code at the end of May, I plan to cover some information about this service in a session as well. The San Francisco event can be streamed at https://www.microsoft.com/en-us/server-cloud/whats-new.aspx#fbid=YJudAz7bV_f?bid=YJudAz7bV_f so please check that out too. SQL Server 2014 and the Analytics Platform System were also announced. With things like the Windows kernel running on Intel's Galileo and devices running the .NET Micro Framework shown at BUILD, it is getting exciting.

Posted by on 19 April 2014 | 12:31 am

Feature comparison: EWS vs. EWS Managed API

Are you a .NET developer who builds custom applications using the Exchange Web Services (EWS) Managed API or EWS (auto-generated proxies)? Then this is for you. The EWS Managed API provides an intuitive interface for developing client applications that use EWS. The API enables unified access to Exchange resources, while using Outlook-compatible business logic. In short, you can use the EWS Managed API to access EWS in versions of Exchange starting with Exchange Server 2007 Service Pack 1 (SP1), including...(read more)
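As a point of reference (my own hedged sketch, not code from the linked comparison), this is roughly what the "intuitive interface" looks like with the Managed API: a few strongly typed calls replace the hand-built SOAP requests you would otherwise issue against the auto-generated proxies. The credentials and mailbox address below are placeholders.

using System;
using Microsoft.Exchange.WebServices.Data;

class EwsManagedApiSample
{
    static void Main()
    {
        // Exchange2007_SP1 is the earliest version the Managed API supports.
        ExchangeService service = new ExchangeService(ExchangeVersion.Exchange2007_SP1);
        service.Credentials = new WebCredentials("user@contoso.com", "password"); // placeholder credentials
        service.AutodiscoverUrl("user@contoso.com", url => url.StartsWith("https://"));

        // List the subjects of the five most recent items in the Inbox.
        foreach (Item item in service.FindItems(WellKnownFolderName.Inbox, new ItemView(5)))
            Console.WriteLine(item.Subject);
    }
}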

Posted by on 18 April 2014 | 11:23 pm

MIDMARKET SOLUTION PROVIDER – April 2014 Readiness Update

New video: Office 365 training Grow your cloud expertise and help your customers move to the cloud. Learn how to get started with Microsoft Office 365 training—for sales and technical professionals—through this fun new video. Do you know about Practice Accelerator? Practice Accelerator sessions, designed for technical consultants and architects, enable you and your organization to increase skills in a specific solution or services area. Learn more about Practice Accelerator through this fun, informative, short video.

Posted by on 18 April 2014 | 7:16 pm

ASP.NET issue with the auto-generated designer file

I have been facing this issue with VS2013: whenever I change my .aspx file (.NET Framework 4.5) containing an UpdatePanel/ScriptManager, the designer file generated for those controls is:

/// <summary>
/// ScriptManager1 control.
/// </summary>
/// <remarks>
/// Auto-generated field.
/// To modify move field declaration from designer file to code-behind file.
/// </remarks>
protected global::System.Web.UI.WebControls.ScriptManager ScriptManager1;

/// <summary>
/// UpdatePanel1 control.
/// </summary>
/// <remarks>
/// Auto-generated field.
/// To modify move field declaration from designer file to code-behind file.
/// </remarks>
protected global::System.Web.UI.WebControls.UpdatePanel UpdatePanel1;

But I have to change it to:

/// <summary>
/// ScriptManager1 control.
/// </summary>
/// <remarks>
/// Auto-generated field.
/// To modify move field declaration from designer file to code-behind file.
/// </remarks>
protected global::System.Web.UI.ScriptManager ScriptManager1;

/// <summary>
/// UpdatePanel1 control.
/// </summary>
/// <remarks>
/// Auto-generated field.
/// To modify move field declaration from designer file to code-behind file.
/// </remarks>
protected global::System.Web.UI.UpdatePanel UpdatePanel1;

Fix:

1) Reset System.Web.UI.WebControls.UpdatePanel back to System.Web.UI.UpdatePanel (same for ScriptManager) every time the .aspx/.ascx file is modified. [Recommended] but tedious.

2) I found that adding a Register directive at the top of the page seemed to properly override the default behavior of the designer, which picks the 4.0 location for the 3.5 control (I think that is the underlying issue: it is a 4.0 designer that is backwards compatible with 3.5). [Recommended] but be careful while adding controls with the <asp: prefix.

<%@ Register TagPrefix="asp" Namespace="System.Web.UI" Assembly="System.Web.Extensions" %>
...
<asp:ScriptManager runat="server" ID="smLocationsMap" />

3) You can include System.Web.dll and System.Web.Design.dll in your bin folder (security and other issues). [Not recommended]

I hope this is helpful :)

Posted by on 18 April 2014 | 6:30 pm

Unit of Work - Expanded

In a previous post I discussed asynchronous repositories. A closely related and complementary design pattern is the Unit of Work pattern. In this post, I'll summarize the design pattern and cover a few non-conventional, but useful extensions.

Overview

The Unit of Work is a common design pattern used to manage the state changes to a set of objects. A unit of work abstracts all of the persistence operations and logic from other aspects of an application. Applying the pattern not only simplifies code that possesses persistence needs, but it also makes changing or otherwise swapping out persistence strategies and methods easy. A basic unit of work has the following characteristics:

Register New - registers an object for insertion.
Register Updated - registers an object for modification.
Register Removed - registers an object for deletion.
Commit - commits all pending work.
Rollback - discards all pending work.

Extensions

The basic design pattern supports most scenarios, but there are a few additional use cases that are typically not addressed. For stateful applications, it is usually desirable to support cancellation or simple undo operations by using deferred persistence. While this capability is covered via a rollback, there is not a way to interrogate whether a unit of work has pending changes. Imagine your application has the following requirements:

As a user, I should only be able to save when there are uncommitted changes.
As a user, I should be prompted when I cancel an operation with uncommitted changes.

To satisfy these requirements, we only need to make a couple of additions:

Unregister - unregisters pending work for an object.
Has Pending Changes - indicates whether the unit of work contains uncommitted items.
Property Changed - raises an event when a property has changed.

Generic Interface

After reconsidering what is likely the majority of all plausible usage scenarios, we now have enough information to create a general purpose interface.

public interface IUnitOfWork<T> : INotifyPropertyChanged where T : class
{
    bool HasPendingChanges { get; }
    void RegisterNew( T item );
    void RegisterChanged( T item );
    void RegisterRemoved( T item );
    void Unregister( T item );
    void Rollback();
    Task CommitAsync( CancellationToken cancellationToken );
}

Base Implementation

It would be easy to stop at the generic interface definition, but we can do better. It is pretty straightforward to create a base implementation that handles just about everything except the commit operation.
public abstract class UnitOfWork<T> : IUnitOfWork<T> where T : class
{
    private readonly IEqualityComparer<T> comparer;
    private readonly HashSet<T> inserted;
    private readonly HashSet<T> updated;
    private readonly HashSet<T> deleted;

    protected UnitOfWork()
    protected UnitOfWork( IEqualityComparer<T> comparer )

    protected IEqualityComparer<T> Comparer { get; }
    protected virtual ICollection<T> InsertedItems { get; }
    protected virtual ICollection<T> UpdatedItems { get; }
    protected virtual ICollection<T> DeletedItems { get; }
    public virtual bool HasPendingChanges { get; }

    protected virtual void OnPropertyChanged( PropertyChangedEventArgs e )
    protected virtual void AcceptChanges()
    protected abstract bool IsNew( T item )

    public virtual void RegisterNew( T item )
    public virtual void RegisterChanged( T item )
    public virtual void RegisterRemoved( T item )
    public virtual void Unregister( T item )
    public virtual void Rollback()
    public abstract Task CommitAsync( CancellationToken cancellationToken );

    public event PropertyChangedEventHandler PropertyChanged;
}

Obviously by now, you've noticed that we've added a few protected members to support the implementation. We use HashSet<T> to track all inserts, updates, and deletes. By using HashSet<T>, we can easily ensure we don't track an entity more than once. We can also now apply some basic logic, such as: an insert should never also be enqueued as an update, and a delete against an uncommitted insert should simply negate the insert. In addition, we add the ability to accept (e.g. clear) all pending work after the commit operation has completed successfully.

Supporting a Unit of Work Service Locator

Once we have all the previous pieces in place, we could again stop, but there are multiple ways in which a unit of work could be used in an application that we should consider:

Imperatively instantiated in code
Composed or injected via dependency injection
Centrally retrieved via a special service locator facade

The decision as to which approach to use is at a developer's discretion. In general, when composition or dependency injection is used, the implementation is handled by another library and some mediating object (ex: a controller) will own the logic as to when or if entities are added to the unit of work. When a service locator is used, most or all of the logic can be baked directly into an object to enable self-tracking. In the rest of this section, we'll explore a UnitOfWork singleton that plays the role of a service locator.

public static class UnitOfWork
{
    public static IUnitOfWorkFactoryProvider Provider { get; set; }
    public static IUnitOfWork<TItem> Create<TItem>() where TItem : class
    public static IUnitOfWork<TItem> GetCurrent<TItem>() where TItem : class
    public static void SetCurrent<TItem>( IUnitOfWork<TItem> unitOfWork ) where TItem : class
    public static IUnitOfWork<TItem> NewCurrent<TItem>() where TItem : class
}

Populating the Service Locator

In order to locate a unit of work, the locator must be backed with code that can resolve it. We should also consider composite applications where there may be many units of work defined by different sources. The UnitOfWork singleton is configured by supplying an instance to the static Provider property.

Unit of Work Factory Provider

The IUnitOfWorkFactoryProvider interface can simply be thought of as a factory of factories. It provides a central mechanism for the service locator to resolve a unit of work via all known factories.
In composite applications, implementers will likely want to use dependency injection. For ease of use, a default implementation is provided whose constructor accepts Func<IEnumerable<IUnitOfWorkFactory>>.

public interface IUnitOfWorkFactoryProvider
{
    IEnumerable<IUnitOfWorkFactory> Factories { get; }
}

Unit of Work Factory

The IUnitOfWorkFactory interface is used to register, create, and resolve units of work. Implementers have the option to map as many units of work to a factory as they like. In most scenarios, only one factory is required per application or composite component (ex: plug-in). A default implementation is provided that only requires the factory to register a function to create or resolve a unit of work for a given type. The Specification pattern is used to match or select the appropriate factory, but the exploration of that pattern is reserved for another time.

public interface IUnitOfWorkFactory
{
    ISpecification<Type> Specification { get; }
    IUnitOfWork<TItem> Create<TItem>() where TItem : class;
    IUnitOfWork<TItem> GetCurrent<TItem>() where TItem : class;
    void SetCurrent<TItem>( IUnitOfWork<TItem> unitOfWork ) where TItem : class;
}

Minimizing Test Setup

While all of the factory interfaces make it flexible to support a configurable UnitOfWork singleton, it is somewhat painful to set up test cases. If the required unit of work is not resolved, an exception will be thrown; however, if the test doesn't involve a unit of work, why should we have to set one up? To solve this problem, the service locator will internally create a compatible uncommittable unit of work instance whenever a unit of work cannot be resolved. This behavior allows self-tracking objects to be used without having to explicitly set up a mock or stub unit of work. You might be thinking that this behavior hides composition or dependency resolution failures, and that is true. However, any attempt to commit against these instances will throw an InvalidOperationException, indicating that the unit of work is uncommittable. This approach is the most sensible method of avoiding unnecessary setups, while not completely hiding resolution failures. Whenever a unit of work fails in this manner, a developer should realize that they have either not set up their test correctly (ex: for verifying commit behavior) or resolution is failing at run time.

Examples

The following outlines some scenarios as to how a unit of work might be used.
For each example, we'll use the following model:

public class Person
{
    public int PersonId { get; set; }
    public string FirstName { get; set; }
    public string LastName { get; set; }
}

Implementing a Unit of Work with the Entity Framework

The following demonstrates a simple unit of work that is backed by the Entity Framework:

public class PersonUnitOfWork : UnitOfWork<Person>
{
    protected override bool IsNew( Person item )
    {
        // any unsaved item will have an unset id
        return item.PersonId == 0;
    }

    public override async Task CommitAsync( CancellationToken cancellationToken )
    {
        using ( var context = new MyDbContext() )
        {
            foreach ( var item in this.InsertedItems )
                context.People.Add( item );

            foreach ( var item in this.UpdatedItems )
                context.People.Attach( item );

            foreach ( var item in this.DeletedItems )
                context.People.Remove( item );

            await context.SaveChangesAsync( cancellationToken );
        }

        this.AcceptChanges();
    }
}

Using a Unit of Work to Drive User Interactions

The following example illustrates using a unit of work in a rudimentary Windows Presentation Foundation (WPF) window that contains buttons to add, remove, cancel, and apply (or save) changes to a collection of people. The recommended approach to working with presentation layers such as WPF is to use the Model-View-View Model (MVVM) design pattern. For the sake of brevity and demonstration purposes, this example will use simple, albeit difficult to test, event handlers. All of the persistence logic is contained within the unit of work, and the unit of work can report whether it has any pending work to help inform a user when there are changes. The unit of work can also be used to verify that the user truly wants to discard uncommitted changes, if there are any.
public partial class MyWindow : Window
{
    private readonly IUnitOfWork<Person> unitOfWork;

    public MyWindow() : this( new PersonUnitOfWork() ) { }

    public MyWindow( IUnitOfWork<Person> unitOfWork )
    {
        this.InitializeComponent();
        this.ApplyButton.IsEnabled = false;
        this.People = new ObservableCollection<Person>();
        this.unitOfWork = unitOfWork;
        this.unitOfWork.PropertyChanged +=
            ( s, e ) => this.ApplyButton.IsEnabled = this.unitOfWork.HasPendingChanges;
    }

    public Person SelectedPerson { get; set; }

    public ObservableCollection<Person> People { get; private set; }

    private void AddButton_Click( object sender, RoutedEventArgs e )
    {
        var person = new Person();
        // TODO: custom logic
        this.People.Add( person );
        this.unitOfWork.RegisterNew( person );
    }

    private void RemoveButton_Click( object sender, RoutedEventArgs e )
    {
        var person = this.SelectedPerson;
        if ( person == null ) return;
        this.People.Remove( person );
        this.unitOfWork.RegisterRemoved( person );
    }

    private async void ApplyButton_Click( object sender, RoutedEventArgs e )
    {
        await this.unitOfWork.CommitAsync( CancellationToken.None );
    }

    private void CancelButton_Click( object sender, RoutedEventArgs e )
    {
        if ( this.unitOfWork.HasPendingChanges )
        {
            var message = "Discard unsaved changes?";
            var title = "Save";
            var buttons = MessageBoxButton.YesNo;
            var answer = MessageBox.Show( message, title, buttons );
            if ( answer == MessageBoxResult.No ) return;
            this.unitOfWork.Rollback();
        }

        this.Close();
    }
}

Implementing a Self-Tracking Entity

There are many different ways and varying degrees of functionality that can be implemented for a self-tracking entity. The following is one of many possibilities that illustrates just enough to convey the idea. The first thing we need to do is create a factory.

public class MyUnitOfWorkFactory : UnitOfWorkFactory
{
    public MyUnitOfWorkFactory()
    {
        this.RegisterFactoryMethod( () => new PersonUnitOfWork() );
        // additional units of work could be defined here
    }
}

Then we need to wire up the service locator with a provider that contains the factory.

var factories = new IUnitOfWorkFactory[]{ new MyUnitOfWorkFactory() };
UnitOfWork.Provider = new UnitOfWorkFactoryProvider( () => factories );

Finally, we can refactor the entity to enable self-tracking.
public class Person
{
    private string firstName;
    private string lastName;

    public int PersonId { get; set; }

    public string FirstName
    {
        get
        {
            return this.firstName;
        }
        set
        {
            this.firstName = value;
            UnitOfWork.GetCurrent<Person>().RegisterChanged( this );
        }
    }

    public string LastName
    {
        get
        {
            return this.lastName;
        }
        set
        {
            this.lastName = value;
            UnitOfWork.GetCurrent<Person>().RegisterChanged( this );
        }
    }

    public static Person CreateNew()
    {
        var person = new Person();
        UnitOfWork.GetCurrent<Person>().RegisterNew( person );
        return person;
    }

    public void Delete()
    {
        UnitOfWork.GetCurrent<Person>().RegisterRemoved( this );
    }

    public Task SaveAsync()
    {
        return UnitOfWork.GetCurrent<Person>().CommitAsync( CancellationToken.None );
    }
}

Conclusion

In this article we examined the Unit of Work pattern, added a few useful extensions to it, and demonstrated some common use cases showing how you can apply the pattern. There are many implementations of the Unit of Work pattern, and the concepts outlined in this article are no more correct than any of the alternatives. Hopefully you finish this article with a better understanding of the pattern and its potential uses. Although I didn't explicitly discuss unit testing, my belief is that most readers will recognize the benefits and ease with which cross-cutting persistence requirements can be tested using a unit of work. I've attached all the code required to leverage the Unit of Work pattern as described in this article in order to accelerate your own development, should you choose to do so.
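As a closing illustration (my own sketch, not part of the attached code), this is roughly how the self-tracking Person and the UnitOfWork service locator defined above would be exercised from a small console program; it assumes the types shown earlier in the article are available.

using System;

class SelfTrackingDemo
{
    static void Main()
    {
        // Configure the service locator once at application startup,
        // exactly as shown in the wiring snippet above.
        var factories = new IUnitOfWorkFactory[] { new MyUnitOfWorkFactory() };
        UnitOfWork.Provider = new UnitOfWorkFactoryProvider( () => factories );

        // The entity registers itself with the current unit of work as it changes.
        var person = Person.CreateNew();
        person.FirstName = "Jane";
        person.LastName = "Doe";

        // True: the unit of work now has uncommitted work pending.
        Console.WriteLine( UnitOfWork.GetCurrent<Person>().HasPendingChanges );

        // Commit the pending insert and updates in a single operation.
        person.SaveAsync().Wait();
    }
}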

Posted by on 18 April 2014 | 5:28 pm

Using SSIS to Backup and Restore Extremely Large OLAP Databases

Working in the field of Business Intelligence I get the opportunity to work with some really large (read that as multi-terabyte) OLAP databases. Multi-terabyte OLAP databases, while not yet commonplace, are being seen with greater frequency, and they do present a few interesting challenges to developers and administrators. Performance tuning is one of the more obvious challenges, leading to discussions related to selection of the most appropriate storage mode and how to best partition the data. Purely from the perspective of query performance, MOLAP storage is going to provide better performance than would be expected with HOLAP or ROLAP storage. At these sizes, I/O throughput is a concern and there are definite benefits to using the StorageLocation property to distribute partition data across multiple disks.

A less obvious aspect of performance tuning involves the ability to back up and restore these large databases before Mr. Murphy applies his law and it becomes necessary to recover from some form of disaster. Databases can become inaccessible for a number of reasons, including but not limited to hardware failures and BI developers with administrator permissions fully processing dimensions. The prospect of failure and the need to recover from a disaster gives administrators of OLAP databases some really good reasons to be involved in planning and testing of backup and recovery operations. For reasons I'll explore in a bit, Analysis Services backup and restore operations on multi-terabyte databases do not occur with lightning speed. Likewise, if it becomes necessary to execute a full process of a multi-terabyte database, there's a pretty good chance that it's going to take more than just a few hours. Just imagine the fun of explaining to the company CEO, CFO, and CIO that you'll have their production database back online in a couple of weeks. Therefore, the ability to restore a really big database to a functional state within a reasonable period of time is probably more important than the ability to create a backup in a reasonably short period of time. At least the discussion with the CEO, CFO, and CIO will be substantially less painful if you can say something to the effect of "We should have the database fully restored in a few hours."

I recently had the opportunity to work with a customer who had a 35 hour window in which to create a backup of an OLAP database that occupied nearly three terabytes on disk. Because of the size of the database, the partitioning strategy involved distributing the data across eight separate LUNs. The reason for the 35 hour window was that a processing job was scheduled to begin execution on Sunday evenings at 11:00 PM. Because the database was in use, the backup job was scheduled to begin on Saturdays at noon. The problem was that the backup job would be terminated after 35 hours when the scheduled processing operation began executing. Since this was a production system, it was absolutely essential that the customer have a database backup and a plan to restore the data in the event of a disaster. In the absence of a database backup, the alternative was to have the database remain offline for a period of nine (9) days to allow the database to be fully processed.

So what are the options for implementing backup and recovery with very large OLAP databases, and what are the drawbacks of each approach? Let's take a look at the following options:

1. SAN Snapshot
2. Backup and Restore
3. Re-deploy and fully process
4. Synchronization
1. SAN Snapshot: This is an option that has been explored in several scale-out scenarios, solely for the purpose of moving metadata and data files from a dedicated processing server to one or more query servers (see Carl Rabeler's whitepaper entitled "Scale-Out Querying with Analysis Services Using SAN Snapshots" http://www.microsoft.com/en-us/download/details.aspx?id=18676). SAN snapshots are fantastic for backing up one or more LUNs for storage locally or at another site. There are essentially two types of SAN snapshot: Copy-on-Write and Split-Mirror. While this is greatly simplified, it's sufficient to know that a Copy-on-Write snapshot creates a snapshot of changes to stored data and affords rapid recovery of data, while a Split-Mirror snapshot references all of the data on a set of mirrored drives and grabs a snapshot of the entire volume, which simplifies the recovery process.

There are, however, some issues inherent in using SAN snapshots to create "backups" of databases. Full recovery using a Copy-on-Write snapshot requires that all previous snapshots be available. Creation of Split-Mirror snapshots tends to be slower than Copy-on-Write snapshots, and the storage requirements tend to increase over time. In either case, the storage requirements eventually would become rather onerous since there is no compression. Another consideration is that in order to generate a recoverable snapshot, all of the I/O on the server would have to be stopped. Add to that, it's very likely that the SAN snapshots will fall under the purview of someone known as a SAN or storage administrator and not a DBA. (Somehow, I rather doubt that the SAN or storage admin will be catching a lot of heat when a critical database goes belly up.) An important consideration is that each SAN vendor implements SAN snapshots in a somewhat different manner. In this case a SAN snapshot was merely an academic discussion, given that full snapshots of several volumes, none of which were mirrored, would have been required.

2. Backup and Restore: This is the most readily available approach to backing up and restoring an Analysis Services database, as the Backup/Restore functionality is built into the product.
This functionality can be readily accessed using the GUI in SQL Server Management Studio, by using XMLA commands like the following to create a backup:

<Backup xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <Object>
    <DatabaseID>Adventure Works</DatabaseID>
  </Object>
  <File>C:\PUBLIC\AWDEMO.ABF</File>
</Backup>

with the following XMLA command to restore:

<Restore xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <File>C:\PUBLIC\AWDEMO.ABF</File>
  <DatabaseName>Adventure Works</DatabaseName>
</Restore>

or implemented via the AMO API, using code similar to the following to create a backup:

using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using Microsoft.AnalysisServices;

namespace BKUP
{
  class Program
  {
    static void Main(string[] args)
    {
      Server asServer = new Server();
      asServer.Connect("localhost");
      Database asDB = asServer.Databases.FindByName("Adventure Works");
      asDB.Backup("C:\\PUBLIC\\ASDEMO.ABF");
      asServer.Disconnect();
      asServer.Dispose();
    }
  }
}

and the following AMO code to restore the database:

using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using Microsoft.AnalysisServices;

namespace RESTORE
{
  class Program
  {
    static void Main(string[] args)
    {
      Server asServer = new Server();
      asServer.Connect("localhost");
      asServer.Restore("C:\\PUBLIC\\ASDEMO.ABF", "Adventure Works", true);
      asServer.Disconnect();
      asServer.Dispose();
    }
  }
}

All three are relatively straightforward approaches to backup and restore, using functionality that is built into the product, and that makes this option very appealing. However, there are a few aspects of the native Backup/Restore functionality that make it problematic with extremely large databases. One factor that becomes an extremely important consideration is that the processes for Analysis Services Backup and Restore operations are fixed at three (yes, you read that right as 3) threads. The net result is that the native backup/restore operations are roughly comparable in performance to standard file copy operations. While it would be a nice feature to have, Analysis Services doesn't have functionality similar to the Differential Backups that are available in the SQL Server database engine. In this case, it was known that the database backup was being terminated without completing at 35 hours, so this was obviously not an option. Even if native backup had been an option, we knew that the restore operation would require more than 35 hours, making this a non-viable option.

3. Redeploy and fully process: This is obviously one solution, which requires nothing more than having either a copy of the database, in its current state, as a project or the XMLA script to re-create the database. On the positive side, the metadata would be pristine. One slight problem with this approach is that fully processing a multi-terabyte database is typically going to require multiple days, if not weeks, to complete. The amount of time required to fully process a large mission critical database is probably not going to make this an acceptable approach to disaster recovery. In this particular case, fully processing the database would have taken in excess of nine (9) days to complete, so that discussion with the CEO, CFO, and CIO would have been something less than pleasant and cordial.
4. Use Synchronization: Synchronization is another piece of functionality that is natively available in Analysis Services. The product documentation indicates that it can be used either to deploy a database from a staging server to a production server or to synchronize a database on a production server with changes made to the database on a staging server. In any event, the Synchronize functionality does allow an administrator to effectively create and periodically update a copy of an Analysis Services database on another server. This copies the data and metadata from a source server to a destination server. One of the benefits is that the database remains accessible on the destination server, affording users the ability to continue executing queries on the destination server until the synchronization completes, at which point queries are executed against the newly copied data. Much like Backup/Restore, the Synchronization functionality is readily available in the product and can be accessed using the GUI in SQL Server Management Studio or by using an XMLA command like the following:

<Synchronize xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <Source>
    <ConnectionString>Provider=MSOLAP;Data Source=MyServer;Integrated Security=SSPI</ConnectionString>
    <Object>
      <DatabaseID>Adventure Works DW 2012 - EE</DatabaseID>
    </Object>
  </Source>
  <Locations />
  <SynchronizeSecurity>SkipMembership</SynchronizeSecurity>
  <ApplyCompression>true</ApplyCompression>
</Synchronize>

Using Synchronization for the purpose of creating a "backup" from a production system requires a destination server that is at least at the same service pack level as the source production server (ideally one would want the identical build number). Using Synchronization also requires that the server being used as the destination have storage capacity equivalent to the source (in this case production) server. The only server available as a possible destination server was the development server that was being used to make, test, and then push modifications to the design of the database on the production server. For some strange reason, the team doing the database development/modification work had some pretty strong reservations about overwriting the work they were doing in the development environment.

Those were the options that were considered and, unfortunately, for one reason or other, none were acceptable. That meant that it was time to start getting creative. I knew that Carl Rabeler had done some work with copying databases for a scale-out solution, but that was using SAN snapshots. I was also very aware of a TechNet article by Denny Lee and Nicholas Dritsas (http://technet.microsoft.com/library/Cc966449) related to a scale-out solution using a SQL Server Integration Services (SSIS) package with Robocopy to copy metadata and data files for multiple databases from a processing server to the data folders of a group of query servers. Armed with an idea and some information (I know, it's a dangerous combination) related to using a single instance of the multi-threaded version of Robocopy, it seemed like the beginnings of a pretty tantalizing solution. Rather than copy the entire data directory, all that was really necessary was to copy the data for a single database.
The initial plan was to detach the database, use Robocopy to move the data to a "safe" storage location, and then re-attach the database. Sounded simple enough, except for a slight complicating factor: the database was nearly three terabytes in size and the data were distributed across eight LUNs. Detaching and re-attaching an Analysis Services database is a relatively trivial matter that can be easily accomplished from SSMS, but since this was a job that should be scheduled to run on a weekend, there was a strong desire to automate the process as much as possible. Building on the prior use of SSIS with Robocopy by Denny Lee and Nicholas Dritsas, it was decided to use an SSIS package to contain and automate the entire process. This had several advantages:

1. This would allow the database to be detached and then, on success of that operation, the copy operations to begin.
2. Since the data were distributed across eight drives, it would be possible to execute eight instances of Robocopy in parallel (one instance for each drive containing data).
3. Since only the data on each drive was required, it wasn't necessary to copy the contents of the entire drive, which allowed copying a single directory and the subdirectories it contained.
4. Since there were eight LUNs from which data were being copied, it made sense to copy data to eight separate LUNs on a storage server to avoid significant disk I/O contention on the target server.
5. The "on completion" precedence constraints on the Robocopy tasks could be combined with an AND condition so that the database would be re-attached only after all of the data had been copied to a storage location.

A very simple command line utility that could be used to detach or attach a database was really all that was required. Since there wasn't such a utility readily available, it was time to put on the developer hat and start slinging a little bit of code.
That effort resulted in the following application code:

using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using Microsoft.AnalysisServices;

namespace DropAdd
{
    class Program
    {
        static int Main(string[] args)
        {
            int returnval = 0;
            switch (args.Count().ToString())
            {
                case "2":
                    {
                        string servername = args[0].ToString().ToUpper().Trim();
                        string databasename = args[1].ToString().ToUpper().Trim();
                        ServerApp DetachIt = new ServerApp();
                        returnval = DetachIt.Detach(servername, databasename);
                        break;
                    }
                case "3":
                    {
                        string servername = args[0].ToString().ToUpper().Trim();
                        string filepathname = args[1].ToString().ToUpper().Trim();
                        string databasename = args[2].ToString().ToUpper().Trim();
                        ServerApp AttachIt = new ServerApp();
                        returnval = AttachIt.Attach(servername, filepathname, databasename);
                        break;
                    }
                default:
                    {
                        Console.WriteLine("Incorrect number of parameters");
                        Console.WriteLine("dropadd server_name database_name");
                        Console.WriteLine("dropadd server_name file_path database_name");
                        Console.ReadLine();
                        returnval = 0;
                        break;
                    }
            }
            return returnval;
        }
    }

    class ServerApp
    {
        public int Attach(string ServerName, string FilePathName, string DatabaseName)
        {
            Server asServer = new Server();
            int outcome = 0;
            asServer.Connect(ServerName.ToString().Trim());
            try
            {
                Database AsDB = asServer.Databases.FindByName(DatabaseName.ToString().Trim());
                if (AsDB != null)
                {
                    outcome = 0;
                }
                else
                {
                    asServer.Attach(FilePathName.ToString().Trim());
                    outcome = 1;
                }
            }
            catch (Exception goof)
            {
                outcome = 0;
            }
            finally
            {
                asServer.Disconnect();
                asServer.Dispose();
            }
            return outcome;
        }

        public int Detach(string ServerName, string DatabaseName)
        {
            Server asServer = new Server();
            int outcome = 0;
            asServer.Connect(ServerName.ToString().Trim());
            try
            {
                Database AsDB = asServer.Databases.FindByName(DatabaseName.ToString().Trim());
                if (AsDB != null)
                {
                    AsDB.Detach();
                    outcome = 1;
                }
            }
            catch (Exception goof)
            {
                outcome = 0;
            }
            finally
            {
                asServer.Disconnect();
                asServer.Dispose();
            }
            return outcome;
        }
    }
}

Using that code, all that was necessary to detach a database was to execute the DropAdd command line utility, passing the Server Name
and Database Name as parameters. When it became necessary to attach a database, it was just a matter of executing the DropAdd command line utility, passing the Server Name, the file path to the database, and the Database Name as parameters.

Having addressed both detaching and re-attaching the database, it was necessary to consider how to best use Robocopy to move the data from the production server to a storage location. A small scale test using Robocopy with the default threading option of 8 worked reasonably well. But since the design of the database distributed data across eight LUNs, it would be necessary to execute Robocopy once for each LUN on which data were stored. Running eight instances of Robocopy in serial would be a bit time consuming, and quite honestly it was suspected that doing so would run well past the 35 hour window for backup creation. An associated problem was determining when the last instance of Robocopy had completed execution. That led to a decision to execute eight instances of Robocopy in parallel.

The result was the design of an SSIS package looking something like the following. The SSIS package simply consisted of a set of 10 Execute Process tasks, with the following components:

Detach Database - Detach the database from the server
Robocopy Data Dir - Copy the data from the database directory
Robocopy G - Copy data from the Data Directory on the G drive
Robocopy H - Copy data from the Data Directory on the H drive
Robocopy I - Copy data from the Data Directory on the I drive
Robocopy J - Copy data from the Data Directory on the J drive
Robocopy K - Copy data from the Data Directory on the K drive
Robocopy L - Copy data from the Data Directory on the L drive
Robocopy M - Copy data from the Data Directory on the M drive
Attach Database - Attach the database on completion of the copies

For the "Detach Database" task, the following properties were set on the Process tab:

RequiredFullFileName: True
Executable: C:\CustomApps\DroppAdd.exe
Arguments: MyServer "My Big Database"
FailTaskIfReturnCodeIsNotSuccessValue: True
SuccessValue: 1
TimeOut: 0
WindowStyle: Hidden
Precedence Constraint: Success

In order to ensure that the data could be copied to a "safe" storage location, it is absolutely essential that the database be detached from the server in order to prevent write operations from occurring, which could result in files on the destination storage location becoming corrupt.
For the "Robocopy Data Dir" task, the following properties were set on the Process tab:

RequiredFullFileName: True
Executable: C:\Windows\System32\Robocopy.exe
Arguments: "C:\Program Files\Microsoft SQL Server\MSAS10_50.MSSQLSERVER\OLAP\Data\My Big Database.17.db" "\\StorageServer\e$\Main" /S /PURGE
FailTaskIfReturnCodeIsNotSuccessValue: False
SuccessValue: 1
TimeOut: 0
WindowStyle: Hidden
Precedence Constraint: Completion

For the "Robocopy G" task, the following properties were set on the Process tab:

RequiredFullFileName: True
Executable: C:\Windows\System32\Robocopy.exe
Arguments: "G:\Program Files\Microsoft SQL Server\OLAP\Data" "\\StorageServer\G$\G_Drive" /S /PURGE
FailTaskIfReturnCodeIsNotSuccessValue: False
SuccessValue: 1
TimeOut: 0
WindowStyle: Hidden
Precedence Constraint: Completion

For the "Robocopy H" task, the following properties were set on the Process tab:

RequiredFullFileName: True
Executable: C:\Windows\System32\Robocopy.exe
Arguments: "H:\Program Files\Microsoft SQL Server\OLAP\Data" "\\StorageServer\H$\H_Drive" /S /PURGE
FailTaskIfReturnCodeIsNotSuccessValue: False
SuccessValue: 1
TimeOut: 0
WindowStyle: Hidden
Precedence Constraint: Completion

For the "Robocopy I" task, the following properties were set on the Process tab:

RequiredFullFileName: True
Executable: C:\Windows\System32\Robocopy.exe
Arguments: "I:\Program Files\Microsoft SQL Server\OLAP\Data" "\\StorageServer\I$\I_Drive" /S /PURGE
FailTaskIfReturnCodeIsNotSuccessValue: False
SuccessValue: 1
TimeOut: 0
WindowStyle: Hidden
Precedence Constraint: Completion

For the "Robocopy J" task, the following properties were set on the Process tab:

RequiredFullFileName: True
Executable: C:\Windows\System32\Robocopy.exe
Arguments: "J:\Program Files\Microsoft SQL Server\OLAP\Data" "\\StorageServer\J$\J_Drive" /S /PURGE
FailTaskIfReturnCodeIsNotSuccessValue: False
SuccessValue: 1
TimeOut: 0
WindowStyle: Hidden
Precedence Constraint: Completion

For the "Robocopy K" task, the following properties were set on the Process tab:

RequiredFullFileName: True
Executable: C:\Windows\System32\Robocopy.exe
Arguments: "K:\Program Files\Microsoft SQL Server\OLAP\Data" "\\StorageServer\K$\K_Drive" /S /PURGE
FailTaskIfReturnCodeIsNotSuccessValue: False
SuccessValue: 1
TimeOut: 0
WindowStyle: Hidden
Precedence Constraint: Completion

For the "Robocopy L" task, the following properties were set on the Process tab:

RequiredFullFileName: True
Executable: C:\Windows\System32\Robocopy.exe
Arguments: "L:\Program Files\Microsoft SQL Server\OLAP\Data" "\\StorageServer\L$\L_Drive" /S /PURGE
FailTaskIfReturnCodeIsNotSuccessValue: False
SuccessValue: 1
TimeOut: 0
WindowStyle: Hidden
Precedence Constraint: Completion

For the "Robocopy M" task, the following properties were set on the Process tab:

RequiredFullFileName: True
Executable: C:\Windows\System32\Robocopy.exe
Arguments: "M:\Program Files\Microsoft SQL Server\OLAP\Data" "\\StorageServer\M$\M_Drive" /S /PURGE
FailTaskIfReturnCodeIsNotSuccessValue: False
SuccessValue: 1
TimeOut: 0
WindowStyle: Hidden
Precedence Constraint: Completion

For the "Attach Database" task, the following properties were set on the Process tab:
RequireFullFileName                      True
Executable                               C:\CustomApps\DroppAdd.exe
Arguments                                MyServer "C:\Program Files\Microsoft SQL Server\MSAS10_50.MSSQLSERVER\OLAP\Data\My Big Database.17.db" "My Big Database"
FailTaskIfReturnCodeIsNotSuccessValue    True
SuccessValue                             1
TimeOut                                  0
WindowStyle                              Hidden
Precedence Constraint                    None

This all assumed that the account executing the job had Administrator permissions for the Analysis Services service, as well as sufficient permissions to access files on each of the drives that contained data. It also required that the account have sufficient permissions to write to the destination drives used to store the files of what would become an uncompressed backup.

It seemed prudent to compare performance with the built-in backup functionality, so, being an intrepid soul, I decided to test it out with a version of the Adventure Works database that had been modified to distribute partitions relatively evenly across eight logical drives. The native Backup functionality required 55 seconds to create the backup on an 8-core machine with 24 GB of RAM. Feeling very confident in the newly minted solution, I was a bit disappointed to find that it took right at 53 seconds to create the backup. However, since Robocopy can be used to copy only changed files, it was decided to process three or four partitions and then run the comparison test again. This time, Backup again required 55 seconds but the SSIS solution completed in 11 seconds. That was a good indication that, even though a full "backup" of the multi-terabyte database might not be achieved on the first execution, there was an extremely good chance that we would have a complete copy following a second execution of the SSIS package.

That meant it was time for the acid test, to see how well this solution would perform in the production environment. When the backup window opened, the SSIS package was executed. Approaching the 35-hour mark, the SSIS package had not yet completed execution, so it was decided to terminate the package and run the "Attach Database" task to re-attach the database. Somewhat disappointing, but it was encouraging to find that the formerly empty E drive now contained approximately 2.5 terabytes of data, so it was not a total failure. On that basis, it was decided to leave the solution in place and allow it to run during the next "backup" window. When the next backup window opened, the SSIS package began executing, and it was extremely encouraging to find that it completed in seven hours. Checking the E drive, it now contained nearly three terabytes of data. The first thought was "SUCCESS" and now it's time for a nice cold beer. Of course the second thought was something to the effect of "OK, what happens when one of the disks goes belly up or one of the developers does a full process on one of the dimensions?" Followed by "we have a backup solution and an uncompressed backup but no way to restore it." Time to go back to work to build another SSIS package that could be used to restore the database.

But since we had a "backup" solution, the restore would be simple; it was just a matter of reverse engineering the "backup" solution. This task would be simpler still, since we would be able to recycle the logic and some of the bits used to create the "backup" solution. It was known that the DropAdd code could be re-used to detach and attach the database. It was also a relatively trivial matter to simply change the order of the parameters passed to the tasks that executed Robocopy.
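As an aside, the "run the copies in parallel, then wait for all of them" pattern that the SSIS package implements with Execute Process tasks can also be sketched in a few lines of PowerShell. The paths below simply mirror the Arguments values shown earlier, and swapping source and destination gives the restore direction; this is an illustrative sketch, not the package that was actually deployed:

# Launch one Robocopy per data drive in parallel (the main data directory copy
# would be started the same way), then wait for every copy to finish before
# re-attaching the database.
$procs = foreach ($d in 'G','H','I','J','K','L','M') {
    $argLine = '"{0}:\Program Files\Microsoft SQL Server\OLAP\Data" "\\StorageServer\{0}$\{0}_Drive" /S /PURGE' -f $d
    Start-Process -FilePath 'C:\Windows\System32\Robocopy.exe' -ArgumentList $argLine -WindowStyle Hidden -PassThru
}
$procs | Wait-Process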
Designing a process to restore the database presented a new challenge, in the form of "What happens in the case of a total system failure, when it becomes necessary to restore the database to a new but identically configured server?" That would require creating the directory that would contain the database. The result was an SSIS package similar to what you see below.

The "Restore" SSIS package consisted of 1 File Connection Manager, 1 File System Task and 10 Execute Process tasks, with the following components:

MyFileConnection              File Connection Manager
Create Database Directory     Create the database directory if it does not exist
Detach Database               Detach the database from the server
Restore Main Data Directory   Copy data from backup to the database directory
Restore drive G Data          Copy data from backup to the G drive
Restore drive H Data          Copy data from backup to the H drive
Restore drive I Data          Copy data from backup to the I drive
Restore drive J Data          Copy data from backup to the J drive
Restore drive K Data          Copy data from backup to the K drive
Restore drive L Data          Copy data from backup to the L drive
Restore drive M Data          Copy data from backup to the M drive
Attach Database               Re-attach the database on completion of the copies

For the "MyFileConnection" File Connection Manager, the following properties were set:

Property                                 Value
UsageType                                Create Folder
Folder                                   C:\Program Files\Microsoft SQL Server\MSAS10_50.MSSQLSERVER\OLAP\Data\My Big Database.17.db

For the "Create Database Directory" File System Task, the following properties were set:

Property                                 Value
UseDirectoryIfExists                     True
Name                                     Create Database Directory
Description                              File System Task
Operation                                Create Directory
IsSourcePathVariable                     False
SourceConnection                         MyFileConnection
Precedence Constraint                    Success

For the "Detach Database" task, the following properties were set on the Process tab:

Property                                 Value
RequireFullFileName                      True
Executable                               C:\CustomApps\DroppAdd.exe
Arguments                                MyServer "My Big Database"
FailTaskIfReturnCodeIsNotSuccessValue    False
SuccessValue                             1
TimeOut                                  0
WindowStyle                              Hidden
Precedence Constraint                    Success

In this case, it wasn't really desirable to require success of the detach operation, given that one possible scenario was that the database had never existed on the server and was being restored to a "clean" environment.
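For reference, the directory-creation step performed by the File Connection Manager and File System Task above can be expressed as a one-liner; a minimal sketch, assuming the same database folder used in this article (-Force makes it a no-op when the folder already exists):

$dbFolder = 'C:\Program Files\Microsoft SQL Server\MSAS10_50.MSSQLSERVER\OLAP\Data\My Big Database.17.db'
New-Item -Path $dbFolder -ItemType Directory -Force | Out-Null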
For the "Restore Main Data Directory" task, the following properties were set on the Process tab:

Property                                 Value
RequireFullFileName                      True
Executable                               C:\Windows\System32\Robocopy.exe
Arguments                                "\\StorageServer\e$\Main" "C:\Program Files\Microsoft SQL Server\MSAS10_50.MSSQLSERVER\OLAP\Data\My Big Database.17.db" /S /PURGE
FailTaskIfReturnCodeIsNotSuccessValue    False
SuccessValue                             1
TimeOut                                  0
WindowStyle                              Hidden
Precedence Constraint                    Completion

The "Restore drive G Data" through "Restore drive M Data" tasks were configured with exactly the same property values shown above for "Restore Main Data Directory"; only the Arguments property changed, copying each backup share back to its own drive:

Restore drive G Data    "\\StorageServer\G$\G_Drive" "G:\Program Files\Microsoft SQL Server\OLAP\Data" /S /PURGE
Restore drive H Data    "\\StorageServer\H$\H_Drive" "H:\Program Files\Microsoft SQL Server\OLAP\Data" /S /PURGE
Restore drive I Data    "\\StorageServer\I$\I_Drive" "I:\Program Files\Microsoft SQL Server\OLAP\Data" /S /PURGE
Restore drive J Data    "\\StorageServer\J$\J_Drive" "J:\Program Files\Microsoft SQL Server\OLAP\Data" /S /PURGE
Restore drive K Data    "\\StorageServer\K$\K_Drive" "K:\Program Files\Microsoft SQL Server\OLAP\Data" /S /PURGE
Restore drive L Data    "\\StorageServer\L$\L_Drive" "L:\Program Files\Microsoft SQL Server\OLAP\Data" /S /PURGE
Restore drive M Data    "\\StorageServer\M$\M_Drive" "M:\Program Files\Microsoft SQL Server\OLAP\Data" /S /PURGE

For the "Attach Database" task, the following properties were set on the Process tab:
Property                                 Value
RequireFullFileName                      True
Executable                               C:\CustomApps\DroppAdd.exe
Arguments                                MyServer "C:\Program Files\Microsoft SQL Server\MSAS10_50.MSSQLSERVER\OLAP\Data\My Big Database.17.db" "My Big Database"
FailTaskIfReturnCodeIsNotSuccessValue    True
SuccessValue                             1
TimeOut                                  0
WindowStyle                              Hidden
Precedence Constraint                    None

For due diligence, it was decided to test the "Restore/Disaster Recovery" package using the same version of Adventure Works that was used for initial testing of the "Backup" package. It was not entirely surprising that copying and attaching the database in that scenario took about as long as restoring from a backup. To test a recovery scenario, it was decided to create a new "backup" using the backup SSIS package, execute a ProcessFull on the Customers dimension, and then run the SSIS package to restore the database. It was very encouraging to find that the database was restored to full functionality in roughly 10 seconds.

It didn't take long to receive a painful reminder of why a disaster recovery strategy is important, especially with extremely large databases. Shortly after both the "Backup" and "Restore" SSIS packages were completed, one of the developers on the team managed to accidentally execute a ProcessFull on one of the dimensions used in all of the cubes contained in the database. At this point, there was a choice to be made. Fully processing the database would require a minimum of 9 days, and quite probably longer than that. The "Restore" SSIS package had undergone limited testing, but the testing that had been done was extremely encouraging. Ultimately, the "Restore" SSIS package was run, and roughly eight (yes, 8) hours later a fully functional multi-terabyte production database was back online.

Posted by on 18 April 2014 | 5:05 pm

Every Developer Has a Story

Between April 16 and June 1, developers can enter by joining App Builder Rewards and completing the "Every Dev Has a Story" quest with 50-200 words about an application published in the Windows or Windows Phone Store, or an Azure application or developer story, that includes: What did you build and why? What inspired your technical and problem-solving genius? Which Microsoft tools and features did you use? Developers can submit up to five unique stories that will be judged on technical genius and innovation (50%), promotional tactics (20%), monetization (15%) and human interest (15%).

Prizes:
All participants receive 250,000 Ad Duplex impressions ($400 value) for submitting a story (maximum of one $400 Ad Duplex package per dev).
Grand prize winners (15) receive a Dell Venue 8 Pro ($329 value) and an additional 1 million Ad Duplex advertising impressions ($1,000 value).

For more information, check out https://build.windowsstore.com/rewards

Posted by on 18 April 2014 | 4:23 pm

How to disable OneDrive file syncing

I love OneDrive and use it every day. It’s tied to my Microsoft Account, which I use to download Windows Store apps. Data from modern apps are synchronized across all of my PCs, so I can start a task on one PC and finish it on another. OneDrive syncs my PC theme, background, WiFi passwords, IE favorites, and more. My Windows Phone automatically uploads pictures to OneDrive so I can share just a link to a photo and don’t have to send it as a large attachment. You can even do this with iOS and Android apps. In short, I can’t live without it. However, I want to use it a bit differently on my work PCs, which are bound to an Active Directory domain. Microsoft has two related products: OneDrive and OneDrive for Business. I use OneDrive for Business to sync files on my work PCs. I don’t want to see my regular OneDrive files there, because I don’t want to accidentally save a work file on my personal OneDrive. However, I still want all the other benefits of OneDrive and my Microsoft Account on my work PCs. Here are the steps I followed to disable OneDrive file storage. Only follow them if you’re absolutely comfortable with them and fully understand what you’re doing.

From the Windows 8.1 Start Screen, click the Search icon, use the Search charm, or go there directly by hitting the Windows key and Q.
Search for gpedit.msc and run it.
Expand Computer Configuration -> Administrative Templates -> Windows Components.
Select OneDrive.
On the right-hand side, double-click Prevent the usage of OneDrive for file storage.
Select the Enabled radio button.

When complete, gpedit.msc will look like this:

Some tips:
Allow time for the setting to take effect. Group policies are applied every 90 minutes, plus or minus a random amount of up to 30 minutes. To apply the policy immediately, run gpupdate from the command prompt (see the sketch at the end of this post).
It’s best to do this before adding a Microsoft Account to a PC in an Active Directory domain. This minimizes the amount of cleanup you have to do later.
This only works on Windows 8.1 Pro and Windows 8.1 Enterprise editions; only the Pro and Enterprise editions have group policies. I haven’t tested this on the corresponding Windows 7 editions. Future versions of Windows after 8.1 might behave differently.
Very important: don’t delete previously synced OneDrive files until this group policy takes effect. Otherwise, if you delete files from your PC they will also be deleted from OneDrive in the cloud. The group policy has taken effect when you no longer see OneDrive in the left pane of File Explorer. Also, open Task Manager and look for a background process called OneDrive Sync Engine Host; if you see it, the group policy has not yet been applied.
Proceed with extreme caution when deleting files after the policy has taken effect. Try putting a single file in the Recycle Bin and triple-check http://onedrive.com to make certain it is not simultaneously deleted from OneDrive. Don’t empty your Recycle Bin until you’re absolutely certain your files are still intact on http://onedrive.com. It’s easy to mess this up.
Don’t follow any instructions in this blog post if you’re not 100% comfortable with what you’re doing. If you have any questions, please try the OneDrive support forum.
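If you'd rather not wait for the next scheduled policy refresh, here is a rough sketch of forcing the update and checking for the sync engine from PowerShell. The process-description filter is an assumption on my part; confirm the exact name you see in Task Manager before relying on it:

# Apply Group Policy immediately instead of waiting up to ~120 minutes.
gpupdate /force

# If this returns nothing, the OneDrive sync engine is no longer running.
Get-Process | Where-Object { $_.Description -like '*OneDrive Sync Engine*' }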

Posted by on 18 April 2014 | 3:29 pm

Top 10 Microsoft Developer Links for Friday, April 18, 2014

Brian Harry: Creating installers with Visual Studio
Visual Studio Blog: Visual Studio Installer Projects Extension
Dr. Dobbs: Microsoft Python Tools for Visual Studio 2.1 Beta
Kevin Aguanno: Three Large Banks, Three Different Approaches to Agile Adoption
VBTeam: Visualizing Roslyn Syntax Trees
Carlos Quintero: Debugging .NET Framework now working
Windows Apps Team: //build 2014 highlights #2: enterprise, XAML, and IE/JavaScript/web apps
Sara Itani: Scalar Properties and Collection Properties and Screens! Oh My! - An Overview of Screen Property Actions
Bruno Terkaly: The DevOps Story - Why it is really about Platform as a Service
Brynte: Windows Phone 8.1 for Developers–Contracts 2017

Posted by on 18 April 2014 | 1:45 pm

PerfBytes Podcast Live from STPCon

Episode #41 PerfBytes Live! at STPCON Spring 2014
UTest is renaming itself to Applause
4:30 Talking about yours truly (thanks, guys!). Mark described Data-Driven Quality (DDQ) as "using a phenomenological understanding of the real world"
6:40 Revisiting the argument about "schools of testing"
12:50 Shift Left
14:50 News of the damned
28:00 Audience participation – What have you learned at STPCon?
33:00 John Montgomery liked the way I tweeted about the Rex Black/Cem Kaner "debate":
Tweet from Seth Eliot (@setheliot), April 15, 2014: "@mtomlins O so this is about CDT vs. ISTQB. perhaps they should say that. I'm more interested in business value thru effective engineering"
Tweet from Seth Eliot (@setheliot), April 15, 2014: "#stpcon Key1 I may not understand the issues, but they seem irrelevant. Engineers at Microsoft Amazon Facebook Twitter Netflix do not care"

Posted by on 18 April 2014 | 1:34 pm

Using keyboard shortcut F3 to save steps

As we were training new individuals on Management Reporter a few weeks ago, we were stepping through creating a total row for total sales. We were asked how we got the format code TOT to show in column C, Format Code. It's a common question and one we have to explain quite often. To get to the many fields and drop-downs in Management Reporter, you have to double-click. So this blog is all about giving you the one keyboard shortcut that Jill, one of our program managers, shared with the group. (I didn't even know about the shortcut, which is embarrassing to share, but since then I constantly use it everywhere in the product.) If you are ever on a field and want to see the options, you can simply hit F3 instead of double-clicking and the options are presented to you. Hopefully you can start using this to help discover the many options Management Reporter has for creating innovative reports, as well as save some time.

Posted by on 18 April 2014 | 1:12 pm

Inspiring female hackers in Brazil

An advocate for women in computer science programs, Professor Rosiane de Freitas mentored participants from Brazil’s Federal University of Amazonas in the 2013 Women’s Hackathon. Their successes inspired more young women to participate in the 2014 event. ...(read more)

Posted by on 18 April 2014 | 1:00 pm

AD RMS SDK 2.1 Performance and Documentation Improvements - April 2014 Update

Howdy Folks, We’ve been following up on some engineering updates and your requests to make the developer experience for Information Protection on Windows desktop even better. Our April update has performance and documentation improvements that we think you’ll appreciate. Here’s Bruce Perler, our developer docs writer, to outline the April update. Thanks, Dan

Happy Friday! This update to the AD RMS SDK 2.1 has several important improvements and some needed reorganization and clarification of the setup, testing and application promotion guidance.

What's new in the April 2014 update
File API memory usage, especially for large PFiles, has been improved significantly.
Content ID is now writable via the property IPC_LI_CONTENT_ID. For more information, see License property types.
Production manifest requirement: when your RMS-enabled application/service is run in server mode, a manifest is no longer required. For more information, see Application types.

Documentation updates
Reorganized the "How-to use" guidance to clarify the order of steps for environment setup and application testing.
Testing best practice: guidance added for using an on-premises server before testing with Azure RMS. For more information, see Enable your service application to work with cloud based RMS.

Thanks,
Bruce Perler
Sr. Programming Writer

Posted by on 18 April 2014 | 12:52 pm

Pie in the Sky (April 18th, 2014)

It's pretty quiet this week, so not a lot of links. Maybe everyone is off at the beach for spring break? Here are a few links for those of us stuck in front of computers this week.

Cloud
Azure is the only cloud service to comply with European privacy laws: Now if we can only fix US privacy laws to allow actual privacy.
How the Azure Mobile Services .NET backend works: A collection of links about the .NET backend.

Client/mobile
FileSaver.js: An HTML5 saveAs() FileSaver interface.
Vector Layout Expressions: A declarative layout for your SVG.
How to test browsers on virtual machines from Modern.IE: Using the VMs provided by Modern.IE.
UX dilemma - Red button vs. Green button: Figuring out which colors to use.

.NET
Azure storage blobs as directory: Working with blobs as if they had directories.
Humanizer: Humanize your data.

Node.js
Controlling an Arduino with Node.js and Johnny-Five: What can't you program with JS?

Misc.
Getting started with Go: Maybe your next language?
Open Source is a thankless job: Pretty much.
Office Wikipedia application is on GitHub: I didn't even know there was an Office Wikipedia application.
Nginx and the heartbleed vulnerability: Basically, is your Nginx dynamically or statically linked to OpenSSL? Adjust course from there.
PowerShellJS: Use JavaScript inside PowerShell.
Please put OpenSSL out of its misery: This is more like "please give us a better security solution than our current one".
Intelligent Systems Service: More details on how Azure can fit into the Internet of Things.

Enjoy! -Larry

Posted by on 18 April 2014 | 11:50 am

Looking for some Unity?

Good news - there might be a "Day of Unity" coming to a city near you! Unity and Microsoft are running free day-long events aimed at helping you write Unity games on Windows devices. Did I mention it's free? Just register and then turn up with your laptop (Windows or - gasp - Mac) and you'll get all the help you need to port your game or write a new one from scratch. There are events all over the US in April and May, and also happenings in China and UK. Link: Day of Unity Plenty of MS people will be there.

Posted by on 18 April 2014 | 11:37 am

Listing all the IP Addresses used by VMs

Here is a neat little snippet of PowerShell:

Get-VM | ?{$_.State -eq "Running"} | Get-VMNetworkAdapter | Select VMName, IPAddresses

If you run this on a Hyper-V Server it will give you a listing of all the IP addresses that are assigned to running virtual machines. This works whether you are using DHCP or Static IP addresses – and can really help when you are trying to track down a rogue virtual machine. Cheers, Ben
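As a possible variation (not part of the original snippet), you can flatten the IPAddresses array so that each address gets its own row, which can make it easier to spot a specific rogue address:

Get-VM | Where-Object { $_.State -eq 'Running' } |
    Get-VMNetworkAdapter |
    ForEach-Object {
        # Emit one object per IP address, keeping the owning VM's name alongside it.
        foreach ($ip in $_.IPAddresses) {
            [pscustomobject]@{ VMName = $_.VMName; IPAddress = $ip }
        }
    }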

Posted by on 18 April 2014 | 11:14 am

Middle Georgia State College Holds Gaming Workshop

Yesterday I had the opportunity to go to Middle Georgia State College to give a full-day presentation. We were covering game development with Construct 2 and Unity 3D. In the middle of the presentation I had the surprise of being introduced to Sitarah Coote from the WMAZ News Channel. She did a great job covering the event and shared with us this post: http://www.13wmaz.com/story/news/local/macon/2014/04/17/middle-georgia-state-college-gaming-workshop/7841847/ You can check out the video, have a good time, and see me go complete fan boy talking about video games and game development in general. The people at MGSC were great: not only did we have a good number of attendees, but I could tell during the workshop that they were genuinely interested in the different tools and had their minds racing with new ideas and projects. It was also quite gratifying to see several high schoolers and dad-and-son pairs who were loving the entire experience, and you can see a couple of them in the video. I can definitely say that southern hospitality was in the air and I had a blast talking about one of my favorite subjects: gaming. Thanks to all the faculty and the students who gave me such a warm welcome; I hope to see you again really soon.

Posted by on 18 April 2014 | 11:07 am

Power Map April update for Office 365 now available and Preview expiration removed in Office 2013

The Power Map team is excited and proud to bring you the April update of Microsoft Power Map for Excel. This update brings new functionality to Power Map and improves some existing features. Power Map users with an active Office 365 subscription will receive this update through Office 365 Click-To-Run if they have automatic updates enabled. Before we detail the updates, the team is also happy to announce a change for customers who have not yet moved to Office 365 but still want to continue using Power Map for geospatial data analysis in Excel:

Power Map Preview Expiration Removed

Based on feedback from our customers and community, the Power Map Preview add-in will no longer expire on 5/31/2014. We will make this add-in available for all versions of Office 2013 and Excel 2013 standalone. Please note that preview features are not supported and we do not encourage use of the preview version in production. A supported version of Power Map is available as part of the Office 365 subscription today, and only supported versions of Power Map will receive feature updates moving forward. Power Map will also be added to Excel in the next version of Office for customers purchasing Office under a perpetual licensing agreement. The extended Power Map Preview add-in will be available on the Download Center in May and we will announce availability on the Power BI Blog, Office.com and TechNet. We thank you for your feedback and usage of the Power Map preview. Here’s some detail on what the April update has in store for subscription customers:

Add Sound to Power Map Videos

While Power Map recently added the ability to export your tours in the form of a video, there was no easy way to add a soundtrack to the video. Usually, you’d need to download some other software and add it yourself, which wasn’t much fun. We noticed many of our users were sharing their videos online, but like silent films, they had no accompanying soundtrack. While we love The Artist as much as the next person, we think adding sound to your video makes it more powerful as a story-telling tool and should be something you can do from right inside Power Map. In the April update, we launched a new feature that allows you to add audio to any tours you export as video from Power Map. Simply choose an audio file from your computer and we’ll add it to your Power Map video. We also offer some convenient features to make your audio file a perfect fit with your Power Map tour. For example, you can easily see how long your soundtracks are and choose one that matches the length of your video. Don’t have a soundtrack that fits perfectly? No problem, as you can let Power Map loop the soundtrack for you. Power Map can also fade the soundtrack in and out for you to provide a gentler introduction and closing. That’s all you need to do to add sound to your Power Map tours! It’s that simple and, best of all, you can do it all from within Power Map.

Geocoding Improvements

We worked with the Bing team to provide more accurate geocoding of major cities worldwide when there are no other geo-fields in your data. The improvements will be apparent if you have ever tried to geocode a city like 'Paris' without 'France' in the same row. This type of geocoding is now orders of magnitude better than it was before, thanks to some very interesting work in complex disambiguation and ranking algorithms done by the Bing team. In addition to these two features, the April update to Power Map also includes a collection of bug fixes and performance improvements.
We’re always working hard to improve Power Map, and you can help us by providing your feedback. If you have a feature suggestion, comment, or question, let us know on the Power Map forums, or through the comments section below. We’d also love to see any videos you’ve created (with and without sound), so share them using the comments section!

Tushar Dhoot
Program Manager Intern, Power Map Team
Applications Services Group

Learn more
Try Power BI
See Power BI in action
Download the Power BI App
Follow @MicrosoftBI on Twitter to learn more about how you can #ExpressUrCells!

Posted by on 18 April 2014 | 11:00 am

Walkthrough – Installation of Release Management for Visual Studio 2013 Server.

Before you plan to install Release Management Server or any of its components, please go through the Install Guide to ensure that you meet the requirements. You can download the Release Management Installation Guide from here. In this walkthrough I have already installed SQL Server 2012 RTM on the server which will be hosting my Release Management database. Permissions required for the installation (for the account which will be installing and configuring Release Management for Visual Studio 2013 Server) ...(read more)

Posted by on 18 April 2014 | 10:41 am