With the launch of HappenZap, I now have two multi-tenant mobile app platforms running on Azure App Service.  When it comes to backend services for mobile apps, though, Azure App Service really isn’t used that much.  In fact, in the 2017 Ionic Developer Survey, only 10% of respondents reported using Azure as their server-side platform (behind Heroku, Digital Ocean, and Amazon ECS).  For authentication, it ranked even lower at only 2.9% of the survey results.  For push notifications, it wasn’t even on the list.  However, I have chosen to use it for both of my platforms and have been doing so successfully.

Why did I go with Azure App Service?  When I was getting started with mobile development, I found Azure as an option pretty quickly.  Having a lot of experience with the Microsoft stack, I found that App Service was something I could get going with easily.  Maybe that was partly due to my lack of experience with mobile, but I chose to go this route and I am pretty happy with a lot of it.  Most mobile developers I have run into don’t even consider Azure App Service as an option, but I think it’s worth a look.

Let’s look at some of the different aspects.

Database

Azure App Service offers Easy Tables, and they are in fact easy.  They are awesome for prototyping because you don’t even have to define a schema (although I always do in my apps).  In fact, you can basically insert anything, and if a column doesn’t exist in your table, it will be created for you.  It also automatically creates id, createdAt, modifiedAt, version, and deleted columns for you.  It supports a soft-delete capability that you can easily turn on as well.  From a developer standpoint, it’s easy to get started with a simple API around your database tables using node.js.

From a cost perspective, this is where you want to plan.  Even the cheapest Azure SQL database costs you $5 a month.  For a service I am charging $40 a month for, having a separate database for each customer is not cost effective.  As a result, I put all of my customers in the same database, and every table is segmented by a tenant_id column.  This works, but it means you have to write a level of security into your API.  We’ll talk more about authorization in a bit, but it means you have to validate that the user making the API call has permission to query that tenant.
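To make that concrete, here is a minimal sketch of what the tenant filter can look like in an Easy Tables table script.  The getTenantIdForUser helper is my own assumption for illustration, not part of the SDK.

var azureMobileApps = require('azure-mobile-apps');

var table = azureMobileApps.table();

// Limit every read to the caller's tenant.  getTenantIdForUser is a hypothetical
// helper that looks the authenticated user up in a users table and resolves their tenant.
table.read(function (context) {
    return getTenantIdForUser(context.user.id).then(function (tenantId) {
        context.query.where({ tenant_id: tenantId });
        return context.execute();
    });
});

module.exports = table;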

Azure Web Apps / Mobile Apps

Whether you create a new “Mobile App” or “Web App”, it’s basically the same thing with a different icon in the Azure Portal.  When thinking multi-tenant, your goal is to create one of these that can serve all of your clients.  If each client has to have their own Web App, you will quickly exceed the memory capacity of your App Service plan.  There are reasons why you might consider having more than one though as you will see when you read on.

Authentication

Authentication in mobile apps with Azure App Service is easy due to the large number of SDKs available, including Cordova, Xamarin, and native iOS and Android.  App Service supports authenticating against Azure Active Directory, Facebook, Google, Microsoft, and Twitter.  If you want to authenticate against Microsoft consumer accounts or AAD, App Service is a great option because Firebase doesn’t support them.  You can authenticate against one or all of the providers too.  Logging a user in is as simple as calling azureMobileClient.login('providername').  On mobile apps, it will open the InAppBrowser and sign the user in.  This works great for interactive logins, but auto-login with the token is a bit of a different story.  I’ll post in the near future about how to do that as it is not well documented.
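As a rough example with the JavaScript / Cordova SDK (the endpoint URL is a placeholder and any supported provider name can be used):

var client = new WindowsAzure.MobileServiceClient('https://your-app.azurewebsites.net'); // placeholder URL

// Opens the provider's login page and resolves with the signed-in user.
client.login('facebook').then(function (user) {
    console.log('Logged in as ' + user.userId);
}, function (error) {
    console.error('Login failed', error);
});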

The way authentication works is that your mobile client makes a call into the App Service back end, which in effect proxies the request over to the appropriate provider.  The nice thing about this is that any subsequent API calls you make automatically pass the user’s token, and the API can respond accordingly if the user is not authenticated or their token has expired.  If you don’t need authentication on a particular type of API call, you can allow anonymous users to access it.  For example, anonymous users might get read access, but authenticated users can insert / update / delete.
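In the node.js backend, that kind of per-operation access control can be set directly on the table definition.  A minimal sketch:

var azureMobileApps = require('azure-mobile-apps');

var table = azureMobileApps.table();

// Anonymous users can read, but writes require a signed-in user.
table.read.access = 'anonymous';
table.insert.access = 'authenticated';
table.update.access = 'authenticated';
table.delete.access = 'authenticated';

module.exports = table;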

The issue you run into with authentication on app service is that all users have to request the same scope / permissions.  For example, you may want end users to have just basic profile access to Facebook, but you want administrators to be able to manage pages.  The permissions you request are set in the Azure Portal and are essentially fixed.  That means all users have to request the same permissions.  That’s no good.  One way to work around this is to call the authentication provider directly.  For example, I’ve done this with AAD to request a scope that included admin consent credentials such as Group.Read.All.  Another way to work around this is to have multiple Azure Mobile Apps configured with different permissions.  This really doesn’t scale all that well either, but could be an option for simple scenarios.  It does create a bit of overhead though since you have to push code to each one and your client side code has to know which endpoint to call.

Authorization

Azure takes care of the authentication for you, but authorization is still up to you.  There are not a lot of complex examples out there for this, so I’ll probably write something up soon.  Your API will receive the user’s context, and from it you can get their access token and username if needed.  For authorization, I simply implement a users table which has the user’s unique id, role, and tenant id.  I first make sure that the user is in that tenant.  Then I make sure the user has the right role for whatever operation I am performing.  It’s fairly simple, but it works.
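Here is a rough sketch of what that check can look like in a table script.  The lookupUser helper is hypothetical (it would read the users table described above), and the role name is just an example.

var azureMobileApps = require('azure-mobile-apps');

var table = azureMobileApps.table();
table.access = 'authenticated';

// lookupUser is a hypothetical helper that reads the users table by the caller's id
// and resolves { tenant_id, role }, or null if the user is unknown.
table.insert(function (context) {
    return lookupUser(context.user.id).then(function (user) {
        var allowed = user &&
            user.tenant_id === context.item.tenant_id &&  // caller belongs to this tenant
            user.role === 'admin';                        // caller has the required role
        if (!allowed) {
            context.res.status(403).send('Not authorized for this tenant');
            return;
        }
        return context.execute();
    });
});

module.exports = table;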

Save Cash with Caching

You pay for every bit of data egress from Azure whether that is your API or SQL.  It can really add up too as your volume grows.  Be sure and take advantage of caching wherever you can. Cache frequent database calls at the API layer.  Cache data that doesn’t change frequently on the mobile app.  Only get data when you need it.
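As one example of caching at the API layer, here is a naive in-memory cache for a custom API route.  The cache duration and the loadEventsFromDatabase helper are assumptions for illustration.

// api/events.js - a custom API with a simple in-memory cache.
var cache = { data: null, time: 0 };
var CACHE_MS = 5 * 60 * 1000; // five minutes

module.exports = {
    get: function (req, res, next) {
        if (cache.data && (Date.now() - cache.time) < CACHE_MS) {
            return res.json(cache.data); // served from memory, no database round trip
        }
        loadEventsFromDatabase()          // hypothetical query helper
            .then(function (events) {
                cache = { data: events, time: Date.now() };
                res.json(events);
            })
            .catch(next);
    }
};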

Push Notifications

Azure Mobile Apps supports push notifications with Google, Apple, Amazon, and a few others.  It’s pretty simple to set up, and there are methods built into the node.js SDK that make it easy.  However, there are a few limitations.  First, App Service doesn’t support the newer key-based model used by APNS, so that means you need a certificate (for both development and production) for every tenant.  The next issue is that you can only install one key per instance of an Azure Mobile App.  That means you would have to have a separate Azure Mobile App per tenant.  That doesn’t scale well at all.  I used this approach for a while, but I switched over to Firebase Cloud Messaging and now I can serve every tenant from a single instance.
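For reference, sending a push from the node.js backend looks roughly like the sketch below.  It is based on the SDK’s context.push helper; the payload shape is just an example.

var azureMobileApps = require('azure-mobile-apps');

var table = azureMobileApps.table();

// After a successful insert, send a template notification through the configured hub.
table.insert(function (context) {
    return context.execute().then(function (result) {
        if (context.push) {
            var payload = '{ "messageParam": "' + context.item.title + '" }';
            context.push.send(null, payload, function (error) {
                if (error) {
                    console.error('Error sending push notification: ', error);
                }
            });
        }
        return result;
    });
});

module.exports = table;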

Summary

Azure App Service is a cost effective way to run multi-tenant mobile applications.  There are factors that you have to consider, but I do think it’s a viable choice for hosting your mobile app’s backend.


If you are trying out your Ionic / Cordova apps on iOS 11 with Xcode 9, you might run into an issue where the app terminates immediately.  After examining the logs, you will see that the plugin cordova-plugin-firebase is terminating the app because it cannot find the GoogleService-Info.plist file (even though it is there).  You’ll see an error like the following:

The GOOGLE_APP_ID either in the plist file 'GoogleService-Info.plist' or the one set in the customized options is invalid. If you are using the plist file, use the iOS version of bundle identifier to download the file, and do not manually edit the GOOGLE_APP_ID. You may change your app's bundle identifier to '(null)'. Or you can download a new configuration file that matches your bundle identifier from https://console.firebase.google.com/ and replace the current one.

*** Terminating app due to uncaught exception 'com.firebase.core', reason: 'Configuration fails. It may be caused by an invalid GOOGLE_APP_ID in GoogleService-Info.plist or set in the customized options.'

If this happens to you, check your version of the plugin.  On one machine I had 0.2.4, and on the other I had 0.2.1.  It worked on 0.2.4, so removing the plugin and re-adding it fixed the issue for me on the affected machine.
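If you want to try the same fix, removing and re-adding the plugin from your project directory looks something like this (your installed version may differ):

cordova plugin remove cordova-plugin-firebase

cordova plugin add cordova-plugin-firebase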


I actually use Outlook Customer Manager (OCM) quite a bit to keep track of my leads for my product BrewZap, a custom mobile app platform for breweries.  Unfortunately, it’s not uncommon to run into an error when launching it that says “Something went wrong”. 

Screen Shot 2017-09-18 at 7.23.03 PM

The problem is that sometimes this error will even occur after you close and restart Outlook.  If that happens to you, open up Internet Explorer (yes, IE) and go to Internet Options.  Then click on Delete under Browsing History.  Check all of the boxes and then restart Outlook.

Screen Shot 2017-09-18 at 7.27.05 PM

If all goes well, Outlook Customer Manager will start again and you can use it again.

Screen Shot 2017-09-18 at 7.28.10 PM

You can host an Ionic 2 Progressive Web App (PWA) pretty easily on Azure Web Sites (App Service).  If you aren’t sure where to get started, take your Ionic 2 project and add the browser platform if you haven’t already.

ionic platform add browser

Now, you can test it locally by running against the browser platform.

ionic run browser

Running it on Azure really is just a matter of copying your files to your Azure Web Site via FTP.  You can get the username and address to connect to from your App Service properties.  Connect to it and be sure you change to the /site/wwwroot folder.  This is where the files from your app will go.  You will upload your files from the platform/browser/www/build folder.  Before you copy your files, though, I recommend you do a production build with the --prod flag.  This will make the size of your JS files considerably smaller.

ionic run browser --prod

Now copy your files to the FTP site and go to the corresponding URL in your browser.  Your app should be working there. 

There are a few MIME types that you need to configure so that the Ionic fonts and any other files get served by IIS properly.  You do this by creating a web.config file.

<?xml version="1.0"?>
 
<configuration>
  <system.webServer>
    <staticContent>
      <mimeMap fileExtension=".json" mimeType="application/json" />
      <mimeMap fileExtension=".eot" mimeType="application/vnd.ms-fontobject" />
      <mimeMap fileExtension=".ttf" mimeType="application/octet-stream" />
      <mimeMap fileExtension=".svg" mimeType="image/svg+xml" />
      <mimeMap fileExtension=".woff" mimeType="application/font-woff" />
      <mimeMap fileExtension=".woff2" mimeType="application/font-woff2" />
    </staticContent>
  </system.webServer>
</configuration>

If you are working with DeepLinker, you may consider using a path-based location strategy instead of the standard hash-based one.  This effectively removes the hash (#) symbols from all of your URLs.  However, additional configuration is required.  That’s because IIS hosting your site in Azure will give you a 404 error when you go to any of the routes you have defined.  You need to rewrite your routes to index.html for them to work.  I have found that the rewrite rules in the web.config listed below work pretty well.  If you are using query strings, you might run into issues with these rules, so you may need to do some additional configuration.

<?xml version="1.0"?>
 
<configuration>
  <system.webServer>
    <staticContent>
      <mimeMap fileExtension=".json" mimeType="application/json" />
      <mimeMap fileExtension=".eot" mimeType="application/vnd.ms-fontobject" />
      <mimeMap fileExtension=".ttf" mimeType="application/octet-stream" />
      <mimeMap fileExtension=".svg" mimeType="image/svg+xml" />
      <mimeMap fileExtension=".woff" mimeType="application/font-woff" />
      <mimeMap fileExtension=".woff2" mimeType="application/font-woff2" />
    </staticContent>
    <rewrite>
      <rules>
        <clear />
        <rule name="AngularJS Routes" stopProcessing="true">
          <match url=".*" />
          <conditions logicalGrouping="MatchAll">
            <add input="{REQUEST_FILENAME}" matchType="IsFile" negate="true" />
            <add input="{REQUEST_FILENAME}" matchType="IsDirectory" negate="true" />
            <add input="{REQUEST_URI}" pattern="^/(api)" negate="true" />
          </conditions>
          <action type="Rewrite" url="index.html" />
        </rule>
      </rules>
    </rewrite>
  </system.webServer>
</configuration>

Running your PWA in Azure works a little bit differently, but once you have it configured, it’s a good solution.  If you run into any issues, turn on diagnostic logging in Azure and watch the log streaming to see what is happening.  Be on the lookout for scripts and CSS files returning a copy of index.html instead of what they are supposed to.  You can easily verify this from the developer tools of any browser.

If you have used Azure App Service, you know how easy it is to set up authentication against providers such as Azure Active Directory and Facebook.  You can literally log a user in with a single line of code like the following:

client.login('aad').then(results => {
    // successful login
}, error => {
    // login error
});

This will give you an id_token that you can then turn into an access_token by calling the /.auth/me endpoint with a REST call.  However, that access_token won’t have access to anything, even though you configured App Service to use an app that has requested specific permissions.  CGillum from Microsoft pointed me in the right direction with his post on accessing the Azure AD Graph, but the Microsoft Graph required some tweaks.
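For reference, the /.auth/me call itself looks roughly like this from JavaScript.  This is a sketch: the app URL is a placeholder and it assumes fetch is available in your webview or browser.

var client = new WindowsAzure.MobileServiceClient('https://your-app.azurewebsites.net');

client.login('aad').then(function () {
    // The X-ZUMO-AUTH header carries the session token returned by login().
    return fetch(client.applicationUrl + '/.auth/me', {
        headers: { 'X-ZUMO-AUTH': client.currentUser.mobileServiceAuthenticationToken }
    });
}).then(function (response) {
    return response.json();
}).then(function (identities) {
    var accessToken = identities[0].access_token; // the token to use against the Graph
    console.log(accessToken);
});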

You start by going to the Azure Resource Explorer.  This assumes you have already configured your App Service app to use the particular Azure AD application you created.  Find your App Service app in the hierarchy, then open /config/authsettings and click Edit.  If you haven’t set your clientSecret yet, you can do so now (although I am not 100% sure it’s required).  However, the key parameter is to set additionalLoginParams with the following JSON array.

["response_type=code id_token",  "resource=https://graph.microsoft.com"]

This tells /.auth/me to give you the proper access_token when you call it.  You can also get a refresh token this way at the same time.  Once you have made the changes, click the PUT button to send the changes back to the service.  Yours should look something like this.

Screen Shot 2017-01-12 at 9.19.14 AM

Now, when you login again and call the /.auth/me endpoint, you’ll get additional data including an access token that works with Microsoft Graph.  If you have logged in before with this particular username and app, you will want to sign out and log back in again to make sure the permissions that you specified in your application get granted.  You may need to add the query string parameter prompt=consent on the login page to get it to prompt you for the new graph permissions.  Otherwise, you’ll get an access token that won’t work with the Microsoft Graph.

Screen Shot 2017-01-12 at 9.12.36 AM

As you can see in the screenshot above, the object returned has a lot more information in it than before.  There is nothing particularly sensitive in this screenshot either since this is just a demo tenant.

If you’re new to Ionic 2, you might have encountered some issues getting started debugging.  While you can usually debug fairly easily in the browser, when it comes to debugging on the device, there are some extra steps.

First, install the extension ionic2-vscode. This will give you the options in the debug menu to start the process. 

Ionic2VSCodeDebugMenu

However, before you start that, you need to make a change to package.json so that your breakpoints will be hit.  Change your “build” line to the following:

"build": "ionic-app-scripts build --dev",

The key there is to add the --dev parameter.  Now you can hit the debug button and wait.  It will take a while to start up, but once it is running, the app should be on your device and your breakpoints will get hit.

If you have never deployed to your Android device before, you might try running “ionic run android” at the command line first to see if you have any other issues to resolve.

Any time Microsoft releases a new feature that overlaps with an existing one, we see a flurry of fluff in the form of blog posts and even sessions on which feature to use when.  When Office 365 Groups came out, this was no exception.  What has changed?  At the Future of SharePoint event, Microsoft announced that every group in Office 365 will benefit from an associated Team Site.  Every Office 365 Group you create will get a new modern Team Site provisioned that shows a clear linkage to the Group.

image

That’s pretty cool and should help eliminate the confusion on what to use when since you no longer have to make a decision.  Microsoft has also stated that existing Office 365 Groups (as in the ones you have now) will also get a Team Site associated with them as well.  This means whatever you are doing now, it’s ok.  You’ll be in good shape when the new features are rolled out.

The updated Team Site home page provides a quick way to find the most important and relevant content on the site.  Content and news can be pinned to the front page and the Office Graph is baked right in to highlight activity relevant to you.  What’s even better is that you can even access it through the new SharePoint mobile app.

Another exciting change from the new Team Site experience is that sites will be provisioned faster.  Whereas it used to take several minutes to provision a site collection, now it should only be a matter of seconds.

Between the Office 365 Group, the new team site home page, and existing team sites, we haven’t seen quite how all of this ties together yet.  It will be interesting to see where things go.

This isn’t out yet, where should I put content right now?

Ok, so the conversation is not quite dead yet.  For now, if you need features that are only in Team Sites such as workflow or metadata, use a Team Site.  If you don’t care about Metadata and the document library in Office 365 Groups is good enough for you, then use that.  As you can see, Office 365 Groups is really starting to tie everything together.

This morning at the Future of SharePoint event, Microsoft announced the General Availability (GA) of SharePoint Server 2016.  While many customers have transitioned to the cloud with Office 365, on-premises SharePoint is still very real and alive.  As an end user, you might have looked at SharePoint Server 2016 and wondered “Why bother?”.  There really aren’t many new features that the end user is going to get excited about.  Yes, you’ll get a nice new suite bar, but most of the rest of the features the user cares about are linked to hybrid scenarios.

What you should take away from today’s event is that you are not upgrading to SharePoint Server 2016 for what you get today.  You are upgrading for what you get in the future.  Whereas previous versions of SharePoint Server were very much static and didn’t change over the three year release cycle, this version is very different.  This version lays the foundation for new features to be delivered through Feature Packs.  These Feature Packs will bring new features to your on-premises SharePoint farm without having to wait until the next big version.  Microsoft even plans on delivering a set of capabilities specific to the needs of on-premises customers.

Don’t worry, new features won’t just show up overnight like in Office 365.  Instead, as a SharePoint administrator, you’ll be able to control which features you enable in your farm.  This will give you time to plan and communicate change accordingly.  For whatever reason you run on-premises SharePoint, this should be an exciting announcement as it means you won’t get left in the cold waiting for the latest killer feature.  Does that mean every feature from Office 365 is coming down to on-premises? No.  Some features simply aren’t feasible on-premises.  That’s why the hybrid story is so important.  However, it does mean you’ll get updates on-premises faster than ever before.

Feature Packs will be delivered through the public update channel starting in 2017.  Microsoft will announce more details about the upcoming Feature Packs in the coming months.  To get the new Feature Pack, your company will have to have purchased SharePoint with Software Assurance.  For Enterprise customers, that’s probably most of you.  You’ll notice this is similar to the model that Windows 10 is using and the way it updates as well.

There is an exciting road ahead for SharePoint.  Be sure and read everything about it in case you missed any of it.

For years, enterprises have been spending huge amounts of money and time building their Intranet on top of the SharePoint platform.  Intranets take lots of planning and development even to get the most basic functionality.  Throw in heavy branding and responsive design and you’re looking at a significant investment.  Launching a new Intranet has just been too long of a process with too many technical hurdles, but things are going to improve.

SharePoint team site and mobile app

Microsoft has announced a new page publishing experience that will make a lot of publishing scenarios much simpler.  It provides an updated page authoring canvas that allows for simple branding and page layouts while still having some extensibility hooks.  Best of all, what you create here is responsive and works seamlessly with the new SharePoint app.  Out of the box, you will be able to quickly create pages without a bunch of up-front configuration.  Remember what you had to do before?  You know, create content types, page layouts, master pages, workflows, page libraries and more.  Not to mention, Microsoft has been telling you to stop customizing master pages for some time now.  You want to go back to that?

SharePoint becomes a first class citizen in Office 365 – a few years ago, you might have noticed that references to the actual term SharePoint were few and far between in Office 365.  The only real entry point to SharePoint was through the Sites link in the app launcher.  That’s changing.  The link will now say SharePoint and so will the navigation in the suite bar.  Clicking on the link will take you to the new entry point, or SharePoint Home, which pushes sites that you frequent right to the center.  It also tracks sites you are following as well as provides links to other sites.  This should make it easier to find many of the sites you need without an organization having to put a lot of thought into the information architecture.  While it won’t outright replace a well-planned information architecture, it’s a great starting point for organizations that have never bothered to really set anything like that up.

SharePoint home page with activity - 100 percent

But my Intranet *MUST* do X or we can’t use it – great!  Keep doing what you are doing and customize the whole thing the way you used to.  However, if your requirements are flexible, the first release may be just what you need.  If you are looking for a simple page authoring canvas with little ramp-up, I think you are going to like it.  This upcoming release, I think, will come close to hitting the “80%” mark where it’s good enough to get people publishing content quickly and easily.  If you have advanced needs and you find that you need something more, then you are probably going to have to go back to the conventional publishing model while you wait for new features to come online in future releases.

The Intranet, not just for huge Enterprises any more.  I have worked at a number of consulting companies and there is good money in helping clients build out elaborate Intranets.  Sure a lot of that comes down to the planning and design, but the implementation was just overly complex.  Just as Office 365 has brought features like Team Sites and Exchange into small organizations years ago, the new modern pages experience is making the Intranet broadly available to smaller organizations.  That’s pretty exciting.

SharePoint-the-mobile-and-intelligent-intranet-7-and-8

We are about to start a big Intranet project or are in the middle of one – This is a tricky place to be in, and your organization will have to make decisions about timelines.  The new SharePoint Home entry will be here soon, but the modern page publishing features are further out in 2016, and there is limited information right now.  Take a look at your requirements and see if the new modern pages experience will meet them.  If you don’t think it will, then continue implementing your new Intranet as usual and take another look at it in the future.  If you think it does meet your requirements, then maybe take a step back, see what happens, and use this as an opportunity to fully vet out your define phase.  Ultimately, it comes down to your organization’s priorities, requirements, and timelines.

The future of SharePoint is bright.  Today has taught us that Microsoft is continuing to invest in the product as a core.  If you missed any of the announcements, be sure and read through them to find out everything that’s coming.


Azure App Service makes it easy to build a back-end for a mobile application.  One of my favorite features of it is Easy Tables.  Easy Tables makes it easy to create tables in a SQL Azure database and comes with an SDK for several mobile platforms that make it easy to access it.  You can make calls to it using REST, but there are APIs for Apache Cordova, Xamarin, as well as native development.  Some of the APIs even support offline synchronization.  If you are looking to prove out an idea, Easy Tables makes it well, easy.

There is some great documentation out there for getting started with Easy Tables as well.  However, I ran into a few stumbling points in my early experiences that I thought I would share.  These instructions assume you have created an Azure App Service app.  If you don’t have one, just look for App Services in your Azure portal and click New.  In our case, we are going to be working with a node.js back-end.

Setting up the database connection

Setting up the database connection should be simple, but unfortunately it’s not, due to issues in the configuration UI.  Before you go to set up the connection, you will need to create a SQL Server to host your database if you don’t have one already.  This isn’t terribly complicated; just try to pick a region close to the one hosting your App Service app.  After your server is set up, you need to create the database to host your data.  You can use the free pricing tier to try things out if you haven’t created a free database in this particular region yet (you only get one per region).

AppServiceSQLFreePricingTier

Once you have your database set up, you are ready to configure a connection to it from Azure App Service.  Effectively, you are creating a connection string.  To set up the connection, go to your App Service app and click on the Easy Tables link under Mobile.

AppServiceEasyTableMenu

It will then initialize Easy Tables if needed and prompt you to click continue to set it up.

AppServiceEasyTablesPromptToConfigure

Here is where it will prompt you to connect to a database.

AppServiceEasyTablesStep1 

After this, it will prompt you to select your existing SQL Server and then the database.  The final step is to enter the username and password you set on your SQL Server and database.  Specify a connection string name (whatever you want) along with the credentials.  Click OK and hope for the best.  I mentioned I had problems with the UI, right?  The first time I did this, I couldn’t get it to save my password because it had a semicolon (;) in it.  Remember how I said it is building a connection string underneath?  Semicolon is a delimiter for connection strings, and the UI doesn’t handle that at all.  It just fails.  Hopefully this step works for you.  If it doesn’t, there is a way that you can manually set the connection string using Resource Explorer.  That’s more detail than I want to go into today.  If you run into it though, feel free to ping me or leave me a comment and I’ll provide the details.

Once your database connection has been set up, click on the Initialize App button.  This will destroy any files that may already exist on your App Service app, so don’t click it if there is something you need there.  Effectively this sets up folders such as /tables and /api as well as some of the basic node.js configuration.  It also creates a test table called TodoItem.

Creating new tables

There are a few ways to create new tables but not all of them have the desired results:

  • Create the appropriate files in the tables folder
  • Create the table through the Azure Portal in the Easy Tables section
  • Manually create the table in SQL Server

I would say the preferred way to create a table is by creating the appropriate files in the tables folder.  The node.js back-end will create a new SQL table for any .js file you add to the tables folder (assuming the appropriate supporting files are present).  For example, if you create a file named Event.js, you will get a SQL table called Event.  If you create a file named Contact.js, you will get a SQL table called Contact.  You get the idea.  There is a little bit more to it though.  Let’s look at the steps involved.

First, you need to be able to edit the files of your node.js App Service app.  You have several ways to do this.  I recommend setting up a local git repository and pushing files up to your App Service app.  You can configure that, along with credentials, in the Deployment source setting of your App Service app.  You can find more about setting up the node.js back-end and deployment in this article.  However, you can also just edit the files directly from Easy Tables.  If you already have a table created, click on the table and then click the Edit script button to edit the files directly in Visual Studio Online.

AppServiceEasyTableEditScript

Here you can edit the files directly and they are saved immediately.  I am going to start by creating a table to store event information named Event.  That means we need to create a file named event.js.  Here is what the starting file looks like.

var azureMobileApps = require('azure-mobile-apps');
 
var table = azureMobileApps.table();
 
module.exports = table;

According to the documentation, that is all that is required to get started with a new table.  Now, you might be wondering where the column names and types are.  Technically, they aren’t required.  You see, Easy Tables will create new columns on the fly when you make your first insert.  This is great to try things out but not really what you want to do in a production environment, so I like to specify my own columns.  You can use types such as string, date, boolean, and number.  More complex types aren’t supported by the API.  To create your columns, put them in an object and assign it to the columns property of the table.  Then be sure and set dynamicSchema to false so that it won’t create new columns on the fly.  Set these values before the module.exports line.

table.columns = {
	"title": "string",
	"start_date": "date",
	"end_date": "date",
	"description": "string",
	"image_url": "string",
	"cost": "string",
	"event_type": "string",
	"send_notification": "boolean"
};
 
table.dynamicSchema = false;

Don’t worry about creating columns for things like an id, created and modified dates, or even soft delete.  Easy Tables will create those columns for you automatically.

Before Easy Tables goes and creates your back-end table, there are a few more steps.  First, you need a corresponding .json file.  In our case it would be Event.json.  This contains some basic properties such as whether soft-delete is enabled and whether certain operations require authentication.  I found no documentation whatsoever that said this file was required.  However, in the TodoItem samples out there on GitHub, the file was always there.  Here is what it looks like.

{   "softDelete" : true,   "autoIncrement": false,   "read": {     "access": "anonymous"   },   "insert": {     "access": "anonymous"   },   "update": {     "access": "anonymous"   },   "delete": {     "access": "anonymous"   },   "undelete": {     "access": "anonymous"   }
}

If you want authentication to be required, you can change the access for the corresponding operation to “authenticated”.  However, we’ll cover that in a different post.
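For example, requiring sign-in for writes while leaving reads open would look something like this in Event.json (just a variation of the same properties shown above):

{
  "softDelete" : true,
  "autoIncrement": false,
  "read": {
    "access": "anonymous"
  },
  "insert": {
    "access": "authenticated"
  },
  "update": {
    "access": "authenticated"
  },
  "delete": {
    "access": "authenticated"
  },
  "undelete": {
    "access": "authenticated"
  }
}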

At this point, you should see an entry for your new table in the Easy Tables section of the Azure portal.  However, the underlying SQL table does not exist yet.  In fact, it doesn’t get created until you make your first API call to it.  That part took me a while to figure out.  Maybe there is a way to automate its creation, but in my experience it doesn’t happen until then.  There is actually a parameter called seed that will let you put sample data into your table, but I have never successfully gotten it to work.

Calling the API using JavaScript

In my example, I am using Apache Cordova.  It’s easy to get started with the Cordova SDK and Azure App Service.  Just add the plugin and create a client object and start querying.  Take a look at this post for more details.  If you are using straight HTML / JavaScript, the code is basically the same as well.  First, create a client object using the URL to your App Service.  You can get this from the Azure Portal.

var client = new WindowsAzure.MobileServiceClient(azureMobileClientUrl);

Now, we can call getTable with the table name to get the event table.  Selecting data from the table is easy; just call .read() to get everything.  It returns a promise so that you can work with your results and errors accordingly.

var table = client.getTable('event');
return table.read()
.then(function (events) {
	// do something with events
}, function (error) {
	// do something with errors
});

If you want to filter your data, just add a where clause.  Each item you add to the collection will be treated as an “AND” condition.  Note that it only supports simple equality comparisons though.

var table = client.getTable('event');
return table
	.where({ id: myId })
	.read()
.then(function (events) {
	// do something with events
}, function (error) {
	// do something with errors
});

You can also use .orderBy() to control the order of the data as well.  It can be used in conjunction with the where clause if desired.

var table = client.getTable('event');
return table
	.orderBy('start_date')
	.read()
.then(function (events) {
	// do something with events
}, function (error) {
	// do something with errors
});

Making any of the above calls is enough to get your table created.  You can then go to Server Explorer –> Azure –> SQL Databases and verify that the table was created.  This is a great way to look and see what data is there as well.

Have a look at the SDK reference for inserting and updating data.  You simply need to create an object with the matching column names, specify the values, and call .insert() or .update() accordingly.  Remember, you don’t need to specify values for the id or any of the other fields Easy Tables creates.  The response you get back will be the inserted or updated data.

var table = client.getTable('event');
return table.insert({
	title: 'Event Title',
	description: 'This is an event description',
	start_date: new Date(),
	end_date: new Date(),
	event_type: 'Special Event',
	send_notification: true
})
	.then(function (response) {
	}, function (error) {
	});

If you need to delete data, the one thing I ran into is that it only works with the Id field.  If you try to delete based on some other value, you’ll get an error.  If you need to delete based on some other field, you will need to create your own API.  We’ll cover that in another post.

var table = client.getTable('event');
table.del({ id: id })
.then(function () {
	// deleted
}, function (error) {
	// error
});

Mobile Apps in Azure App Service is an evolution of Azure Mobile Services.  I often find there is more documentation on that since it’s been around longer.  For example, take a look at this article on working with mobile data as it has a lot more examples.

Summary

Azure App Service Easy Tables make it super easy to get started creating a back-end for your mobile app.  Give them a try and see what you can create.  If you are looking for some samples, be sure and check out the Azure Mobile Apps repository in GitHub.

with no comments
Filed under: , ,

Office 365 Connectors for Groups provide a way to pull in information from various sources such as Twitter, GitHub, UserVoice, and Trello.  Office 365 Connectors deliver external content straight to the inbox of your Office 365 Group.  Last fall, I talked about how to use Office 365 Connectors while they were in preview.  However, now they are available for customers with tenants in First Release. 

Now, there are over 50 Office 365 Connectors available.  Many for services I have never heard of (maybe you have).  Here is the complete list (as of now):

  • Aha! – Manage work with a visual roadmap.
  • Airbrake – Captures errors and aggregates the results for developer review.
  • Alerts by MarketSpace – Company monitoring for marketing, PR, development and investment teams.
  • AppSignal – Track throughput, response times, and error rates in your apps.
  • Asana – Task and Project Management.
  • BeanStalk – A code hosting workflow
  • Bing News – Search news for your work and get personalized email updates.
  • BitBucket – Manage and collaborate on your code projects.
  • Biztera – Biztera simplifies decision-making with cloud software that streamlines approval workflows
  • BMC TrueSight Pulse (Boundary) – Monitor cloud and server infrastructure with real-time visibility.
  • Brandfolder – Gain great controls of your brand assets, all from Office 365.
  • Bugsnap – Track issues for your web and mobile apps.
  • Buildkite – Smart automation for your software development processes
  • Caller Zen – Create an SMS call center in seconds.
  • Chatra – Modern Live Chat Software.
  • CircleCI – Build, test, and deploy software continuously.
  • Clearlogin – Integrate Clearlogin with your GroupMail to receive real time notifications about user activity.
  • Cloud 66 – Build, deploy, and manage applications.
  • Codeship – Automate the workflow for development and deployment.
  • Crashlytics – Track errors in mobile apps.
  • Datadog – for Dynamics CRM
  • Delighted – The fastest way to gather actionable feedback from your customers.
  • Dynamics CRM Online – Manage your customer sales, marketing, and service relationships.
  • Enchant – Provide customer support with chat and email.
  • Envoy – The new standard for visitor registration
  • GhostInspector – Build or record automated tests for your web site.
  • GitHub – Manage and collaborate on code projects.
  • GoSquared – Simple, yet powerful analytics for your online business
  • Groove – Provide customer support with team collaboration.
  • HelpScout – Provide customer support through email messages.
  • Heroku – Build and run applications in the cloud.
  • HoneyBadger – Exception and uptime monitoring for your web apps
  • Incoming Webhook – Send data from a service to your Office 365 group in real time (this is how you can bring in your own data to Office 365 groups)
  • IQBoxy – Intelligent Expense Management with Real-Time Receipt Processing and Expense Reports
  • JIRA – Gather, organize, and assign issues detected in your software.
  • JustReply – Simple time tracking for teams.
  • Librato – Real-Time Cloud Monitoring
  • Logentries – Search and monitor log data from any environment.
  • Magnum CI – Build, test, and deploy software continuously.
  • MailChimp – Email marketing service
  • MeisterTask – An intuitive task management and collaboration tool for agile projects
  • OpsGenie – Alert and Notification solution
  • PagerDuty – Track and manage incidents, and define escalation policies.
  • Papertrail – Track and manage incidents and downtime issues.
  • Pingdom – Track uptime/downtime and performance of web sites.
  • Pivotal Tracker – Track progress in agile projects.
  • Raygun – Monitor crashes in web and mobile apps.
  • RSS – Get RSS feeds for your group.
  • Runscope – Log, monitor, and measure your API usage.
  • Salesforce – Manage sales opportunities
  • Sentry – Capture and aggregate exceptions in code.
  • Stack Overflow – Follow tagged questions and provide answers.
  • StatusPage.io – Statues page for App or Website
  • Subversion – An open-source revision control system
  • TestFairy – Test mobile apps and retrace events that precipitate errors.
  • Travis CI – Run tests and deploy apps.
  • Trello – Manage to-do lists and tasks all in one place.
  • Twitter – Receive messages called Tweets.
  • UserLike – Provide live chat support for web sites and mobile apps.
  • UserVoice – Product management and customer support tool.
  • WakaTime – Get updates about your team’s daily coding activity from GitHub, GitLab, or Bitbucket.
  • Wunderlist – Organize and share your to-do lists.
  • Yo – Communicate with others in simple fashion.
  • Zendesk – Zendesk brings companies and their customers closer together.
  • ZootRock – Curated Content in your niche to share to your Groups.

As you can see that is quite a list of Office 365 Connectors.  Are there any in that list that you will find useful?  Is there anything you wish was on the list?


In my days, I have seen a lot of RFPs, RFIs, and RFQs.  I’ve seen RFPs that are simple, complex, small, large, strict, well-executed, and not-so-well-executed.  In my experience, there are a lot of things you can do to ensure you get the best possible response.  If you are planning a Request for Proposal (RFP) for your next project, consider the following before issuing it.

1) Set realistic dates

When you issue an RFP, it takes a heap of time for a consulting firm to mobilize a team, get the right people on the call, and prepare a response.  The more complicated your RFP response requirements are, the longer it takes to prepare a response.  I’ve received RFPs that want a turnaround in less than a week.  When this happens, often one or more of the bidding firms will push back on you or simply decline to respond.  This often causes multiple extensions to the RFP deadline, which honestly doesn’t make your company look good.  When your RFP process is not organized, it makes me question whether we want to do business with your company.  If you can’t run an RFP without a lot of issues, you may not be able to successfully run a project initiative or pay your invoices on time.

Also pay attention to the dates you set.  Setting an RFP to be due the day after Christmas or during the Thanksgiving break is not cool.  You don’t want to work on the holidays.  Your respondents don’t either.

2) Don’t ask for references in your RFP response

When applying for a job, I am not going to give you a list of references before I have even talked to someone at your company.  When responding to an RFP, it’s not any different.  It’s not that we don’t want you to vet us out and know our qualifications.  You have to understand references are difficult.  You are asking for us to ask our previous clients to take time out of their day and talk to you about a project on our behalf.  Chances are you aren’t the only RFP we are responding to at a given time either.  That means that client could be receiving multiple calls.  Think about it.  If you select my firm and we successfully implement your project, are you willing to be a reference for any number of random callers?  I doubt it.  Reference calls need to be scheduled and prepared for in advance.  I can’t just have you calling them blindly.

Instead of asking for references, ask for qualifications.  If you have questions about a qualification, ask the respondent to talk about it when you down-select vendors and go to orals.  At that point, it might be acceptable to ask to set up a reference call.

3) Don’t require answers to questions that aren’t relevant

You know what makes vendors question if they even want to respond to your RFP?  Required responses to a bunch of questions that aren’t relevant.  I once had an RFP for an Office 365 implementation ask questions such as “Does your product include printed manuals?”.  My response: “We plan on documenting your implementation.  We can print it off for you if you would like.”  In the question period, I even asked about these irrelevant questions, and the company insisted they all should be completed.  Make sure you are asking the right questions.

Don’t ask too many questions either.  Keep in mind that every vendor you pull in is going to mobilize a team of people to put together a response.  This likely includes time from the account manager, a vice president or two, an architect, and maybe an offshore team.  It’s not uncommon for the team to spend several hundred hours combined on their response.  For smaller companies, that often means using resources such as the architect that are billable.  For that person, it means either billing less or working overtime.

4) Respond to questions in a timely manner

Potential vendors ask you questions to clarify their understanding of your needs.  Your answers are often critical to building their response.  If you don’t reply to questions until two days before the RFP is due, that is going to strain the respondents.  It also means you might not be getting the best response possible out of your bidders because they didn’t have adequate time to prepare it.  Try to get your responses back at least a week before the RFP is due.

5) Stop asking for fixed-bid

So the project you are working on is risky with a lot of unknowns?  Great, let’s slap a fixed-bid requirement onto the RFP.  That way if there is an issue, you can blame the consultants!

Do you not realize what happens when you do this?  You are automatically paying 20% more at the minimum.  The winning firm is going to do everything they can to lock in scope and assumptions so they don’t end up losing money on the deal.  Not to mention, that there is rarely enough detail in terms of requirements in the RFP itself to make an accurate estimation.

If for whatever reason the consultant can’t deliver on the fixed bid, eventually you are going to hit a point where you are straining the relationship with your consulting firm.  This is when talks of lawyers come in, and then neither of you wants to do business with the other ever again.

Your best bet is to fixed bid a scoping engagement to properly map out the requirements and technical design.  From there, you can get a more accurate estimate on the implementation and you will likely end up paying less.

6) Don’t include too many vendors

Keep the number of vendors down to a minimum.  Keep in mind, you are asking a lot of people to jump through a lot of hoops at every firm you contact.  You’re just creating more responses that your RFP team has to read through and rank.  They will all start to look the same after a while too.  Definitely, don’t let one more vendor in because some sales rep got wind late in the process that you were having an RFP.

7) Stop asking for active and past litigation

Are the lawyers at your company going to provide a list of all lawsuits you have ever been involved in to some random company?  Why do you expect us to?  Companies get sued all the time, especially the larger ones.  I don’t see many firms providing you this information.  If you really want the details, you can go dig them up.

8) Don’t be so picky about the response format

I think it’s ok to ask respondents to limit the length of their response to X pages.  It’s not cool to have phrases like “adding a column to this spreadsheet is grounds for being kicked out of the RFP process”.  I understand you have to review multiple RFPs and you are trying to keep things consistent, but specifying which font to use or a maximum file size for an exhibit is just silly.

9) Don’t ask for hard copies

I’m looking at you, government and health care companies.  In this day and age, asking for a hard copy to be delivered in person or by courier is just silly.  Why don’t you just have us chisel the response out on stone tablets?  Let us deliver the response electronically via e-mail or through an RFP response portal.  If you just want to print it out because you want to scribble notes on the paper, may I introduce you to the Surface Pro 4.  You can use digital ink to mark up or highlight the response as needed.  If you really do need paper copies, print them out yourself as needed.  Are you just trying to save on printing costs?  If you are, that might be a sign you shouldn’t be doing this project.

10) Don’t issue an RFP if you are just going to pick the incumbent

Again, be mindful of all of the time you are asking of people.  RFPs often require sizable teams and late hours to meet the deadline.  If you know you are going to pick the incumbent before even starting the RFP process, that is absolutely bad form.  There is a special place in hell for companies that issue an RFP and then just pick the vendor they already had.  Work out whatever issue you had with your vendor, just go with them, and skip the RFP process.  If it is procurement pushing you to issue an RFP so that you will get the “best deal”, it’s time to get a new procurement department. :)  At the very least, let the respondents know that there is an incumbent at play.

Bonus Tip – Don’t ask for names of resources

Don’t ask for the names of resources that will be staffed in your RFP response.  Do you really think we have an entire project team sitting around on the bench just waiting to be staffed on your project if we happen to win it?  Our company wouldn’t be in business long if we did.  Keep in mind you aren’t the only RFP we are responding to at a given time.  If you really thought we had resources lying around at all times, multiply that by the number of active RFPs and that’s a lot of people not bringing in revenue.  Nothing shows that your company has no idea how the consulting industry works more than asking for the names of resources in an RFP response.  I can almost assure you that whomever we list won’t be the person you get at the start of an engagement.  If you want to know the background of the people you are staffing, wait until you select a vendor and then ask for a profile.

Do you really need to do an RFP?

You have smart people at your company, but maybe you just don’t have enough of them.  That’s why you are issuing an RFP, right?  Maybe you are looking for a certain skillset you don’t currently have?  Before opening up an RFP for your next project, ask if you really should.  Your smart people should already have established relationships with a handful of vendors.  Call some of them, tell them what you are trying to accomplish, and just ask for a proposal.  Setting up an entire RFP process is long and overly complicated.  If you already have an established relationship with a few vendors, why shop it out?  If they have done good work in the past, they probably will in the future as well.  If anything, you know what you are dealing with.  Do you really want to go with a different vendor for every project just so you can ensure you get the best price?  I understand there are other reasons to issue an RFP, but you really have to ask yourself if it’s really worth it.


Office 365 Connectors are a new extensibility point for Groups.  They provide a way to pull in information from various sources such as Twitter, Bing News, and Trello.  The information they provide will be dumped right into your group’s inbox.

Office 365 Connectors for Groups are new.  Really new.  So new in fact that you have to enable them with a special query string parameter, EnableConnectorDevPreview=true.  The easiest way to enable them is to go the URL below.

https://outlook.office.com/owa/#path=/mail&EnableConnectorDevPreview=true

When you go to that URL, you’ll see a new option for Connectors under the … menu of your group.

GroupsConnectorsMenu

There, you will see a list of all connectors currently available.  This includes things like Bing News, Twitter, JIRA, RSS, Trello, GitHub, and a few others.

GroupsConnectorList

Since we are looking to stalk celebrities, we’ll make use of the Twitter and Bing News connectors.  I want to know any time the celebrities I am stalking send a tweet out.  When they do tweet something, what they say will automatically be delivered to my Group’s inbox.  If you are subscribed to the Group, the tweets will end up in your inbox as well.

Let’s add the Twitter connector.  The first thing you will need to do is add an account if you haven’t provided one yet.  Sign in with your credentials to proceed.

GroupsConnectorTwitterLogin

Now, let’s configure the celebrities we want to stalk by adding their Twitter handles (minus the @) separated by commas.  You can only enter up to 50 characters in the box, so if you have lots of people to stalk, you can just add another instance of the Twitter connector.

GroupsConnectorTwitterAdd

You can also track hashtags this way, as well as mentions and retweets.  It’s fairly versatile.  When you are done, click the Save button.  When you click it, it will give you no response and just wait there for a few seconds.  Remember, this is a developer preview.  If you click it multiple times, you will end up getting multiple connectors, so just be patient.

Now, I want to see whatever they say in the news about my celebrities so I am going to add a Bing News Connector.  This will give you a digest once a day of whatever terms you ask it to search on.  Here I have used the Bing News Connector to send me the latest on Neil Patrick Harris.

GroupsConnectorBingAdd

Once you are done adding connectors, you can see the ones you added on the Connectors page.  You can reconfigure them here as needed as well. 

GroupsConnectorsConfigured

When you return to your group inbox, there will be notifications that your connectors are active.

GroupsConnectorConfigured

Now, you just wait until your celebrities make their move (and for the connectors to fire).  When I first configured them last week, they didn’t work.  Remember, we are in a developer preview.  They started working yesterday, and the data started coming in bulk (and so did the notifications).  In my experience today, the Twitter connector usually delivers tweets within 5 minutes of when they occurred.  The Bing News connector delivers a digest whenever it feels like it.  I’m not really sure what time zone it is executing in, but I assume it’s the local one.

After your connectors have started providing data, here is what it looks like.  For example, here is the Twitter connector providing Wil Wheaton’s latest tweet.  You’ll get one entry in the inbox for each tweet.  The formatting of URLs, usernames, and hashtags could use some improvement, but all of the information is there.

GroupsConnectorTwitterResult

Here is what the results look like in Outlook 2016.

GroupsConnectorResultOutlook

It’s up to the connector to provide the formatting, so you can see the results from Bing look a bit different.

GroupsConnectorResultBing

All kidding aside, I hope you can see the power of Office 365 Connectors.  Instead of stalking celebrities, you could use this same technique to quickly deliver information about your company’s competitors right to your inbox.  There is also a developer story using webhooks.  The last time I checked, the URL it provided me didn’t work yet, but we can expect it is coming soon.  Office 365 Connectors are powerful and I am looking forward to using them more.

I hope you found this look at Office 365 Connectors useful.  Happy stalking!

Stalk me on twitter: @coreyroth.


Looking back to the SharePoint 2013 launch, drag and drop into a document library was one of the hit features.  As many of us have transitioned to Office 365, it’s a feature we expect to be there in our document libraries and OneDrive.  However, when Windows 10 was released at the end of July, the feature was noticeably missing from Microsoft’s newest browser, Edge.  Drag and drop support didn’t work in Office 365, OneDrive, or anywhere else for that matter.

With the Threshold 2 (TH2) fall update for Windows 10 last week, we now have drag and drop support.  For Windows Insiders, this support has been there in preview builds for a while but it hasn’t been talked about much.

W10TH2EdgeDragDrop

Say what you want about Edge, I have had pretty good luck with it.  Some people act like it is completely unusable, but I can get most of my day to day tasks done with it.  I’ve used it to replace Google Chrome for almost all of my tasks.  Does it still have issues and does it still need more features?  Absolutely, but it gets the job done for the most part.  If you have discounted it before, give it another try.  It’s constantly being updated.

One cool feature that came with the fall update is the ability to cast media to a device directly from Edge.  What this means is that when you visit a page with a video on it, you can select cast to device in the menu and choose any Miracast or DLNA supported device to view that content on the remote screen. 

If you’re not familiar with Miracast, it lets you wirelessly transmit what’s on your screen to a TV or monitor.  This is included in a lot of devices now, such as the Roku.  You can also pick up a Microsoft Wireless Display Adapter.  What’s nice is that you can cast a video to a TV from your Windows 10 device and still use the device while the video is showing.  With Miracast support before, you had to project your entire desktop or extend it just like you did with an external monitor.

W10TH2EdgeCastToDevice

I think it’s a useful feature.  Similar to what Chromecast has been doing for some time now.  However, Miracast is a more open standard and supported across a variety of devices.  The nice thing is that it doesn’t require any additional drivers, plugins, or software to make it work.


Today Microsoft released the IT Preview of SharePoint 2016.  We’re going to look at the install process today and point out any differences between previous versions of SharePoint.  You can find out more about SharePoint 2016 from the Office blog post.

Installation

I created a new virtual machine running on Windows 10 Hyper-V.  This virtual machine is running Windows Server 2016 Technical Preview 3.  I have promoted this server to a domain controller using Active Directory Domain Services.  I have also installed SQL Server 2014 R2.

Installation of SharePoint 2016 IT Preview looks similar to previous SharePoint installations.  When you mount the ISO, you will see a familiar splash screen.

SP2016P1Splash

Installing the prerequisites

The prerequisite installer looks similar to previous versions.  Click Next to continue.

SP2016P1Prerequisites1

Accept the license terms.

SP2016PrerequisitesLicenseTerms2

Wait and hope it finishes successfully.

SP2016P1PrerequisitesProgress3

Installation of the SharePoint 2016 IT Preview prerequisites is just as troublesome as previous versions.  I managed to generate a few errors and never did get it to agree the role was successfully installed.  I had to install the Visual Studio 2013 Redistributable myself.  There is a work-around for getting around the IIS role configuration step on the Known Issues in SharePoint Server 2016 Preview page.

SP2016P1PrerequisitesError4

SharePoint 2016 Installation

The installation is quick and easy.  It also looks similar to other versions.  Start by entering the product key.  Remember this was found on the SharePoint 2016 download page.

Sp2016P1InstallProductKey1

Next, accept some more license terms.

SP2016P1LicenseTerms2

Specify the install location for your search index.  You can just use the defaults for this preview installation.

SP2016P1InstallFileLocations3

Wait for the installation to complete.

SP2016P1InstallProgress4

When installation completes, you will be prompted to run the Configuration Wizard.

SP2016P1InstallCompleteRunWizard5

Running the Configuration Wizard

The Configuration Wizard also looks similar but has a few changes.  Click Next to proceed.

Sp2016P1ConfigStart1

Now, create a new farm.

SP2016P1ConfigNewFarm2

Then, specify your farm passphrase.

SP2016P1ConfigPassphrase

Specify your farm account, SQL Server name and database.

SP2016P1COnfigDatabase3

This new screen allows you to use the new MinRole feature.  MinRole simplifies the server architecture of SharePoint 2016.  In this case, we are going to use a Single Server Farm (which shouldn’t be as bad as previous versions).

Sp2016P1ConfigRole4

Specify the details for Central Administration.

SP2016P1ConfigCA5

Confirm your settings.

Sp2016P1ConfigConfirmation6

When it finishes, you are ready to try out Central Administration.

Sp2016P1ConfigComplete6

Now, we can start our configuration.

SP2016P1CAOptIn

That’s a quick look at the installation process of SharePoint 2016 IT Preview.  Be on the lookout for my next posts covering changes in Central Administration and the UI.

Follow me on twitter: @coreyroth
