Category Archives: Development

Working with Apollo CLI

I’ve been exploring the Apollo stack for developing with GraphQL and found the documentation a bit outdated, so I decided to make some notes for myself and start collecting them here. The first thing I wanted to do was experiment with the Apollo client codegen for TypeScript, understand how the tool works, and leverage it to create a TypeScript Apollo client. I started by using this Starwars sample Apollo server so I could focus on the client-side codegen, which was quick and easy to stand up.

$ git clone https://github.com/apollographql/starwars-server.git
...
$ cd starwars-server
$ yarn && yarn start
yarn run v1.15.2
$ nodemon ./server.js --exec babel-node
[nodemon] 1.19.0
[nodemon] to restart at any time, enter `rs`
[nodemon] watching: *.*
[nodemon] starting `babel-node ./server.js`
🚀 Server ready at http://localhost:8080/graphql
🚀 Subscriptions ready at ws://localhost:8080/websocket

Next, I tested a simple GraphQL query to make sure the server was working by browsing here:

http://localhost:8080/graphql

I installed the Apollo CLI and started experimenting with codegen. Unfortunately, as of this writing the CLI documentation is outdated: it still refers to apollo-codegen, and the parameters and configuration appear to have changed. To play with the newer Apollo CLI and client-side codegen I created a new “project” folder, wanting to get some code generated without any other project dependencies or files. So, I created a folder to get started:

$ mkdir starwars-client
$ cd starwars-client

Next, I ran the Apollo CLI to download the server’s schema, with the --endpoint parameter pointing to the running instance of the starwars-server sample:

➜  starwars-client apollo client:download-schema --endpoint=http://localhost:8080/graphql
⚠️  It looks like there are 0 files associated with this Apollo Project. This may be because you don't have any files yet, or your includes/excludes fields are configured incorrectly, and Apollo can't find your files. For help configuring Apollo projects, see this guide: https://bit.ly/2ByILPj
  ✔ Loading Apollo Project
  ✔ Saving schema to schema.json
$ ls
schema.json
$

As you can see, this created a schema.json file containing details from my starwars-server. The next step is generating TypeScript code for a single GraphQL query using the downloaded schema. For good measure I’ll include a few of the issues I ran into along the way, as I didn’t find much on Google related to the various error messages.

➜  starwars-client apollo client:codegen
 ›   Error: Missing required flag:
 ›     --target TARGET  Type of code generator to use (swift | typescript | flow | scala)
 ›   See more help with --help

Ok, so I’m missing --target; that’s easy enough to add…

➜  starwars-client apollo client:codegen --target typescript
Error: No schema provider was created, because the project type was unable to be resolved from your config. Please add either a client or service config. For more information, please refer to https://bit.ly/2ByILPj
    at Object.schemaProviderFromConfig (~/.nvm/versions/node/v10.15.3/lib/node_modules/apollo/node_modules/apollo-language-server/lib/providers/schema/index.js:29:11)
    at new GraphQLProject (~/.nvm/versions/node/v10.15.3/lib/node_modules/apollo/node_modules/apollo-language-server/lib/project/base.js:31:40)
    at new GraphQLClientProject (~/.nvm/versions/node/v10.15.3/lib/node_modules/apollo/node_modules/apollo-language-server/lib/project/client.js:33:9)
    at Generate.createService (~/.nvm/versions/node/v10.15.3/lib/node_modules/apollo/lib/Command.js:114:28)
    at Generate.init (~/.nvm/versions/node/v10.15.3/lib/node_modules/apollo/lib/Command.js:37:14)
➜  starwars-client

Again, unfortunately, the bit.ly short link provided by the tool points back to the outdated apollo-codegen documentation, which is inaccurate. So I added --localSchemaFile pointing to my newly downloaded schema.json:

➜  starwars-client apollo client:codegen --localSchemaFile=schema.json --target=typescript
⚠️  It looks like there are 0 files associated with this Apollo Project. This may be because you don't have any files yet, or your includes/excludes fields are configured incorrectly, and Apollo can't find your files. For help configuring Apollo projects, see this guide: https://bit.ly/2ByILPj
  ✔ Loading Apollo Project
  ✖ Generating query files with 'typescript' target
    → No operations or fragments found to generate code for.
Error: No operations or fragments found to generate code for.
    at write (~/.nvm/versions/node/v10.15.3/lib/node_modules/apollo/lib/commands/client/codegen.js:61:39)
    at Task.task (~/.nvm/versions/node/v10.15.3/lib/node_modules/apollo/lib/commands/client/codegen.js:86:46)
➜  starwars-client

What this error is actually saying is that the tool expects to find either .graphql or .ts files that define GraphQL “operations” (i.e. queries or mutations) within my project folder, which I hadn’t created yet. It turns out there are a few options: 1) create .ts files with gql constants, or 2) create .graphql file(s) containing named queries. I started with a simple query.graphql file for testing, like this:

query {
  hero(episode: NEWHOPE) {
    name
  }
}

I then ran the command again:

➜  starwars-client apollo client:codegen --localSchemaFile=schema.json --target=typescript

…and this yielded the same error as above, because the CLI defaults to looking in ./src, although you can change this using the --includes parameter. So I created the folder, moved query.graphql into it, and re-ran the tool:

➜  starwars-client apollo client:codegen --localSchemaFile=schema.json --target=typescript
  ✔ Loading Apollo Project
  ✖ Generating query files with 'typescript' target
    → Apollo does not support anonymous operations
GraphQLError: Apollo does not support anonymous operations

Basically, this is telling me I didn’t “name” the query, so back to editing query.graphql to add the name “heros”:

query heros {
  hero(episode: NEWHOPE) {
    name
  }
}

Ok, now let’s try that again:

➜  starwars-client apollo client:codegen --localSchemaFile=schema.json --target=typescript
  ✔ Loading Apollo Project
  ✔ Generating query files with 'typescript' target - wrote 2 files

Success! I now have a few new folders and files added to my “project”:

➜  starwars-client tree
.
|____schema.json
|______generated__
| |____globalTypes.ts
|____src
| |____query.graphql
| |______generated__
| | |____heros.ts

Btw, here’s an example .ts file with a gql constant declared that would also work to generate code (note it needs the graphql-tag import):

import gql from "graphql-tag";

const heros = gql`
query heros {
  hero(episode: NEWHOPE) {
    name
  }
}`;

In the above examples I used command-line options, although the Apollo CLI also supports a config file, apollo.config.js, which in this case points to the remote schema of my starwars-server instance:

module.exports = {
    client: {
        service: {
            url: "http://localhost:8080/graphql"
        }
    }
}

Using the config file, the command line for pulling the schema changes as follows:

➜  starwars-client apollo client:download-schema --config=apollo.config.js
  ✔ Loading Apollo Project
  ✔ Saving schema to schema.json

Next, I’ll start making use of the generated code and save that for another post.

Plotting Weekly Mobile Retention from the Localytics API using R and ggplot2

Part of building mobile web apps is understanding the myriad of mobile analytics, and in part visualizing the data to shed light on trends that may otherwise be difficult to see in tabular data or even a colorful cohort table. I’ve been building a dashboard using R, RStudio, Shiny, and Shiny Dashboards, aggregating data from MSSQL, Postgres, Google Analytics, and Localytics.

Within the Product section of the dashboard I’ve included a retention chart, and I found some great articles at R-Bloggers like this one. The retention data comes from the Localytics API, which I discussed previously, though getting the data into the proper format took a few steps. Let’s start with the data; here’s what the REST response from Localytics looks like for a weekly retention cohort:

{
  "results": [
    {
      "birth_week": "2014-09-08",
      "users": 1,
      "week": "2014-12-29"
    },
    {
      "birth_week": "2014-09-29",
      "users": 1640,
      "week": "2014-12-29"
    },
    {
      "birth_week": "2014-10-06",
      "users": 2988,
      "week": "2014-12-29"
    },
    {
      "birth_week": "2014-10-13",
      "users": 4747,
      "week": "2014-12-29"
    },
    {
      "birth_week": "2014-10-20",
      "users": 2443,
      "week": "2014-12-29"
    },
    …

Below is the main function to fetch the Localytics sample data and convert it into a data frame that’s suitable for plotting. Now, admittedly I’m not an R expert, so there may well be better ways to slice this JSON response, but this is a fairly straightforward approach. Essentially, it fetches the data, converts it from JSON to an R object, extracts the weeks, preallocates a matrix, and then iterates over the data filling the matrix to build a data frame.

library(RCurl)     # getURL()
library(jsonlite)  # fromJSON()

retentionDF <- function() {
  # Example data from: http://docs.localytics.com/dev/query-api.html#query-api-example-users-by-week-and-birth_week
  localyticsExampleJSON <- getURL('https://gist.githubusercontent.com/strefethen/180efcc1ecda6a02b1351418e95d0a29/raw/1ad93c22488e48b5e62b017dc5428765c5c3ba0f/localyticsexampledata.json')
  cohort <- fromJSON(localyticsExampleJSON)
  weeks <- unique(cohort$results$week)
  numweeks <- length(weeks)
  # Take the JSON response and convert it to a retention matrix (all numeric for easy conversion to a dataframe) like so:
  # Weekly.Cohort Users Week.1
  # 1    2014-12-29  7187   4558
  # 2    2015-01-05  5066     NA
  i <- 1
  # Create a matrix big enough to hold all of the data
  m <- matrix(nrow=numweeks, ncol=numweeks + 1)
  for (week in weeks) {
    # Get data for all weeks of this cohort
    d <- cohort$results[cohort$results$birth_week==week,][,2]
    lencohort <- length(d)
    for (n in 1:lencohort) {
      # Skip the first column using "+ 1" below which will be Weekly.Cohort (date)
      m[i,n + 1] <- d[n]
    }
    i <- i + 1
  }
  # Convert matrix to a dataframe
  df <- as.data.frame(m)
  # Set values of the first column to the cohort dates
  df$V1 <- weeks
  # Set the column names accordingly
  colnames(df) <- c("Weekly.Cohort", "Users", paste0("Week.", 1:(numweeks-1)))
  return(df)
}
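Since the transform is really just a pivot, here’s how the same reshaping might look in Python using only the standard library (a rough sketch, not a line-for-line port of the R code; `retention_rows` is a hypothetical helper name and the input is the `results` list from the JSON above):

```python
def retention_rows(results):
    """Pivot Localytics rows ({birth_week, users, week}) into one row per
    cohort: [birth_week, users_in_first_week, users_in_second_week, ...]."""
    weeks = sorted({r["week"] for r in results})
    rows = []
    for birth in sorted({r["birth_week"] for r in results}):
        # Collect this cohort's counts in week order
        cohort = sorted((r for r in results if r["birth_week"] == birth),
                        key=lambda r: r["week"])
        counts = [r["users"] for r in cohort]
        # Pad ragged cohorts so every row has the same number of columns
        counts += [None] * (len(weeks) - len(counts))
        rows.append([birth] + counts)
    return rows
```

Newer cohorts simply have trailing `None` cells, mirroring the `NA` values in the R matrix.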

To make things easy I put together a gist, and if you’re using R you can run it yourself with runGist. It requires several other packages, so be sure to check the sources in case you’re missing any.
Fair warning: the Localytics API demo has very limited data, so the chart is, let’s just say, simplistic; however, given many weeks’ worth of data it will fill out nicely (see example below).

> library(shiny)
> runGist("180efcc1ecda6a02b1351418e95d0a29")
[Retention chart: Localytics retention plot example]

Could Facebook have scaled using MSSQL instead of MySQL?

The issue of scaling has been discussed recently at work and particularly in light of some of the recent details that have come out from Facebook on their data centers.

The question I want answered is: do you think Facebook could have scaled had it been built on MSSQL rather than MySQL? Did access to the source play an important role? Should it still be called MySQL, or has FB changed it sufficiently that it’s really no longer MySQL?

Discuss.

Leveraging the Facebook API on Google AppEngine with jQuery Mobile

In the past I’ve built starter kits for Facebook development in ASP.NET and built CruiseControl plugins. My latest interest is experimenting with Google AppEngine; my day job is all Python appserver stuff, so it’s a pretty logical fit. I’m slowly putting together all of the pieces I’d like to have in a website starter kit, including support for jQuery Mobile and the Facebook Graph API.

I have a simple proof-of-concept app working here. Btw, as this is a work-in-progress, YMMV; the app may or may not be in a working state, so apologies in advance.

Technologies used:

Google AppEngine

Cheetah Templates

Facebook Graph API

jQuery Mobile

Pretty printing a Python dictionary to HTML

Here’s a routine I wrote to pretty print a Python dict into an HTML table and thought I’d share.

    def prettyTable(dictionary, cssClass=''):
        ''' Pretty prints a dictionary into (possibly nested) HTML table(s). '''
        if isinstance(dictionary, str):
            return '<td>' + dictionary + '</td>'
        s = ['<table ']
        if cssClass != '':
            s.append('class="%s"' % cssClass)
        s.append('>\n')
        for key, value in dictionary.items():
            s.append('<tr>\n  <td valign="top"><strong>%s</strong></td>\n' % str(key))
            if isinstance(value, dict):
                # Nested dicts render as nested tables
                if key in ('picture', 'icon'):
                    s.append('  <td valign="top"><img src="%s"></td>\n' % prettyTable(value, cssClass))
                else:
                    s.append('  <td valign="top">%s</td>\n' % prettyTable(value, cssClass))
            elif isinstance(value, list):
                # Lists render as a single cell containing one row per item
                s.append('<td><table>')
                for i in value:
                    s.append('<tr><td valign="top">%s</td></tr>\n' % prettyTable(i, cssClass))
                s.append('</table></td>')
            else:
                # picture/icon values are URLs, so render them as images
                if key in ('picture', 'icon'):
                    s.append('  <td valign="top"><img src="%s"></td>\n' % value)
                else:
                    s.append('  <td valign="top">%s</td>\n' % value)
            s.append('</tr>\n')
        s.append('</table>')
        return '\n'.join(s)
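One caveat worth noting: the routine interpolates values straight into HTML without escaping, so a value containing `<` or `&` will break the markup. A minimal sketch of the kind of escaping that could be applied to each value first, using only the standard library (`safe` is a hypothetical helper name, not part of the original routine):

```python
import html

def safe(value):
    # Escape &, <, >, and quotes before embedding a value in HTML
    return html.escape(str(value), quote=True)

# Each `value` could be passed through safe() before the %s interpolation
print(safe('Tom & "Jerry" <script>'))
```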

Setting up debugging for Google AppEngine projects in Eclipse

When I made the move from Windows to OSX and Python development, one of the things I wanted to experiment with was Google’s AppEngine. I installed the SDK and set up the plugin for Eclipse but ran into a few issues I wanted to make note of, since I think others could probably benefit from them as well. I’ll mention I’m on OSX 10.6.8 using Eclipse for Java Helios SR 2.

ASP.NET Facebook Starter Kit updated to VS.NET 2010

Recently, I received an email via my blog asking for a VS.NET 2010 version of my Facebook Starter Kit. Since I no longer work on Windows it hasn’t been a high priority so I offered instructions on how to extract the project from the .vsi file and load the project manually. Much to my (pleasant) surprise I got a follow-up email with an updated starter kit for VS.NET 2010 updated to .NET v4.

If anyone decides they want to upgrade the FDT to the latest version, I’d be happy to post an update for that as well. The starter kit app on Facebook is nearing 5,600 users! An interesting side note: I’m no longer able to edit the properties of this application on Facebook because it contains the word “facebook” in its URL, which is no longer allowed.

Enjoy!

Setting up a new SVN repository

Since I rarely set up new SVN source repositories I find myself googling for the syntax of the various commands, so I’m writing this post to myself for future reference. I’m starting a new website project at work and needed a new repository created with a few external folder references to another project. After creating a bootstrap set of directories and files from pieces of another project, I then needed to create the new repository. My folder structure looks like this:

~/work/publishing <- location of new project files/folders

cd ~/work
svn import -m "Initial revision" publishing https://svn/repos/publishing/trunk

The project has two folders which will come from an existing project that won’t be modified in this new project so I’m using svn:externals to reference those folders. I’m not sure this was the best way to do it but since I have several other people who will be using the repository I decided to check it out locally:

cd ~/work
svn co https://svn/repos/publishing pub

Next, I changed to the subfolder where the two external folders are located:

cd ~/work/pub/trunk/application

Then, using svn propset, I created the first external link, though the first time through I got it wrong. Note, the quotes are important here:

SteveT:application strefethen$ svn propset svn:externals 'model https://svn/repost/proj/trunk/application/lib' .
property 'svn:externals' set on '.'
SteveT:application strefethen$ svn up
svn: warning: Error handling externals definition for 'model':
svn: warning: OPTIONS of 'https://svn/repost/proj/trunk/application/model': 200 OK (https://svn)
At revision 24431.

The problem in this case was my mistyping “repos” as “repost”, adding a “t”. Next, I made the mistake of using propset separately for two different externals (model and lib), which yielded the following when I tried to update:

SteveT:application strefethen$ svn up
svn: warning: Error handling externals definition for 'model':
svn: warning: URL 'https://web.merchantcircle.com/repos/proj/trunk/application/model' at revision 24431 doesn't exist
At revision 24431.

At this point, I needed to use svn propedit to define both external folder references in one property, but I didn’t have an editor set to make these changes, which yielded:

SteveT:application strefethen$ svn propedit svn:externals .
svn: None of the environment variables SVN_EDITOR, VISUAL or EDITOR are set, and no 'editor-cmd' run-time configuration option was found.

I edited my .bash_profile to add an SVN_EDITOR environment variable:

mate ~/.bash_profile

Adding:

export SVN_EDITOR="mate -w"

Save and close TextMate, then source my .bash_profile to pick up the changes:

source ~/.bash_profile

Now I could finish editing the svn:externals as follows:

svn propedit svn:externals .

The lib external was already defined from my previous propset command, so I just needed to add the model external, making my svn:externals look like this:

lib https://svn/repos/proj/trunk/application/lib
model https://svn/repos/proj/trunk/application/model

Now when I update both external folders are updated:

SteveT:application strefethen$ svn up
Fetching external item into 'model'
External at revision 24431.
Fetching external item into 'lib'
External at revision 24431.
At revision 24431.

Finally, I committed my changes to the application folder for these externals. 

SteveT:application strefethen$ svn commit
Log message unchanged or not specified
(a)bort, (c)ontinue, (e)dit:
c
Sending        application
Committed revision 24432.

Not sure this is going to be helpful for anyone else but I can certainly make use of it in the future.