Archive for Computing

Week 7 – Web Test Whack-a-Mole

This week was largely spent revising some existing web performance tests and expanding their coverage of a particular business application.  At this point, I feel pretty confident in my ability to create quality component-based and data-driven web tests that can be used for both acceptance and performance testing.

I have begun studying more about Windows Communication Foundation testing.  WCF is frequently used for service-oriented applications, which seems to be the way the industry is moving at the moment.  I will continue working more on WCF testing next week.  I also had some exposure to writing SQL scripts for the purpose of data generation.

Sadly, this was the last week for my co-intern Kunal.  Though he may have left the building, his spirit lives on within the team… as does his image, pasted to the resident office mummy and mounted to the back of Lunk:

Kunal’s presence lives on, especially for Lunk!

This week was national “Crack a Pack” day for Magic the Gathering, and the office celebrated with a literal mountain of booster packs and some pretty cool swag:

A literal swag bag.

Week 6 – Loads of Load Testing

I focused on the troubleshooting & performance analysis sides of my internship this week.  I worked with another performance engineer to do some load test analysis and completed introductory work on performance monitoring.  I also spent a good deal of time shadowing my supervisor to learn some system troubleshooting and build processes.  It’s amazing how valuable good log files can be… a lesson I’ll be taking forward into my own programming projects.

Continuing my database research from last week, this week I learned about MDX, a query language for OLAP databases.  OLAP cubes are n-dimensional (as opposed to relational tables, which are two-dimensional) and are often used in analytics and business intelligence.  At a high level, an MDX query essentially slices a geometric region out of a hypercube.  This is similar to clustering techniques I learned in Data Mining last quarter, and it is really cool stuff if you geek out on information theory like I do.
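
For my own notes, here is a toy illustration of what "slicing" a cube means, using a made-up three-dimensional sales cube in Python.  The real thing is done declaratively in MDX, but the geometric idea is the same:

```python
# Toy OLAP-style cube stored as a dict keyed by (product, region, quarter).
# An MDX slicer like "WHERE [Region].[West]" fixes one dimension and
# returns the lower-dimensional sub-cube.

cube = {
    ("Widget", "West", "Q1"): 100,
    ("Widget", "East", "Q1"): 80,
    ("Gadget", "West", "Q1"): 50,
    ("Widget", "West", "Q2"): 120,
}

def slice_cube(cube, axis, value):
    """Fix one dimension at `value`, dropping that axis from the keys."""
    return {
        tuple(k for i, k in enumerate(key) if i != axis): v
        for key, v in cube.items()
        if key[axis] == value
    }

# All measures for the West region: a 2-D slice of the 3-D cube.
west = slice_cube(cube, axis=1, value="West")
```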

I also experimented with some of Visual Studio 2012’s new features, played around with the ASP.Net profiler, and made further revisions to my previous suite of web performance tests which will make them more useful in load testing.

Oh, and I went undefeated for my first four matches in the employee Magic league– not bad for an intern!

Week 5 – Databasics

Most of this week was spent doing independent research into database topics.  As Evergreen does not regularly offer database coursework, I was grateful for this opportunity.  I focused my studies on SQL concepts and ADO.NET, a C# framework for interacting with relational databases.  I ended up writing a small C# console program that uses ADO.NET to interface with a local database instance, performing basic tasks such as selecting, inserting, and deleting rows and calling stored procedures.  This was a great exercise and gave me a solid foundation for coded database interaction.  Soon, I will begin applying those concepts to writing some C# code for gathering database performance metrics.
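
The actual exercise was in C# with ADO.NET, but the connect/execute/read pattern looks much the same in any language.  Here is a rough equivalent sketched with Python's built-in sqlite3 module (the table and data are made up):

```python
import sqlite3

# Same connect / parameterized execute / read pattern as the ADO.NET
# exercise, using sqlite3 as a stand-in for SQL Server.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE interns (id INTEGER PRIMARY KEY, name TEXT)")

# Insert rows (parameterized, as with SqlCommand parameters in C#)
cur.executemany("INSERT INTO interns (name) VALUES (?)",
                [("Justin",), ("Kunal",)])
conn.commit()

# Select rows
rows = cur.execute("SELECT name FROM interns ORDER BY id").fetchall()

# Delete a row
cur.execute("DELETE FROM interns WHERE name = ?", ("Kunal",))
conn.commit()
remaining = cur.execute("SELECT COUNT(*) FROM interns").fetchone()[0]
```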

I also completed a number of other small projects.  I finished my research into Silverlight web testing and wrote up a document presenting my findings.  I also fixed a couple of issues with the suite of web tests I had written previously, and have begun writing an internal troubleshooting document for web performance tests.

I had a moment of excitement when the tests I wrote turned up what looked like a bug, but it turned out to be a known issue… so the search continues for my career-first bugfind.

Culture-wise, I picked up my cards for the employee Magic: the Gathering league and made my deck.  I will be playing approximately 15 games over the next five weeks against various coworkers; this will not only be fun, but be a great chance to network.

Week 4 – Swag Bag Fever

I spent Monday and Tuesday making a third pass at my suite of web performance tests for an internal database tool.  This time around, I wrote up a design document which detailed my proposed changes to the existing codebase and cleared it with my supervisor, to ensure that I wouldn’t overlook anything major again.

I am fairly happy with the results.  I ended up with a package of standard tests which run through various functionalities of the tool, looking for errors and unacceptable response times.  These will be executed nightly, so that if any daily development change introduces a bug, it will be caught right away rather than later (when it is much more expensive to fix).  These standard tests are mostly component driven– I created a couple dozen components and updated existing components to be dynamic and data driven, whilst preserving compatibility with legacy code.  These components each execute simple tasks such as logging in, looking up a certain piece of data, making changes to various fields, and so on… the idea is that they are modular building blocks that can be used to create further tests.
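
To give a flavor of the component idea, here is a sketch with hypothetical component names and a stubbed-out session standing in for the real tool (my actual tests are Visual Studio web tests, not Python):

```python
# Each "component" is a small step that takes a session and a row of
# test data, so the same building blocks compose into many tests.
# FakeSession records requests instead of hitting a real service.

class FakeSession:
    def __init__(self):
        self.log = []
    def get(self, path):
        self.log.append(("GET", path))
        return {"status": 200}
    def post(self, path, data):
        self.log.append(("POST", path, tuple(sorted(data.items()))))
        return {"status": 200}

def log_in(session, row):
    return session.post("/login", {"user": row["user"]})

def look_up(session, row):
    return session.get(f"/records/{row['record_id']}")

def edit_field(session, row):
    return session.post(f"/records/{row['record_id']}", {"field": row["value"]})

def run_test(components, session, row):
    """Data-driven runner: fail fast if any component errors."""
    for step in components:
        resp = step(session, row)
        assert resp["status"] == 200, f"{step.__name__} failed"

session = FakeSession()
run_test([log_in, look_up, edit_field],
         session, {"user": "intern", "record_id": "42", "value": "ok"})
```

The payoff is that a new test is just a new ordering of components plus a new data row.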

I checked in my new tests to the source control and they are now being executed nightly, although there is still a bug with one of the tests that Ernie is going to take a look at.  As you can tell, though, I am learning quite a bit about web performance test automation!

On Wednesday, I spent the afternoon shadowing Ernie as he troubleshot some technical issues.  Watching him winnow down the various possibilities was a great chance for me to get a sample of exactly what a Senior SDET does.  I also started reading a book on Service Oriented Architecture (SOA), in order to broaden my understanding of the relationships between the various software components the company uses.  This will be invaluable moving forward in my career, as so many companies either already use SOA or are moving towards it.

I ended my week researching the topic of web test creation for Microsoft Silverlight applications.  Silverlight is a web application framework with capabilities analogous to Flash or Java applets, and which uses an encoded binary body messaging protocol called msbin1. Unfortunately, official support for Silverlight is weak, and there is no built-in way to edit msbin1 messages in Visual Studio.  So in order to make a quality & dynamic web performance test, you need some kind of third-party plugin to handle decoding & encoding of msbin1 objects.

At first, I tried to write my own plugin by following a tutorial on MSDN, but it didn’t work very well.  I did some research, found a third-party solution that I am pretty happy with, and presented it to my supervisor.  The only problem with this plugin is that its documentation is very poor, so I have begun drafting a document which explains its use and capabilities.  I will also create a few sample tests to demonstrate it in action.

Friday, I came in to see a bag full of goodies sitting in my office chair.  It was chock-full of Magic cards, Kaijudo cards, deck boxes, and a D&D soda cozy.

WotC Swag

A few of the goodies from the swag bag.

On the company culture side of things, I attended the employee pre-release party for Magic 2014.  A giant booster-draft card tournament was held… I ended up winning two of my four matches, which I was relatively happy with, given how good at the game many of the employees there are.  I also met several different people from around the company and learned about what they do.  Oh, and pizza was provided… it seemed like there were around a hundred pizzas, although that could have been my imagination.

I walked away with a ton of booster packs, exclusive promotional cards, and good memories.  Being an intern does pay!

Week 3 – Lessons Learned

This week was a half-week due to the 4th of July holiday.

So, I learned some tough real-world software engineering lessons this week.  I submitted the test suite I wrote last week for a code review, and it turned out that a bunch of test components which I had assumed were unused and superfluous were actually being referenced by the tests for another software tool.  Whoops!  So some of the new tests that I wrote now duplicate the functionality of the old existing code, which is a major no-no, as twice as much code means twice as much maintenance cost.  Yeah, that was a bit embarrassing…  I will be refactoring my test suite next week to fix all of that.

Lesson learned: NEVER assume a piece of code isn’t an external dependency elsewhere.

I also began studying some more performance-related concepts.  I worked my way through the book “Advanced Web Metrics with Google Analytics” by Brian Clifton. Google Analytics is pretty much the current industry standard for website traffic analysis and provides a wealth of data which can be very useful in creating accurate load & stress tests.

Another lesson learned: never go on vacation, lest your cubicle be Justin Biebered:

Biebered!

This is what happens if you use your vacation time at WotC.

That’s all for this week.  I look forward to transitioning beyond scripted web testing and into performance testing.

Week 2 – Web Performance Fever

I spent this week cutting my teeth on test automation by revising a set of web performance acceptance tests for an internal database tool. Web performance tests are basically coded scripts that run against a web service, validating that certain functionalities are present and that response times are within the defined standard. Besides just revising and updating the existing test suite, I also created several new tests for features that were not already within the test coverage.
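
As a rough sketch of what such a test checks (my real tests are Visual Studio web performance tests; this Python toy uses a throwaway local server in place of the actual web service):

```python
import http.server
import threading
import time
import urllib.request

# A web performance test boils down to: hit an endpoint, validate the
# response, and assert the response time is within budget.

class Handler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200)
        self.end_headers()
        self.wfile.write(b"OK")
    def log_message(self, *args):  # silence per-request logging
        pass

server = http.server.HTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()
url = f"http://127.0.0.1:{server.server_port}/"

start = time.perf_counter()
body = urllib.request.urlopen(url).read()
elapsed = time.perf_counter() - start
server.shutdown()

assert body == b"OK"   # functional validation
assert elapsed < 2.0   # response-time budget (very generous here)
```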

I spent a good chunk of time defining a set of reusable components for basic things like page navigation, so that creating future web performance tests is somewhat streamlined. By the time I was done with these, I had learned quite a bit about HTTP and data extraction from web services. I now feel confident in my ability to create a set of meaningful automated tests for a web page or web service.

I also took some time to explore creating my own sample WCF (Windows Communication Foundation) web service and client application. This was just an ASPX webpage interface that let a user browse and edit an underlying SQL database. It was a pretty basic program but gave me a deeper understanding of what was going on under the hood of the web services I’ve been working on.

Other than all that, I also met with my supervisor and received a high-level overview of performance engineering. Now that I have learned basic web performance testing, I will move on to learning about creating and analyzing load, stress, stability, failover, and endurance tests– whew! It’s going to be a busy summer.

Oh, and a couple personal highlights– the office ergonomics expert switched out my old chair for a giant, awesome new chair that looks like a CEO should be sitting in it. Definite upgrade!  Also, the intern who sits next to me had been out sick all week, and I came in one day to find this:

You would think WotC, of all places, would keep a cleric on staff to remove curses.

A couple of times, I could swear I saw its hand move out of the corner of my eye…

Looking forward to diving into more testing next week!

Week 1 – Getting My Feet Wet

I began my internship this week.  I had about a week beforehand to prepare, so I crammed as much material as I could covering software testing and the C# programming language.

I attended my new hire orientation and met the team I’ll be working with.  Everyone has been very welcoming so far.  The offices are fantastic; as you would expect of a company that makes games, there is art and sculpture everywhere.  The cubicles are all decked out with decorations, and overall it feels very comfortable as far as office buildings go.  And there is a giant dragon in the lobby!

Justin and Mitzy
Me and Mitzy, the resident lobbydragon.

As far as actual work goes, my first week was mostly spent orienting myself to the company’s software and culture.  Ernie (my supervisor) supports several different teams around the office, so it seems like I’ll get exposed to a variety of projects and methodologies.  On his advice, I am working my way through a book called “Professional Application Lifecycle Management with Visual Studio 2010”.

Techniques I learned this week included initial exploration of Visual Studio’s suite of test tools.  I also wrote some man pages about various testing tools for the internal wiki; this was a good exercise that helped acquaint me with the tools and their functionality.  I attended daily standup meetings with the Software QA team and will continue to do so.

Culturewise, I attended a couple of company gaming sessions including a confidential playtest for an unreleased product.  It was a blast!

Overall, it has been a very positive experience thus far.  More to come!

Internship – Performance Engineering

Performance Engineering is an internship designed to expose the student to real-world experience as a Software Development Engineer in Test (SDET).  The student will work alongside an experienced SDET at Wizards of the Coast (a subsidiary of Hasbro) to build new performance test automation, maintain existing automation, execute performance tests and analyze test results.

This blog will chronicle my experiences as a Performance Engineering intern at Wizards of the Coast.  Having been a WotC fan for many years, this internship is in many ways a dream come true.  I will be working under the mentorship of WotC’s Ernie Nelson (Senior SDET) and Evergreen’s Sheryl Shulman (Professor of Computer Science).

Stay tuned for weekly summaries of what I’m learning, challenges I’ve encountered, and workplace experiences in general!

Project Summary: Natural Language Processing With Prolog

For my “Computing Practice and Theory” project, I chose to continue some of the work with Prolog and EBNF grammar parsing that I had begun in the “Computability” program.  Specifically, I wanted to study more about Natural Language Understanding– the process of gleaning intended meaning from natural language.

As Natural Language Processing can be a bit dry, I decided to put a fun spin on the project: I designed and implemented a small command grammar for a Scribbler II robot.  The Scribbler II is a simple educational robot with a Python-based interface library called Myro.

Scribbler with Fluke

My program, tentatively titled “ScribPro”, is a hybrid Python/Prolog application that allows control of a Scribbler via natural language.  All communication with the robot is transmitted over Bluetooth.  The Python component handles initialization, messaging to & from Prolog, code execution, and both fetching and sanitization of user input.  I was learning Python on the fly as I did this project, so the final Python component is pretty messy… but by the end of the quarter I felt that I had achieved basic competency.  I am very glad that I learned it; the ability to use Python to quickly create simple programs or prototypes has already proven invaluable.

The Prolog component handles the bulk of the parsing work.  It reads the sentence fed to it from the Python component, determines whether or not the parse succeeds, and if it does, converts the parse tree into a functor which it passes back to Python.  The validity of a parse is determined by a grammar.  Early on in the project, I tried to write a very abstract grammar using only the rules of the English language, and quickly realized that it was beyond the scope of this project.  This dilemma is one that anyone trying to do natural language parsing inevitably encounters: English is a messy language!  In order to overcome this and make a usable program within a ten-week timeframe, I chose to severely bound the words and sentential forms that my program would recognize.  Limiting the scope of my grammar to natural language units applicable to robot control made the project feasible.

While the final grammar never grew quite as complex as I would’ve liked, I am pretty happy with how the project turned out.  The robot is able to parse & execute a variety of movement-based commands — from plain English!

Not only is the robot able to handle simple sentences such as “Please go forward”, it can handle fairly complex compound sentences such as:

  • “Spin in a circle to the left two times.”
  • “Go forward four point five feet and turn left, then wait nine hundred milliseconds, turn around and beep once.”
  • “Scoot forward until you reach the wall, then move backwards two inches, turn around and beep.”
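
As a much-simplified taste of how such commands get parsed: the real grammar lives in Prolog and is far richer, but this Python toy handles one sentence shape (verb, direction, optional count and unit) and shows the general idea of mapping words to a structured result:

```python
# Toy command parser: grammar is <verb> <direction> [<number> <unit>].
# Vocabulary here is a tiny illustrative subset, not my real lexicon.

VERBS = {"go", "move", "scoot", "turn"}
DIRECTIONS = {"forward", "backward", "backwards", "left", "right"}
NUMBERS = {"one": 1, "two": 2, "four": 4, "nine": 9}
UNITS = {"feet", "inches", "degrees", "times", "milliseconds"}

def parse_command(sentence):
    """Return a functor-like tuple, or None on a failed parse."""
    words = sentence.lower().strip(".!").split()
    if len(words) < 2 or words[0] not in VERBS or words[1] not in DIRECTIONS:
        return None
    if len(words) == 2:
        return (words[0], words[1], None, None)
    if len(words) == 4 and words[2] in NUMBERS and words[3] in UNITS:
        return (words[0], words[1], NUMBERS[words[2]], words[3])
    return None

print(parse_command("Go forward four feet"))  # ('go', 'forward', 4, 'feet')
print(parse_command("Turn left"))             # ('turn', 'left', None, None)
```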

Beyond the basic text mode, there is an optional voice input mode that taps into the Microsoft Speech Engine.  It is pretty fun to vocally order the robot around!

The fact that I was using two programming languages in tandem (Prolog & Python) presented some significant initial difficulties in the project.  A good portion of my final code revolves around passing messages back and forth between the two.  The tradeoff was worth it, however– had I been forced to implement my program in a single language, it would have been quite a bit messier.  There is something to be said for using the right tool for the job.  By the end of the project, I gained a substantial amount of confidence in my ability to use multiple languages within a single project.  No doubt this will come in handy in the future.
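
For example, the Python side's handling of the functor handed back by Prolog can be sketched like this (the functor format shown is illustrative, not my exact protocol):

```python
import re

# Prolog returns a functor as text, e.g. "move(forward, 4.5)";
# Python splits it into a command name plus string arguments, which
# can then be dispatched to the appropriate robot-control function.

def parse_functor(text):
    """Split 'name(arg1, arg2, ...)' into (name, [args]), else None."""
    m = re.fullmatch(r"\s*(\w+)\s*\(([^)]*)\)\s*", text)
    if not m:
        return None
    name, argstr = m.group(1), m.group(2)
    args = [a.strip() for a in argstr.split(",")] if argstr.strip() else []
    return name, args

print(parse_functor("move(forward, 4.5)"))  # ('move', ['forward', '4.5'])
```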

Here is a diagram of the program flow and component design of my final program:

Program Flow

I now feel confident that I have a handle on the basics of natural language parsing using grammars, and believe that I could implement such a system again in a much shorter timeframe, with a more elegant end result.

If I were to revise this project, I would attempt to abstract the code-generation side to be a level higher.  Right now, the code generated from a successful parse is extremely Scribbler-specific… which I would like to rectify.  I could conceivably create my own generic set of common robotics-related functions which the parser would invoke, rather than having it call specific Myro functions.  I would then implement the ability to map these generic functions to model-specific ones via config files; essentially allowing the parser to be used for other types of robots with ease.

Overall, I had quite a bit of fun with this project, whilst still managing to feel like I challenged myself.  I look forward to further independent research into Prolog, as I feel that logic programming is a valuable yet underutilized programming paradigm.  I also look forward to further tinkering with more advanced robotics projects.

Past weekly blog updates can be found at the following links:

My final code can be found here:

  • scribpro.py – Python Main Program Executable
  • nlp.pl – Prolog Natural Language Parsing Component

(Please be forgiving of the code, as I am an undergrad and was learning by doing for most of it.  There are approximately a million things I would change if I could do it all over again.)

If anyone reading this has any questions or is working on a related project, I’d love to hear from you!

Week 9: Cleaning House (And Code)

This week, I began wrapping up my project.  Only one week remains until this thing must be worthy of demonstration in front of a class full of peers, so adding new features has to be de-emphasized in favor of reinforcing stability & adding documentation to the project.

Major New Features:

  • Added the ability to give a conditional statement.  For now, this is limited to wall sensing.  So now, “go forward until you hit a wall” is a legit command.
  • Added the ability to repeat a command a specified number of times, e.g. “Turn left ninety degrees twice” or “Beep six times.”

Quality of Life Improvements:

  • I reorganized quite a bit of the Python code and added docstrings to all the functions.
  • I added the ability to toggle voice/text input/output modes as well as debug mode.  Previously, these were command-line options or hardcoded.
  • Added punctuation filtering to user text input in order to avoid ugly crashes.  Should be a lot more user-proof now.
  • I added some more documentation to my Prolog code.
  • Stabilized the photo-taking code.  It still crashes in IDLE, but works great via console Python.
  • MANY small grammar reinforcements & bug fixes.
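
The punctuation filtering is conceptually simple; a sketch (not my exact code) looks like this:

```python
import string

# Strip punctuation from user input before it reaches the parser,
# so stray characters can't cause a crash downstream.

_TABLE = str.maketrans("", "", string.punctuation)

def sanitize(user_input):
    """Lowercase, drop punctuation, collapse whitespace."""
    return " ".join(user_input.translate(_TABLE).lower().split())

print(sanitize("  Go forward, then BEEP!!  "))  # 'go forward then beep'
```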

The last major feature I hope to add is a “help” menu, which will give the user an overview of commands and some examples of usage.

In the meantime, I am beginning to brainstorm about my in-class presentation and am also working on outlining my final project paper.

Next week I will post all of the sources for the final program along with a project summary.  In the meantime, here is a writeup of the full command grammar that the robot accepts:
