Matthew Garvin

UX Research, Design, Strategy

IBM Developer UX Strategy

IBM Developer, previously developerWorks, is the rebranded platform IBM created for developers to explore a variety of resources, from code patterns to live-stream tutorials and articles from experts.


In Winter 2020, I was a member of infORMed design, a three-person cross-functional team of user experience consultants contracted by IBM to provide an outside perspective on an ongoing in-house study for IBM Developer. All three members of the team rotated roles per study to take advantage of our particular strengths. For example, given my background in anthropology, I led user interviews and usability testing. Another team member is a survey expert at Pew Research and thus took the lead in survey design, while our third member, a business strategist at Deloitte, took the lead on competitive analysis. Even so, all three of us carried equal responsibilities over the duration of the project.

IBM Developer is the premier online resource for code patterns, tutorials, and cloud development. It bills itself as a one-stop shop for open source code, in-depth learning tools, and support from IBM’s expert developer advocates (IBM Developer, 2020).

Rebranded in the past year from the long-running developerWorks, IBM Developer is currently fixing issues as they come in through user feedback. There is no benchmarking for information architecture, research surrounding “Getting Started” is bare-bones, and the client has expressed a need to improve the platform’s ability to draw in quality developers and build long-term brand loyalty (Hicks & White, 2020).

Our goals for this project include better understanding user needs and providing insights related to key performance indicators (KPIs), such as using IBM instructional content (tutorials, code patterns, etc.) as a bridge to drive people to IBM Cloud products; providing insights to increase engagement through an improved login and account management experience; and comparing IBM Developer’s community-related content and engagement with the legacy developerWorks community to recommend what, if anything, should be reintroduced and/or optimized (Hicks & White, 2020).

With these goals in mind, and working closely with Kevin and Kayla, our client contacts who run UX Research for the platform, we screened a variety of participants and scheduled three remote user interviews. Despite the small sample size, this study yielded rich data related to our goals. For example, we learned that login and account management are a barrier to entry for at least one participant.

We used our interviews to formulate hypotheses that we then tested via a survey, designed and deployed using Qualtrics. Our team followed the survey up with a comparative analysis of seven products selected as representative competitors. We did this as an important step in better understanding how these products are similar and where they differ. This data was also used to refine our initial recommendations in improving growth and retention on the IBM Developer platform.

We conducted heuristic evaluations using Jakob Nielsen’s ten fundamental usability heuristics (Nielsen & Mack, 1994) to evaluate information architecture through menu structure and to gain insights into how users experience “Technology Topics”.

Finally, we ran a round of usability testing (n=5). We used Calendly to recruit test participants who regarded themselves as professional developers and had some familiarity with usability testing. We then scheduled and conducted usability tests over Google Meet, recording the sessions and having participants share their screens for analysis. During each session, we asked participants to complete six tasks, prompting them to explore Cloud, Linux, Blockchain, AI & Watson, and Node.js.

We completed each usability test with a post-test questionnaire to help us develop findings, and a survey that the client designed for their research.

Methodological Overview

Interaction Map

Shortly after meeting with our client, we got to work generating an interaction map. The purpose was to produce a static representation of the system we were beginning to research. For this task, we used Figma and the plugin Arrow Auto. To browse the complete interaction map, click here. We collected screenshots focused on cataloging Technology Topics, Community, Account Creation, and IBM Cloud, then arranged and connected them with the plugin.

With the interaction map in hand, we had a reference artifact that made salient all the possible actions and the respective error and non-error states (e.g., pages, screens) those actions can lead to. We didn’t need to map the entire website, only a narrower representation of the primary workflows and navigation avenues with which we were concerned.

image of the interaction map

Semi-Structured Interviews

Target Population

IBM Developer is an expansive resource for developers, both seasoned professionals and those interested in entering the field. For our purposes, we specifically targeted professional developers already using, or interested in using, cloud computing for development.

Recruiting Methods

Our client supplied the team with the names and email addresses of 36 users who had opted in to be contacted to participate in user research for IBM Developer. We sent a screener survey to these individuals to get a brief understanding of their background in development.

After reviewing the screener results, we selected three interview participants who indicated they were available for up to 1.5 hours at a time mutually agreeable to at least two members of our team. Of the candidates who fit this profile, we selected a range of user segments:


  • Female, “gray hair”, Java Developer
  • Female, early 20’s, recent college graduate, career in tech and information
  • Male, mid-30’s, self-employed developer in custom software solutions and web development

Tools and Instruments

Each team member led a semi-structured remote interview using a protocol designed to:

  • probe the participant for values, attitudes, goals and behaviors related to software development
  • gather concrete experiences using developer platforms and resources
  • learn how their experiences varied across that spectrum.

Each interviewer was accompanied by another member of the team filling the role of the note-taker. Interviews were conducted virtually via Google Meet, and the interviews were audio recorded using our mobile devices.


The audio recordings were transcribed first by Otter, then manually edited for accuracy. Using virtual sticky notes on Miro, a digital whiteboard, we used a hybrid of thematic analysis and affinity diagramming, coding significant insights and quotes into clustered themes and sub-themes related to values, mental models, goals, behaviors, pain points, and tasks. We also noted other emergent themes, such as roles, skill level, preferred tools, and demographics.

image of our hybridized thematic affinity diagram

From this data we were able to craft three provisional, or “ad-hoc” personas and design a scenario for each based on the themes that emerged from this analysis.

Findings & Recommendations

  • IBMid is a big pain point
    • Recommend simplifying
  • Participating in a positive developer community is a value
    • Recommend increasing engagement through targeting and courting underrepresented communities
  • Searching for answers can be confusing
    • Recommend research for improved information architecture

Competitive Analysis

Seven products were selected as representative competitors and sorted into a taxonomic hierarchy of direct, indirect, partial, parallel, and analogous categories. With categories established, our team compared these products across several dimensions, analyzing the results to understand how the products are similar and where they differ, and using that data to refine our initial recommendations for improving growth and retention on the site.

Through this evaluation, our team was able to investigate in more detail the general landscape in which IBM Developer and its public cloud platform compete. We found that all direct competitors except IBM Developer surface some variation of “Getting Started” on the homepage for new users. While a lot of comparative research exists on public cloud offerings, there was little comparing the websites themselves, particularly since the launch of the new site last fall. According to our scoring of common usability issues, IBM Developer stacks up quite well against the competition.

Lastly, we compared cross-platform engagement within the cloud developer community and uncovered relevant insights about the importance of a healthy and stimulating developer community.

table outlining competitors (including Google Developer, SAP Developer, RamNode, Alibaba Cloud, and Women Who Code) across dimensions such as online storefront, cloud hosting, complex pricing, software deployment, similar service and market presence, regional targeting, and an active, engaged, supportive community
image of comparative usability score table

Findings & Recommendations

  • The overall usability score average was 12.25; IBM Developer scored 11
    • Recommend investing in innovative information architecture
  • IBM does not have a “Getting Started.”
    • Recommend prioritizing “Getting Started” from the landing page
  • A lack of engaged community support
    • Recommend bringing back some community features to re-establish engagement


Survey

For this leg of our research, we composed and deployed a survey through Qualtrics, seeking an ideal of 50 respondents and a minimum of 30 to support meaningful statistical analysis.

Target Population

IBM Developer is an expansive resource for seasoned developers and those interested in entering the field. We targeted professional developers already using or interested in using cloud computing for the storage and deployment of software.


Recruiting Methods

Our client indicated that we should first reach out to the 36 users we had previously contacted for user interviews: professional developers of varying focus areas and skill sets. IBM granted us two $50 and five $20 gift cards to raffle off as an incentive to take the survey. After the first round of respondents, the client provided us with 141 additional emails to recruit from. Due to the low response rate, the client re-sent the survey invitation to the same 141. Two days later, our client sent the invitation to more than 90 additional potential respondents.


  • About 64% of respondents identify as male, 22% female, and 14% other. n=39
  • Approximately 41% of respondents have less than 5 years of experience in professional software development; about 16% have 30+ years of experience. n=50
  • 16 of 50 respondents use IBM Developer mainly to learn about technology topics; 9 have never used IBM Developer. n=50
  • About 44% of respondents use IBM Developer once a month or less. n=39
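Breakdowns like the ones above are straightforward to reproduce from raw response data. The sketch below is a minimal example using hypothetical answers, not our actual Qualtrics export; the question wording and answer labels are placeholders.

```python
from collections import Counter

def percentage_breakdown(responses):
    """Return {answer: percent of non-blank responses}, rounded to 0.1,
    so n reflects only the people who actually answered the question."""
    answered = [r for r in responses if r]
    n = len(answered)
    return {answer: round(100 * count / n, 1)
            for answer, count in Counter(answered).items()}

# Hypothetical usage-frequency answers (not our real survey data)
usage = ["monthly or less", "weekly", "monthly or less", "never", "", "weekly"]
print(percentage_breakdown(usage))
```

Dropping blank responses before computing percentages matters here, which is why some of the percentages above report n=39 rather than n=50.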

Tools and Instruments

We used a Google Doc for remote brainstorming, shared with the client to solicit feedback on survey question design. We then transferred the survey into Qualtrics, a powerful tool for survey design and deployment on the web. Each member of the team, as well as the client, piloted the survey to check for errors and ensure a good question flow for participants.


Through our analysis of survey data, we learned that respondents primarily use IBM Developer to take advantage of learning materials in the Technology Topics section.

While content interactivity isn’t a high priority for many of our respondents, companies like AWS and Adobe have shown how it can be leveraged to improve community and user engagement, spurring retention and brand loyalty: AWS holds interactive coding sessions on Twitch, and Adobe has built an entire community around its Daily Creative Challenges for XD, Photoshop, and Illustrator.

Of the five features we asked respondents to rank in terms of importance, website navigation ranked highest. After Technology Topics, respondents said they were most satisfied with navigation; this contrasts with earlier research, where navigation was often noted as an area for improvement.

Findings & Recommendations

  • Exploring “Technology Topics” is generally a positive experience
  • Content interactivity can undergo improvements
    • Recommend creating Daily Cloud Developer challenges
  • Website navigation is the most important feature that users look for
    • Recommend a card sorting study to better understand end users’ mental models

Heuristic Evaluation

For the heuristic evaluation, our goal was to evaluate information architecture through menu structure and to gain insights into how users experience “Technology Topics”. To this end, we utilized Jakob Nielsen’s 10 fundamental usability heuristics.

After conducting our individual evaluations, we came together to align and synthesize violations and severity ratings through which we derived a few key findings:

  • Information architecture presents opportunities for revision
  • Navigation cues are lacking
  • Search functions miss opportunities to pin relevant information
  • Help and support are static and simple
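Synthesizing severity across evaluators is mostly bookkeeping: each evaluator rates each violation on Nielsen’s 0–4 severity scale, and the team merges the ratings (here by averaging) to prioritize fixes. The sketch below uses hypothetical ratings and violation labels, not our actual evaluation data.

```python
def synthesize_severity(ratings_by_evaluator):
    """Merge per-violation severity ratings (Nielsen's 0-4 scale) from
    several evaluators by averaging, most severe violations first."""
    merged = {}
    for ratings in ratings_by_evaluator:
        for violation, severity in ratings.items():
            merged.setdefault(violation, []).append(severity)
    averaged = {v: sum(s) / len(s) for v, s in merged.items()}
    return sorted(averaged.items(), key=lambda kv: kv[1], reverse=True)

# Hypothetical ratings from three evaluators (not our actual data)
evals = [
    {"unclear menu labels": 3, "no visited-link cue": 2},
    {"unclear menu labels": 4, "no visited-link cue": 2},
    {"unclear menu labels": 3},
]
for violation, avg in synthesize_severity(evals):
    print(f"{violation}: {avg:.1f}")
```

Averaging is one common choice; some teams instead take the maximum rating, or discuss outliers until evaluators converge.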

As a result, we make the following recommendations:

  • Revamp header menu navigation
  • Introduce cues to support system status visibility
  • Optimize and tailor search results
  • Create a site map and more dynamic, specific Q&A features

These recommendations are intended to support the project’s overarching goals, which are to bring fresh ideas to the new site, improve conversion rates for turning site visitors into paying customers and brand loyalists, and build a more engaging community.

Usability Testing

We used Calendly to recruit usability test participants who considered themselves professional developers and had some familiarity with usability testing. These individuals represented diverse levels of development experience, ages, and genders. We then scheduled remote meetings with our five participants via Google Meet, which allowed us to record each session for analysis and let participants share their screen and webcam during the test so that we could correlate facial expressions with what they were looking at on screen. During each session, we asked participants to complete six tasks, prompting them to explore Cloud, Linux, Blockchain, AI & Watson, and Node.js.

We completed each usability test with a post-test questionnaire to help us develop findings, along with a survey the client designed for their parallel testing.

These usability tests yielded the following key findings:

  • Visited links don’t provide feedback that they’ve been visited
  • Internal names are confusing to users
  • Tutorials, series, and courses are great content, well-liked by users, but lack a clear learning pathway

And the following recommendations were made in response to our findings:

  • Change the color of visited links to provide feedback to the user as to where they’ve been
  • Simplify naming conventions, supply users with an acronym decoder
  • Implement gamification on learning pathways to improve user retention


Several limitations impacted our research. Primarily, COVID-19 restrictions forced us to take the project completely virtual midway through. While our engagement with the client was intended to be fully remote from the start, the lockdown also moved our own team collaboration and meetings entirely online. Additionally, one of our team members contracted COVID-19 following an international trip just before the move to lockdown, which further hindered our ability to formally wrap up the project.

The client, IBM Developer, was very supportive and understanding throughout this ordeal and expressed a desire to let us off the hook, so to speak, in terms of a final deliverable. Our findings and recommendations stand on their own, but their full impact cannot be articulated here.


In conclusion, our recommendations address our key findings.

We produced an interaction map as a static representation of the site’s visual information architecture and interaction patterns. We created three provisional personas based on semi-structured user interviews. Following the interviews, we conducted a comparative evaluation of multiple tiers of competitors, spanning an array of dimensions involving features, interaction patterns, and business models that impact user experience. Next, we deployed a survey to better understand demographics and uncover relevant information about IBM Developer’s “Getting Started” experience, community engagement, and customer retention. To evaluate information architecture and gain insight into how users experience “Technology Topics,” we conducted a heuristic evaluation. Finally, we facilitated five usability tests to gather feedback and derive insights on how usable the live site is for real users.
