IBM Developer is the rebranded platform (previously known as developerWorks) where developers use and explore a variety of resources, from coding patterns to live-stream tutorials and articles from experts. IBM Developer also offers community-oriented outlets for developers to meet and engage via live streams, blogs, podcasts, and newsletters.
The primary goals of this project were to make recommendations for attracting and retaining more active users on the website and to understand which aspects of community are important to developers. These goals drove our research.
Shortly after meeting with our client, we got to work generating an interaction map. The purpose was to produce a static representation of the system we were beginning to research. For this task, we used Figma and the Arrow Auto plugin. We collected screenshots cataloging Technology Topics, Community, Account Creation, and IBM Cloud, then arranged and connected them with the plugin.
With the interaction map in hand, we had a reference artifact that made salient all the possible actions and the respective error and non-error states (e.g., pages, screens) those actions can lead to. We didn’t need to map the entire website, but rather produce a narrower representation of the primary workflows and navigation avenues with which we were concerned.
IBM Developer is an expansive resource for developers, both seasoned and those interested in entering the field. For our purposes, we specifically targeted professional developers already using, or interested in using, cloud computing for development.
Our client supplied the team with the names and email addresses of 36 users that had opted in to be contacted to participate in user research for IBM Developer. We sent a screener survey to these individuals to get a brief understanding of their background in development.
After reviewing the screener results, we selected three interview participants who indicated they were available for up to 1.5 hours at a time mutually agreeable to at least two members of our team. Of the candidates who fit this profile, we selected a range of user segments:
- Female, “gray hair”, Java Developer
- Female, early 20’s, recent college graduate, career in tech and information
- Male, mid-30’s, self-employed developer in custom software solutions and web development
Tools and Instruments
Each team member led a semi-structured remote interview using a protocol designed to:
- probe the participant for values, attitudes, goals, and behaviors related to software development
- gather concrete experiences using developer platforms and resources
- learn how their experiences varied across that spectrum
Each interviewer was accompanied by another team member serving as note-taker. Interviews were conducted virtually via Google Meet and audio recorded on our mobile devices.
The audio recordings were transcribed first with Otter, then manually edited for accuracy. Using virtual sticky notes in Miro, a digital whiteboard, we applied a hybrid of thematic analysis and affinity diagramming, coding significant insights and quotes into clustered themes and sub-themes related to values, mental models, goals, behaviors, pain points, and tasks. We also noted a few other themes that emerged, such as roles, skill level, preferred tools, and demographics.
From this data we crafted three provisional, or “ad hoc,” personas and designed a scenario for each based on the themes that emerged from this analysis.
Findings & Recommendations
- IBMid is a big pain point
  - Recommend simplifying the IBMid process
- Participating in a positive developer community is a value
  - Recommend increasing engagement by targeting and courting underrepresented communities
- Searching for answers can be confusing
  - Recommend research toward improved information architecture
Seven products were selected as representative competitors and sorted into a taxonomy of direct, indirect, partial, parallel, and analogous categories. With categories established, our team compared these products across several dimensions, analyzing how they are similar and where they differ, and used this data to refine our initial recommendations for improving growth and retention on the site.
Through this evaluation, our team investigated in more detail the general landscape in which IBM Developer and its public cloud platform compete. We found that every direct competitor except IBM Developer displays some variation of “Getting Started” for new users on its homepage. While plenty of comparative research exists on public cloud offerings, there was little comparison of the websites themselves, particularly since the launch of the new IBM Developer site last fall. According to our scoring of common usability issues, IBM Developer compares favorably with the competition.
Lastly, we compared cross-platform engagement within the cloud developer community and uncovered relevant insights about the importance of a healthy and stimulating developer community.
| Competitor | Notable characteristic |
| --- | --- |
| RamNode | Similar service and market presence |
| Alibaba Cloud | Targeting a different region |
| Women Who Code | Active and engaged community |
Findings & Recommendations
- The average overall usability score was 12.25; IBM Developer scored 11
  - Recommend investing in innovative information architecture
- IBM Developer does not have a “Getting Started”
  - Recommend prioritizing “Getting Started” from the landing page
- Community support lacks engagement
  - Recommend bringing back some community features to re-establish engagement
For this leg of our research, we composed and deployed a survey through Qualtrics, seeking an ideal 50 respondents, with a minimum of 30 for statistically meaningful analysis.
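As a rough illustration of why we treated roughly 30 responses as a floor, the sketch below shows how the margin of error for an estimated proportion shrinks as sample size grows. The 95% z-value and the sample sizes are illustrative assumptions, not figures from our study:

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Approximate 95% margin of error for a proportion from n responses.

    p=0.5 is the worst case (widest interval); z=1.96 is the 95% z-value.
    """
    return z * math.sqrt(p * (1 - p) / n)

# Smaller samples give much wider intervals:
for n in (30, 50, 100):
    print(n, round(margin_of_error(n), 3))
```

At n = 30 the margin of error on a worst-case proportion is still close to 18 percentage points, which is why larger samples remained our ideal.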
IBM Developer is an expansive resource for seasoned developers and those interested in entering the field. We targeted professional developers already using, or interested in using, cloud computing for the storage and deployment of software.
Our client indicated that we should first reach out to the 36 users we had previously contacted for user interviews: professional developers of varying focus areas and skill sets. IBM granted us two $50 and five $20 gift cards to raffle off as an incentive to take the survey. After the first round of respondents, the client provided us with 141 additional emails to recruit from. Due to the low response rate, the client re-sent the survey invitation to the same 141 addresses. Two days later, our client sent the invitation to more than 90 additional potential respondents.
Tools and Instruments
We used a Google Doc for remote brainstorming and shared it with the client to solicit feedback on survey question design. We then built the survey in Qualtrics, a powerful web-based tool for survey design and deployment. Each team member, as well as the client, piloted the survey to check for errors and ensure a good question flow for participants.
Through our analysis of survey data, we learned that respondents primarily use IBM Developer to take advantage of learning materials in the Technology Topics section.
While content interactivity isn’t a high priority for many of our respondents, companies like AWS and Adobe have shown how it can be leveraged to improve community and user engagement, spurring retention and brand loyalty: AWS holds interactive coding sessions on Twitch, and Adobe has built an entire community around its Daily Creative Challenges for XD, Photoshop, and Illustrator.
Of the five features we asked respondents to rank in terms of importance, website navigation ranked highest. After Technology Topics, respondents said they were most satisfied with navigation; this contrasts with our earlier research, where navigation was often noted as an area for improvement.
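A forced-ranking question like this is commonly summarized by mean rank, where a lower average means respondents placed the feature higher. A minimal sketch, with entirely hypothetical respondents and feature labels (the actual survey items are not reproduced here):

```python
from statistics import mean

# Hypothetical data: each respondent ranks five features, 1 = most important.
rankings = [
    {"navigation": 1, "search": 2, "tutorials": 3, "community": 4, "blogs": 5},
    {"navigation": 1, "tutorials": 2, "search": 3, "blogs": 4, "community": 5},
    {"search": 1, "navigation": 2, "tutorials": 3, "community": 4, "blogs": 5},
]

features = rankings[0].keys()
avg_rank = {f: mean(r[f] for r in rankings) for f in features}

# Lower mean rank = more important to respondents.
for f, r in sorted(avg_rank.items(), key=lambda kv: kv[1]):
    print(f, round(r, 2))
```

Mean rank is simple and readable for a small feature set; with more items or ties, a Borda count or pairwise comparison would be a reasonable alternative.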
Findings & Recommendations
- Exploring “Technology Topics” is generally a positive experience
- Content interactivity can be improved
  - Recommend creating Daily Cloud Developer challenges
- Website navigation is the most important feature users look for
  - Recommend a card-sorting study to better understand end users’ mental models
For the heuristic evaluation, our goal was to evaluate information architecture through menu structure and to gain insights into how users experience “Technology Topics”. To this end, we utilized Jakob Nielsen’s 10 fundamental usability heuristics.
After conducting our individual evaluations, we came together to consolidate violations and severity ratings, from which we derived a few key findings:
- Information architecture presents opportunities for revision
- Navigation cues are lacking
- Search functions miss opportunities to pin relevant information
- Help and support are static and simple
As a result, we make the following recommendations:
- Revamp header menu navigation
- Introduce cues to support visibility of system status
- Optimize and tailor search results
- Create a site map and more dynamic, specific Q&A features
These recommendations are intended to support the project’s overarching goals, which are to bring fresh ideas to the new site, improve conversion rates for turning site visitors into paying customers and brand loyalists, and build a more engaging community.
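The consolidation step can be sketched as averaging each evaluator’s severity rating (Nielsen’s 0–4 scale) per violation, then triaging the highest-severity issues first. The violation labels, evaluator counts, and ratings below are hypothetical:

```python
from statistics import mean

# Hypothetical severity ratings (Nielsen's 0-4 scale) from three
# independent evaluators for each recorded violation.
ratings = {
    "unclear header menu labels": [3, 4, 4],
    "no visited-link feedback":   [2, 3, 2],
    "search ignores synonyms":    [3, 3, 4],
}

# Average the independent ratings, then sort highest-severity first.
consolidated = sorted(
    ((label, mean(scores)) for label, scores in ratings.items()),
    key=lambda item: item[1],
    reverse=True,
)
for label, severity in consolidated:
    print(f"{severity:.2f}  {label}")
```

Averaging independent ratings before discussion is the usual way to reduce individual evaluator bias in a heuristic evaluation.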
We used Calendly to recruit usability test participants who considered themselves professional developers and had some familiarity with usability testing. These individuals spanned diverse levels of development experience, ages, and genders. We then scheduled remote sessions with our five participants via Google Meet, which allowed us to record each session for analysis and let participants share both their screen and webcam, so we could correlate facial expressions with what they were looking at on screen. During each session, we asked participants to complete six tasks that led them to explore Cloud, Linux, Blockchain, AI & Watson, and Node.js.
We concluded each usability test with a post-test questionnaire to inform our findings, along with a survey the client designed for their parallel testing.
These usability tests yielded the following key findings:
- Visited links don’t provide feedback that they’ve been visited
- Internal names are confusing to users
- Tutorials, series, and courses are strong, well-liked content but lack a clear learning pathway
And the following recommendations were made in response to our findings:
- Change the color of visited links to provide feedback to the user as to where they’ve been
- Simplify naming conventions and supply users with an acronym decoder
- Implement gamification on learning pathways to improve user retention
Several limitations impacted our research. The methods we employed, the order in which we employed them, and the time we had to conduct the research and analysis were all decided by our instructional team, as this project was part of a client-based course, “Needs Assessment and Usability Evaluation,” in the University of Michigan School of Information’s master’s program.
Each method made up a separate assignment over the semester, with the intention of culminating in a final video presentation for the client. For each assignment, we reported and presented our findings in greater detail; this case study is a summary of four months’ worth of work.
Next, COVID-19 restrictions moved the project completely virtual midway through. While our sessions with users were remote from the start, campus closures meant the team’s own collaboration and meetings also had to move entirely online. Additionally, one of our team members contracted COVID-19 following an international trip over spring break, which further hindered our ability to formally wrap up the project.
The client, IBM Developer, was very supportive and understanding throughout this ordeal and expressed a desire to let us off the hook, so to speak, in terms of a final deliverable.
In conclusion, our recommendations address our key findings.
We produced an interaction map as a static representation of the site’s information architecture and interaction patterns. We created three provisional personas based on semi-structured user interviews. Following the interviews, we conducted a comparative evaluation of multiple tiers of competitors across an array of dimensions (features, interaction patterns, and business models) that impact user experience. Next, we deployed a survey to better understand demographics and to uncover relevant information about IBM Developer’s “Getting Started,” community engagement, and customer retention. To evaluate the information architecture and gain insight into how users experience “Technology Topics,” we conducted a heuristic evaluation. Finally, we facilitated five usability tests to gather feedback and derive insights on how usable the live site is for real users.