DLN Community Project - Quality Control Platform

In episode 186 of the Destination Linux podcast (releasing today) we discuss the topic of Quality Control in open source. During the discussion, the crew raised the question of whether our community could come together to solve this problem.

Problem: There is no unified community of testers for open source software that allows developers to engage the community to test their software on a variety of hardware and distros, or on specific configurations. DLN has been approached by developers in the past asking if we could use our community for such purposes.

Current calls to the community for testing from various distros happen on forums, tweets, or other platforms with minimal engagement and exposure. Today there is often confusion about what exactly users are supposed to test, and no information on the type of hardware or systems the software is being tested on. Additionally, there is no unified output to help devs focus on the areas that impact the community most.

Proposed Solution: Leverage thousands of DLN users from around the world who sign up for testing on a web interface. Devs could submit a request for testing and, if needed, narrow down testers to specific hardware or to VM or bare-metal setups, or include everyone to help resolve issues prior to release.

What’s Needed: Individuals with extensive experience in web platform development, specifically database storage, search, and form submission. DLN would cover costs associated with the website and servers.

What It COULD Look Like:
Website with a developer section where they can submit a request for testing. The form would allow the dev to specify criteria for the type of testers they need (example: only testers with HiDPI monitors, or only testers with AMD GPUs).

An additional form for people to sign up as testers. Capture hardware information and additional info regarding their OS, and whether they’re willing to test on bare metal or only in a VM, etc.

Response form that testers fill out. Allow devs to add some customizable fields. The responses are consolidated into a summary report.
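As a rough sketch of how those three pieces could fit together, here is a minimal JavaScript example. Every field name here (gpuVendor, hidpi, willTestOn, etc.) is purely illustrative, not a committed schema:

```javascript
// Hypothetical sketch only: field names are illustrative, not a design.
const testRequest = {
  project: "ExampleApp",                 // hypothetical project
  summary: "Verify HiDPI scaling on the settings dialog",
  criteria: {
    gpuVendor: ["AMD"],                  // only AMD GPU owners
    hidpi: true,                         // only HiDPI monitors
    environment: ["bare-metal", "vm"],   // either is acceptable
  },
  customQuestions: ["Did fonts render at the correct size?"],
};

const testerProfile = {
  displayName: "tester42",
  distro: "Fedora 35",
  hardware: { cpu: "Ryzen 5 3600", gpu: "AMD RX 580", hidpi: false },
  willTestOn: ["vm"],
};

// Naive matcher: does this tester meet the request's criteria?
function matches(request, tester) {
  const c = request.criteria;
  if (c.hidpi && !tester.hardware.hidpi) return false;
  if (c.gpuVendor && !c.gpuVendor.some(v => tester.hardware.gpu.includes(v))) return false;
  return c.environment.some(e => tester.willTestOn.includes(e));
}
```

The matcher above is deliberately naive; a real implementation would query a database instead of comparing objects in memory.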

Here is a very rough concept of the text, language, and potential form questions. ONLY A CONCEPT of text and language.

So what we want to do now is gauge interest and see if we can get some individuals willing to volunteer their skill sets to help build this out. Let us know below.

Discussion from Destination Linux 186:


Very neat idea. I like it.


This is so damn exciting. I’m in huge support, and to the degree I can help people put this together, I’m fully on board.

I can help people with all things front end. I do a lot of very custom CSS/JS/form handling etc. on a regular basis.

I can help with the back end if it’s NodeJS (same deal, I do a lot of custom work in Node) but the platform should be the one most comfortable for the admin(s) and I may be up to learn some PHP/Python.

DLN community working on this: Just let me know how I can be helpful to you.


I like the idea. Have you spoken to some software maintainers about what they need?

Would this need to be hooked up to some version control system to track revisions automatically or is it simply private subforums for each tester group?

Regarding gathering the results of the testing, this can be done with various kinds of software:

User Feedback Collection

Pros:

  • Super simple to implement and easy for users to understand.

Cons:

  • Very little guidance for testers and developers. Data is likely to be less usable.

User Surveys

Pros:

  • Generate statistics and graphs by supplying the same forms to every tester.

Cons:

  • Requires more management than simple feedback systems.
  • Requires proper foresight and insight in order to get usable results.

Managed Test Suites

Pros:

  • Get detailed information and statistics in a standardized manner.

Cons:

  • Tends to carry a lot of overhead, both for the creator of the tests and for the testers.

Full-blown Customer Relations Management (CRM) software

Pros:

  • Build closer relationships with your users by treating them as customers.

Cons:

  • Managerial overhead: since every tester is treated as a customer, manual work is needed in every case.
  • Will most likely need to be used in conjunction with other tools.
  • Typically focused around marketing and sales.

I did a bit of searching and found the following open source tools in the categories mentioned above.

User Feedback Collection software


Product Feedback

Fider can help you collect and prioritize product feedback so that you can focus on building the right product.


Track user feedback to build better products


Open source customer feedback tool

Astuto is a free, open source, self-hosted customer feedback tool. It helps you collect, manage and prioritize feedback from your users.

User Survey software


Sophisticated online survey software

LimeSurvey is the number one open-source survey software.

Advanced features like branching and multiple question types make it a valuable partner for survey-creation.

Managed Test Suites


Open Source Test Management

TestLink is a web based test management and test execution system. It enables quality assurance teams to create and manage their test cases as well as to organize them into test plans. These test plans allow team members to execute test cases and track test results dynamically.

Tuleap (Test Management)

Manage Automated and Manual Tests together

Deliver high quality software. Decrease incidents. With Tuleap Test Management, manage automated and manual tests at the same place. No more juggling multiple spreadsheets or diving into the depths of complex software. Test Management is now integrated in the whole product and software development lifecycle.

Customer Relations Management (CRM)


Grow Your Business Using the Most Flexible CRM

Our open-source CRM comes out-of-the-box with rich features for optimizing conversions, growing sales, and improving customer satisfaction.

Track which Leads convert into Opportunities and which Opportunities close into Customers. It’s time to understand your customer journey and provide a superior customer experience.


Open Source CRM Software Application for Businesses

Our feature-rich enterprise-ready alternative to Salesforce provides all the benefits of CRM at substantially lower costs with the freedoms and flexibility of Open Source.


Open source constituent management for non-profits, NGOs and advocacy organizations

Build, engage, and organize your constituents.

Get the powerful open source CRM used by more than 11,000 non-profits.


Free Self Hosted & Cloud CRM Software

EspoCRM is a web application that allows users to see, enter and evaluate all your company relationships regardless of the type. People, companies, projects or opportunities — all in an easy and intuitive interface.


The most innovative free CRM system

Flexible and efficient management for your company wherever you are in the world - this is what you gain by installing YetiForce. Where? Anywhere - on your computer, tablet, or smartphone with access to the Internet.


Ideally, this would all integrate with existing project management software, but I suspect that is too much to ask - at least at this early stage.

I feel that a CRM system is overkill and while test suites are nice, I suspect they are too much of a hassle for anyone to use.
See for example these guides on setting up testing in TestLink:

Doing some brainstorming, I have come up with a preliminary list of features for a “complete” product.

  • Tester/volunteer registration
  • Volunteer categorization (interests, skill-level, disabilities, language, hardware etc.)
  • Project registration
    • Project description, members, links to issue trackers and website etc.
  • Categorization of testers needed (based on skill-level, disabilities, language etc.)
  • Role based access
    • Site administrator
    • Project administrator
    • Project test builder
    • Project tester
  • Project feature requests and voting (this might already be handled by existing issue trackers)
  • Simple test definition (e.g. open the following 10 websites or start the application without internet access).
  • Surveys
    • Pre-made generic surveys (hardware, satisfaction, usability)
    • Surveys autogenerated from tests (e.g. if there is a test case for offline usage, automatically add questions regarding how the app handled that situation)
    • Custom questions in surveys - both for groups and individuals
  • Charts for representing survey results
  • Advertisement of current and upcoming alpha/beta tests.
  • Private and group messaging between developers and testers.
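One of the bullets above, surveys autogenerated from tests, can be sketched in a few lines of JavaScript. The question wording and field shapes are placeholders, not a final design:

```javascript
// Sketch of the "surveys autogenerated from tests" bullet: each declared
// test case yields a standard pair of questions. Wording is a placeholder.
function surveyFromTests(testCases) {
  return testCases.flatMap(tc => [
    { question: `Were you able to complete "${tc.title}"?`, type: "yes-no" },
    { question: `Describe any issues during "${tc.title}".`, type: "free-text" },
  ]);
}

const survey = surveyFromTests([
  { title: "Start the application without internet access" },
  { title: "Open the following 10 websites" },
]);
// survey now holds four questions, two per test case
```

Custom per-group or per-individual questions could then simply be appended to the generated list.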

A simple proof of concept would be private subforums, with a hosted instance of LimeSurvey.

If a group of relatively popular projects cooperates and provides feedback, a proper software project could be built with features more customized for this exact purpose.


A few thoughts… how do you feel about a minimum viable product for the first release, with enough extensibility to get into the cooler stuff from there?

Objectives: Destination Linux 186: Quality Control in Linux, System76 Keyboard & DLN Game Fest! - YouTube

This needs to be refined and it’s skipping details but what I think the minimum is:

  • Email OTP sign-on so we don’t need to store passwords to start.
    • Optional: A Discourse bot could send them in a forum PM
  • Basic account options for how/if someone wishes to be contacted
  • Package for sending Email alerts
  • Account privilege requests for devs, requires admin confirmation
  • Data search filters for devs
  • Basic-looking data output, including a raw option so devs can copy/paste it for internal use.
  • A way to push dev messages to helpers and handle replies, separated by testing campaign.
  • Minimal admin interface
  • Rich forms and CLI instructions to help users share the environment information they wish to provide. < Critical
  • Live chat and other cool features to follow launch…

The idea being that getting rich data is the paramount part of the project for the first launch, because that data is framework-agnostic: it can be used with anything we add or change later.

Initial launch could even be solely for data collection so when the full platform goes live there’s a wealth of data behind it.


All great ideas, every one of them. However, maybe we should first let the devs tell us how they currently work and how we can integrate with their existing process. Kind of a keep-it-simple, then grow it as needed.

A simple list of testing requirements and a method to provide the appropriate feedback would be a good start.


Please keep it pretty simple and minimal. If this project is over-engineered, then it will be too cumbersome to be likeable. Sure, collect useful data about tests, testers, etc., but please pare back the hurdles users must jump over to only those you really need. Seasoned developer input here is crucial.

Tester labour is precious, so please don’t annoy or beleaguer them any more than absolutely necessary.

Perhaps this Quality-Control project could be implemented as a modular add-on to this Discourse forum itself? So as to leverage all the user accounts which already exist here? Discourse does such a great job of providing a smooth experience. Couldn’t this just be extended somehow to have the functionality you want?


A long time ago I had an idea similar to this whole thing, but a bit more sci-fi.

I was thinking more in terms of a warehouse where multiple different PCs are already set up in racks and connected through USB to some control board that simulates various input and storage devices, while a webcam records each PC’s screen. That way developers could get remote access to see how their software behaves on actual hardware.
Maybe the warehouse would be the final stage of that; the first step would probably be a single-board computer controlling and recording the PCs of various volunteers.
The single-board computer that controls the PC should also allow for some automation, so that developers can, say, test their software on multiple pieces of hardware in parallel.
But there would also have to be some security in place, so that the PCs volunteers offer for testing don’t get used for mining Bitcoin.


Ryan posted the desired form inputs for testers/devs at the bottom: http://dasgeekcommunity.com/open-source-beta-testing-agreement/

I figure the option for more should be there, but you’re right @esbeeb, it needs to be simple by default.

A nice UI technique might be once the form’s complete they get the Submit button next to one saying “I want to fill out a bit more… (1/5)” and it progressively adds optional more advanced fields to the form each time it’s pressed. It could be coupled with an “advanced mode” checkbox which just reveals them all.
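The reveal logic for that technique is tiny. Here's a framework-free sketch with hypothetical field names; wiring it to actual DOM inputs is left out:

```javascript
// Sketch of progressive disclosure: optional field groups revealed one batch
// per button press, or all at once in "advanced mode". Field names are
// hypothetical placeholders.
const optionalGroups = [
  ["gpuModel", "monitorDpi"],
  ["kernelVersion", "displayServer"],
  ["audioStack", "inputDevices"],
];

function visibleGroups(pressCount, advancedMode) {
  if (advancedMode) return optionalGroups;     // "advanced mode" reveals all
  return optionalGroups.slice(0, pressCount);  // one more group per press
}
```

The "(1/5)" counter on the button would just be `pressCount + 1` over `optionalGroups.length`.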


Ooo-ooo! I have a suggestion. A Youtube URL field (or a URL field to some similar video service), where recorded videos showing problematic behavior can be linked to.

Since Youtube is so eager to allow anyone free access to their effectively infinite disk space, why not take advantage of that, and store bugs-as-video-clips there?


I like the idea. I should be able to help with development, although my time is limited.


I agree with this. It would be nice to hear from the devs who need this type of help so we can get a better idea on creating this platform.

I’m on board with this and am willing to contribute my development skills.


Seems to me simplicity is the goal instead of making it complicated. Just starting off with a simple HTML/CSS/JS/PHP/MySQL website sounds to me like it will handle all of your needs. I can start on this at any point in time. I'd need someone else with a stronger background in JS and frontend design, but the rest I'm pretty proficient at, if you're wanting to go the website path with this. Either way I would be down to help. I recommend setting up a GitHub repo, with so many people interested in contributing.


Simple is better.

If you want to cover the back i’ll cover the front.

I’m thinking native CSS Grid, CSS-driven transformation for mobile, no bloated frameworks or external calls, and JS where JS belongs.

I can read any data format you can get onto a page or GET/POST response. I can work with whatever template you’re comfortable with.

I’m going to use a collaborative platform that supports people tweaking/playing/cloning my code on the fly with no account, so there’s zero barrier to getting involved, and from there I can push to GitLab/GitHub (your choice).

Let me know which platforms you’d prefer to use for collaboration. Recommending CryptPad: Collaboration suite, encrypted and open-source as the G-word alternative.

COMMUNITY ASK: I can do basic art design but I could really do with a graphics designer, just a JPEG for site styling.


I can start on it later today around 7 EST. I’ll let you know which collaboration tool when I get home, but I’m far from picky. Also, I used to be a graphic designer, so if we need some simple stuff, let me know; otherwise let someone else know, because I’m still getting used to Inkscape.

A few weeks ago, during the discussion about support vs bug reporting, we talked about making some easy interface for people to search tech support forums, knowledge bases and tutorial sites when they need help with their open source software, so I am planning on getting started with that under the domain tux.support (Sounds like tech support, but for Tux, get it? :wink: ).

I figure a user testing site would fit nicely within this domain.

I am also available for development on both front end and back end, and I have set up a GitLab team and a project for User Testing (I am sure Michael can come up with a better name for the project).

I have been wanting to try out LeanTime for some simple project/idea management, so I can set up an instance today, if you like.

But first and foremost, it is important to get some seasoned developers that actually need testers for their projects to give their feedback regarding what they need. The code is the easy part. It’s actually getting adoption that will be tricky.

Would you like to collab with mudlight48 and me? I’m open for discussion any time that works for you, though it’d be easier on something like Mumble. Set a time today or later in the week and I’ll be there.

My thinking is that building the form to Ryan’s guidelines creates something concrete for devs to chew on and suggest improvements to. We don’t need to commit to a particular dataset early, nor does the form have to submit until everyone’s comfortable with what’s on it, and having it in visual space goes a long way.

I would be happy to collaborate with you and Mumble sounds fine.

I can be there today (that would be tonight for me, as it is 10:16 pm here now), but I can’t stay for too long today. Most days it wouldn’t be a problem with the time, but I’m going on a little fishing trip tomorrow morning and I am being picked up 8 hours after the meeting starts (and hopefully I can squeeze in some sleep in between :slight_smile:)

I agree on the importance of a proof of concept project. I would just hate to see it getting too far down a wrong path.

If our form isn’t easy to change, we’ll know we screwed up :stuck_out_tongue:

Be on mumble in a sec.