We are coming to the end of our Lab/Studio project, and one of the last things we needed to do was write a report summing up our progress.


Our main goal was to create our own customised Kali Linux distribution. Before heading down that path we briefly explored other options based on the same idea. There was a discussion about developing something out of Memex, a tracking tool for human trafficking, but that was scrapped early on. The majority of the group wanted to do something with Kali Linux, so we discussed the different directions such a project could take before making any specific plans. One direction was to create a website that would act as a showcase and demonstrate our proficiency with the tools. Another was to evaluate tools and produce a printable report based on the results we found, possibly in a guide/tutorial style. While weighing these directions we arrived at a somewhat ambitious but exciting goal: to create our own custom distribution, containing only the best tools, aimed at digital forensics.

Main objectives:

  • Evaluate tools
  • Create distribution

Our project boils down to the two objectives listed above, neither of which was completely finished. Even though we fell short on both, much was still achieved. In this report we will go into more detail on our challenges and achievements while working towards these two objectives. As for the final deliverable, it comes in the form of the evaluations we were able to complete and the statistics derived from them. On its own that is not very usable yet, but we feel we have created a solid base for a possible future project.

Creating evaluation forms

The creation process for the evaluation form started with a simple question: what can be considered useful for a forensic investigator? As it turned out, a large number of the tools present in Kali are intended for penetration testing and “attack” purposes. We needed to investigate the tools, evaluate their purpose and use, and single out the ones suited to forensic work, meaning information and evidence gathering and analysis. While the purpose of some tools was harder to pin down, most were easily classified using common sense.

The group sat down and authored an evaluation form that was simple yet covered the desired traits, with the following criteria:

  • User interface
  • Dependability
  • Usability
  • Desired functions
  • Performance
  • Overall impression

Together, these criteria formed the basis for recommending whether or not to include the tool, as the evaluating group member saw fit.

We would not compare the tools to each other to find which was best suited, as some tools might perform better for some purposes and worse for others; we would only recommend whether or not to include each tool in our distribution.

As a safeguard in the evaluation, we also included the option to answer “maybe” on the recommendation, in which case we would look at the tool as a group and make a joint decision.
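The form and its decision rule can be sketched as a simple record. The field names and the 1–5 scale below are our own illustration, not the exact wording or scoring of the form:

```python
from dataclasses import dataclass

# Illustrative sketch of one evaluation record; field names and the
# 1 (poor) .. 5 (excellent) scale are hypothetical, mirroring the
# criteria listed above.
@dataclass
class Evaluation:
    tool: str
    user_interface: int
    dependability: int
    usability: int
    desired_functions: int
    performance: int
    overall_impression: int
    recommendation: str  # "yes", "no" or "maybe"

def needs_group_review(ev: Evaluation) -> bool:
    """A "maybe" recommendation is escalated to a joint group decision."""
    return ev.recommendation == "maybe"

example = Evaluation("some-tool", 4, 5, 4, 5, 3, 4, "maybe")
print(needs_group_review(example))  # True: this tool goes to group review
```

The point of the rule is simply that individual evaluators never had to make a borderline call alone; anything marked "maybe" was deferred to the group.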

Evaluation results

Our most valued lesson in this process was learning to adapt while evaluating new software and tools. Looking at the data collected from a total of 83 tools (out of 300), we could clearly see that a large number of them were not suitable for our purpose, even though most of the tools we managed to evaluate were marketed as forensics products.

Here are some other results from the evaluation process, corresponding to the initial criteria.

[Charts: distribution of evaluation results for each criterion]
Note: some criteria needed extra options, such as CLI in the User Interface evaluation. As the charts show, 54.2% of the tools evaluated offered only a CLI. Note also that the Overall impression results are somewhat higher (63.8% rated average or better) than our final share of recommended tools (45.8%, see below).

The reason for this gap is that many tools may be good at what they do, yet fall outside our strict forensics scope. However, we do have a portion of “maybe” tools that were marked for later discussion; these could potentially be included later on.
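For transparency, the headline figures can be reproduced from raw counts. The counts below are back-calculated from the reported percentages of the 83 evaluated tools; they are our reconstruction, not the original tally sheet:

```python
# Counts back-calculated from the reported percentages of 83 tools;
# these are a reconstruction for illustration, not the raw data.
total_evaluated = 83
recommended = 38   # 38 / 83 ~ 45.8 %
cli_only = 45      # 45 / 83 ~ 54.2 %

def pct(count: int, total: int) -> float:
    """Share of the evaluated tools, as a percentage with one decimal."""
    return round(100 * count / total, 1)

print(pct(recommended, total_evaluated))  # 45.8
print(pct(cli_only, total_evaluated))     # 54.2
```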

Lastly, what we took from this evaluation is that even though something is labelled as forensics, that does not mean the tool is appropriate for our work. By applying critical thinking and informed evaluation, we were able to discard more than half of the tools in the toolkit.


Challenges

We encountered a few challenges along the way, with varying effects on our work and on the group.

Testing the tools without breaking laws or ethical boundaries: this was a major concern at the start of the project, but since the evaluation process took longer than expected, it largely solved itself, as there was little time left for testing. We instead opted for reading online tests that others had performed.

Underestimating the time needed for completion: this was by far our greatest challenge. We underestimated the time needed for evaluation and review, as well as the time needed to learn how to create a distro, not to mention creating the distro itself. In the end we did not even have time to start the creation of a distro, but we did successfully create and apply an evaluation scheme.

Flat group structure did not work: our original group structure was cooperative, where no member had a defined role and we shared the workload and decision-making. While quite utopian and fair, it did not work in the end, as there was no motivator to keep the group on task and performing. When the first report (Starting the portfolio) was approaching, we realised how inefficient our methods were and quickly shifted to a hierarchical structure, with Carl Philip as group leader. This proved much more effective, and we kept this structure throughout the project.

While these larger challenges affected our end result and the group, smaller challenges such as workload distribution, direction and goals were handled well within the group. All things considered, we are satisfied with how our group performed, and with our levels of cooperation and group cohesion.

Organisation and management

For the last couple of months of the project we restructured the group. The way we were working was not ideal, so we appointed a project manager to get things back on track. The project manager divided the work among the team and pushed the project forwards. We didn’t use a Gantt chart or similar methods towards the end, but communication through Skype worked better than before. The project manager decided to drop the project management tools altogether, as they merely got in the way; a hands-on approach was taken instead and proved much more effective at getting the group going. For writing this report we switched to a shared solution, namely Google Docs, which enabled us to write simultaneously and give each other feedback easily.

Despite our restructuring and increased efforts, we weren’t able to meet all our deadlines. Regardless, the group work has been successful: in the end we managed to fix our organisation and communication issues, and working together felt like running downhill rather than uphill.

If we were to do it all again, we would agree upon a project manager from the very start; that would help a lot. At certain points it felt like we were doing more planning than actual work, which is something to keep in mind for the future.

Legal – Social – Ethical

Since our last delivery we have been working towards creating our own customised distribution of Kali Linux. While pursuing that goal we encountered some previously unknown issues.

Navigating the legal seas of Linux software and its license agreements can be quite a mess. Initially we assumed that all of Kali Linux was open source and released under the GNU GPL (“which guarantees end users the freedom to run, study, share and modify the software”). This is reflected in the Kali Linux open source policy, which states that “all of the specific developments in Kali Linux’s infrastructure or its integration with the included software have been put under the GNU GPL”. There is, however, a selection of non-free packages whose licenses should be reviewed before use. These are packages that Offensive Security has agreements with the vendors to redistribute, and third parties are strongly advised to review the original licenses before use. Many of these packages exist for certain chipsets, network cards or graphics cards. We have merely glanced over the list, searching for terms such as “license” and “Debian” in the hope of finding a potential violation. Understanding what all these packages do and what implications they have for the system is a difficult and time-consuming task that we have not prioritised. Some of them are simply custom fonts that are licensed differently, and fonts are likely easy to exclude from a custom build, but there are packages on the list that could hinder our distribution if excluded.
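Our quick keyword scan over the package list can be sketched in a few lines. The entries and the keyword set below are hypothetical examples; the real input would be the non-free package list published by Offensive Security:

```python
# Hypothetical package-list entries; the real input would be Kali's
# non-free package list from Offensive Security.
packages = [
    "firmware-atheros | non-free | wireless chipset firmware",
    "ttf-mscorefonts-installer | contrib | fonts (separate license)",
    "nmap | main | network scanner (GPL)",
]

# Flag entries whose section or description hints at a separate license,
# mirroring our manual search for terms like "license" and "non-free".
keywords = ("non-free", "contrib", "license")
flagged = [p for p in packages if any(k in p.lower() for k in keywords)]

for entry in flagged:
    print(entry)  # candidates for manual license review
```

A scan like this only narrows the list down; each flagged package would still need its original license read by hand, which is the time-consuming step we did not prioritise.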

The bottom line is that if we never plan to release our distribution to the public, we don’t really have to worry about the licensing of these closed-source packages. The issue only arises when redistributing or selling, not with personal use.

A possibly more serious legal matter is our testing environment for the tools. The Kali Linux distribution is entirely legal in its own right: it is simply a collection of publicly available tools for auditing the security of various systems. While many of the tools can be tested in the safe harbour of a virtual machine, some need a real lab to be tested legally. One example is wireless penetration testing tools. While that wasn’t the main focus of our project, it is worth keeping in mind when testing some tools. Your home router is usually only rented or leased from your ISP, so there is a fair chance we would violate its terms of use if we ran tests against our own routers. Thankfully, for some of the setups we couldn’t reproduce in our own virtual machines, there exist online lab solutions where security tools can be tested.

While working on this project we have done our best to maintain an ethical standpoint. Kali Linux and the tools within it are certainly used by malicious actors every day, but we have to believe that for every black hat there are at least two white hats. These tools are marketed as simple to use and easy to install, but we like to believe that many would-be malicious actors give up before managing to boot a live distribution. The fact that vendors and universities around the world offer courses and exams in Kali Linux gives us the ethical ground we need to feel comfortable working with this software.

We didn’t face any specific social issues while working on this project, but there are some worth discussing. “Hacking” has become a big part of the mainstream media, and using any tool in Kali Linux would qualify as hacking by that standard. Security awareness is a very important topic today, considering all the potential threats out there. Our project is two-sided in this matter: “hackers” are sometimes looked down upon because people don’t like having their data stolen, but on the other hand we are raising awareness by showing how simple it is to perform basic yet powerful attacks. If people are aware of how easy it is to perform malicious acts with this software, maybe they will think twice before leaving their network unsecured or using weak passwords.

Outcome and achievements

Firstly, we successfully created and implemented an evaluation scheme, investigated and evaluated tools within Kali Linux, and in doing so increased our knowledge and experience of both the tools themselves and the evaluation and review process. In the real world of digital investigation we will likely be asked to research and evaluate new software/tools, and this learning process has enabled us to achieve a greater understanding of how we evaluate and learn about such tools. And while the end goal of creating our own Kali Linux Forensic Edition distribution did not come to fruition, we consider the project to be a success because of the level of learning we achieved.

Secondly, and perhaps most importantly: we created and managed a workgroup, and while it was a learning experience for us all, with a few challenges along the way, we feel the group performed very well, especially when it came down to crunch-time. We learned a great deal about group management, dynamics, cohesion, motivation and leadership, and if given a new project in the future, we feel confident that our group would perform even better. We leave the project feeling proud of our group’s achievements, more knowledgeable than we were before, and having built lasting friendships.


The project itself started slowly and felt unguided until the restructuring of the deliverables; with this change our motivation returned. Our primary goal was to create a new Linux distro targeting digital forensics and investigators, but the project turned out to be more time-consuming than we first assumed. This was also a concern we raised at the first presentation, held at NUC in Kristiansand earlier in the project, and it turned out to be justified.

Our aim didn’t really change during the project, but we learned a lot more than we initially expected, especially about the different tools available in Kali Linux, as our evaluations gave us a more in-depth look at them. The biggest challenge was how time-consuming the project turned out to be; we didn’t realise this until we started the application evaluation, and as a result the distro was never completed.

Apart from the “slight” overreach in ambition, we did not encounter any further large challenges or problems that couldn’t be solved directly through group communication and leadership. Smaller issues, such as working around each member’s schedule, workload distribution and tasking, were handled continuously.

Even though our group only partially achieved its aim, we are still very satisfied with the end result. We have also become great friends, and our Skype chat will continue even after the project has ended.