  Postmortem: Cybervisualization

    [04.27.21]
    - Justin Neft
    Over the course of 8 weeks, we developed a data-visualization tool for cybersecurity competitions. The intent was to describe and present the events of a cybersecurity competition to an uninformed audience. The tool is meant to be deployed with a casting team over streaming services like YouTube or Twitch. Casters can update the data and modify the view being broadcast live to help explain and clarify the events of the competition. We initially started building our tool for the Collegiate Penetration Testing Competition (CPTC). Our final product was built for, and broadcast during, the Northeast Collegiate Cyber Defense Competition (NECCDC).
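
    To make the caster-driven workflow concrete, the snippet below is a minimal sketch of how a caster hotkey might switch the on-screen view in Unity. The CasterViewSwitcher class, panel list, and key bindings are illustrative assumptions for this writeup, not the tool's actual implementation.

        using UnityEngine;

        // Hypothetical caster control: number keys 1-9 choose which visualization panel is shown.
        public class CasterViewSwitcher : MonoBehaviour
        {
            // Panels (network map, score summary, timeline, etc.) assigned in the Inspector.
            public GameObject[] viewPanels;

            void Update()
            {
                for (int i = 0; i < viewPanels.Length && i < 9; i++)
                {
                    // KeyCode.Alpha1..Alpha9 are consecutive, so offsetting by i maps key "1" to panel 0.
                    if (Input.GetKeyDown(KeyCode.Alpha1 + i))
                    {
                        ShowOnly(i);
                    }
                }
            }

            void ShowOnly(int index)
            {
                for (int i = 0; i < viewPanels.Length; i++)
                {
                    viewPanels[i].SetActive(i == index);
                }
            }
        }

    The same pattern could extend to caster data edits: an input updates a shared model, and whichever panel is active redraws from it.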

    Project Details

    Original timeline was 15 weeks.

    Actual timeline was 8 weeks.

    No budget.

    Developed in Unity version 2020.2.4f1.

    Streamed using Twitch.

    Developed by a team of three:

    • Robley Evans (Part time): Data pulling, AI for selecting what information to display, data formatting, programming

    • Justin Neft (Full time): Infrastructure-building, attacks, node state reading, programming, design

    • Kevin Laporte (Full time): Injects, videos, time simulation, random team generator, programming, design

    What Went Right

    Development:

    • We had good momentum and were able to generate good ideas and results with little outside direction.

    • We made good design decisions early on and stuck with them

    • We adapted easily when the project underwent a major change in direction

    • We were able to make quick decisions about what we needed to do next

    • We fixed bugs as they were discovered

    • Three people made it happen

    • The project was scoped well; no main features were cut, but some stretch goals were

    Product Launch/Broadcast:

    • Our tool successfully read in live data, interpreted it, and displayed it

    • The team was able to adapt both the tool and the production on the fly to keep the show going

    • We had an average viewership of 40 people throughout a 9.5-hour livestream

    Throughout this project, we found that several things went in our favor during both development and the broadcast. These helped us complete our project in time and put on an engaging show for NECCDC. The development process lasted roughly 8 weeks, running from the beginning of the semester in January 2021 to the start of the 2021 NECCDC. The broadcasting portion took place during NECCDC itself. The first three weeks of development were spent building for CPTC before we switched to building for NECCDC.

    Within the first few days, we were able to determine a short-term goal and work towards it. This helped us produce results quickly. There were a few dips in productivity when we switched competitions, but we were able to bounce back quickly. Because of this momentum, we were always able to keep busy working on improvements, bug fixes, or new features for the visualizer. This was especially helpful during the later stages of development, when waiting on other teams became commonplace. To keep the momentum, we made sure to fix bugs as they were discovered rather than putting them off. This ensured that major bugs didn't slow down development and minor bugs didn't have time to become bigger problems.

    Our agility was due in part to the resources provided by previous teams, as well as the size of our own team. Previous teams had produced a lot of reference and proof-of-concept material, which made it easy to get up to speed with the project and its goals. We started with two full-time students (Justin Neft and Kevin Laporte), which helped us quickly get the whole team on the same page. Once we transitioned to designing for NECCDC, a part-time student (Robley Evans) who had worked on the project previously also joined the team. Being able to integrate knowledge directly from previous teams was immensely helpful in making good design decisions early. Our small team was very agile, able to decide on a course of action quickly and make changes faster than a larger team might have.

    The transition from CPTC to NECCDC also went fairly smoothly. CPTC and NECCDC require similar forms of visualization, such as networks and systems, so many features we had already developed could be ported or reused. This prevented a complete loss of three weeks of development time. With a firm deadline, we were also able to plan out a timeline. We first broke down what features we had and what features we needed. We estimated how much time each feature would take, then doubled it. This doubled timeline was surprisingly accurate and allowed us to get all the core features implemented by the deadline. Stretch goals were the only elements explicitly cut during development.

    The good aspects didn't stop at the development cycle. Despite the broadcast being a very spontaneous production, many things went in our favor. The first major success of the broadcast was that our visualizer completed its main task: reading in real, live data from the competition and displaying it. This was something previous teams were unable to test, since they did not have access to real competition data. The competition proved that the concepts both previous teams and our own team had worked on were viable.
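
    For readers curious what that ingestion loop might look like, the sketch below shows a Unity component that periodically polls a scoring endpoint and hands the parsed node states to the view. The endpoint URL, JSON shape, and class names are illustrative assumptions, not the actual NECCDC data feed.

        using System;
        using System.Collections;
        using UnityEngine;
        using UnityEngine.Networking;

        // Hypothetical shape of one host record from the competition's scoring data.
        [Serializable]
        public class NodeState
        {
            public string hostName;
            public string status; // e.g. "up", "down", "compromised"
        }

        [Serializable]
        public class CompetitionSnapshot
        {
            public NodeState[] nodes;
        }

        public class LiveDataPoller : MonoBehaviour
        {
            // Placeholder endpoint; the real competition data source is not public.
            public string dataUrl = "http://example.com/api/snapshot";
            public float pollIntervalSeconds = 10f;

            void Start()
            {
                StartCoroutine(PollLoop());
            }

            IEnumerator PollLoop()
            {
                while (true)
                {
                    using (UnityWebRequest request = UnityWebRequest.Get(dataUrl))
                    {
                        yield return request.SendWebRequest();

                        if (request.result == UnityWebRequest.Result.Success)
                        {
                            var snapshot = JsonUtility.FromJson<CompetitionSnapshot>(
                                request.downloadHandler.text);
                            ApplySnapshot(snapshot);
                        }
                        else
                        {
                            Debug.LogWarning("Snapshot poll failed: " + request.error);
                        }
                    }
                    yield return new WaitForSeconds(pollIntervalSeconds);
                }
            }

            void ApplySnapshot(CompetitionSnapshot snapshot)
            {
                // In the real tool this would update the network view (node colors, labels, etc.);
                // here we just log the parsed states.
                foreach (var node in snapshot.nodes)
                {
                    Debug.Log(node.hostName + ": " + node.status);
                }
            }
        }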

    As with every product launch, there were plenty of issues. Without a proper dry run, many of those issues were discovered live. Fortunately, our team had the skills necessary to resolve them on the fly with minimal downtime. We were quick to go live and inform the audience when major problems arose, and we made sure to stay open about the issues and the fact that this was a very new, experimental product. Our team also made great use of the downtime during interviews to fix major bugs while the visualizer was off screen.

    Many of the casted segments and interviews were planned mere hours beforehand. Despite the short notice, they were well received and helped bolster the livestream's educational value. Our team had the benefit of being close friends, so we were able to talk with each other naturally during casted segments despite having little broadcast experience. Our stream consistently held an average of 40 viewers, with numbers rising upwards of 100 during interviews and casted segments. As a proof of concept for turning a cybersecurity competition into an engaging livestream, we believe these numbers demonstrate the format's potential.
