Postmortem: Cybervisualization

[04.27.21]
- Justin Neft

In terms of the broadcast, the first major issue that arose was the lack of a comprehensive test run. This was due in part to not having real data until the competition itself. We had done plenty of feature testing throughout development, but small feature tests don't always replicate practical applications. Multiple problems arose during the live show that a full test run could have caught, including videos playing at the wrong times, data being pulled from the wrong date, and numerous audio issues. We also decided against visualizing attacks with features we had already built because we lacked visualization content for them. As a result of some of these issues, multiple features were cut from the visualizer during the live stream. This was not ideal, but we kept the show running despite the hiccups.

Crunch time was also a very real part of the project. Although we had planned out many of the requirements beforehand and even doubled our expected timeline, we had not built in room for error or for unexpected features. One of those errors was that we did not receive real data until shortly before the competition. Because the format of that data differed from what we expected, we had to make quick decisions shortly before going live. Those decisions determined which systems we tracked in the competition, and thus what data was visualized, and they were mostly uninformed, based largely on "what sounds more important to you?" Another unfortunate side effect of crunch time was the shift from writing clean, efficient code to writing code that just worked.
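
One low-cost safeguard that could have caught the format mismatch the moment real data arrived, rather than on air, is a simple schema check on each incoming record. Here is a minimal sketch; the field names and types are hypothetical stand-ins, not the competition's actual data format:

    # Minimal schema check for incoming competition data.
    # Field names/types are hypothetical stand-ins for the real feed.
    EXPECTED_FIELDS = {"team_id": str, "timestamp": str, "event_type": str, "score": int}

    def validate_record(record: dict) -> list[str]:
        """Return a list of problems; an empty list means the record matches."""
        problems = []
        for field, expected_type in EXPECTED_FIELDS.items():
            if field not in record:
                problems.append(f"missing field: {field}")
            elif not isinstance(record[field], expected_type):
                problems.append(
                    f"{field}: expected {expected_type.__name__}, "
                    f"got {type(record[field]).__name__}"
                )
        return problems

    # Run against the first real data sample as soon as it arrives:
    sample = {"team_id": "team-07", "timestamp": "2021-03-20T14:05:00Z",
              "event_type": "scan", "score": "12"}
    for issue in validate_record(sample):
        print("DATA FORMAT MISMATCH:", issue)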

The final major issue was the lack of a full broadcasting team. Our team's original job description was to be software developers for the visualizer, but as the competition loomed closer, we quickly filled the roles of commentators and broadcast crew as well. Although we are far from experienced broadcasters, we managed to fill both roles: maintaining and fixing the tool live as developers while also producing an interesting and engaging live stream. Had we been able to focus solely on development, however, we would have had more time to polish the visualizer for its debut.

Risk Management

Development was played fairly safe, with decent time management and little risk taking. The production side, however, was completely unplanned, so we took many risks to make the broadcast good. Overall, these production risks paid off in a major way and really helped our tool shine. The production also gave us great insight into how to develop the tool further and what it really needs in order to be complete.

Mid-Project Changes

Our first major change was pivoting from developing our tool for CPTC to developing it for NECCDC. The pain of this change was mitigated mainly because the tools we had built to that point were easily ported to the new competition format, so minimal code refactoring was required during the shift. The real work came from changing our plans and designing new interfaces. We handled this by planning out exactly what we would need for NECCDC, then breaking down which features we already had from CPTC and which we would still need. Once that was done, we calculated how long it would take to implement all of these features and doubled that figure to produce a deliberately high-ball estimate, as sketched below. That estimate was vital in determining what was necessary and what was a stretch goal.
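
As a concrete illustration of that estimating approach, here is a minimal sketch in Python; the feature names and hour figures are hypothetical, not our actual numbers:

    # Hypothetical per-feature estimates (hours) for the NECCDC port.
    feature_estimates = {
        "score ticker": 8,
        "team status panel": 12,
        "attack event timeline": 16,
    }

    base_hours = sum(feature_estimates.values())  # raw sum of estimates: 36 hours
    padded_hours = base_hours * 2                 # doubled to high-ball: 72 hours

    print(f"Base estimate: {base_hours}h; padded estimate: {padded_hours}h")
    # Anything that does not fit within padded_hours before the competition
    # date gets reclassified as a stretch goal.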

Our second major change came when we decided to handle production for the broadcast while also running our tool. This meant we had to quickly create content for the live streams while maintaining the tool and fixing any unforeseen issues that arose throughout the competition. Unfortunately, we did not have much time to prepare for this change and had to move quickly to make the stream work while finding our footing throughout the production.

Conclusions

In review, we think our team really suffered from a lack of strong project management and direction. We did not have a good way to track which tasks needed to get done, and our processes for taking on new tasks were quite informal. Stronger project management and review processes would likely have kept us pointed in the right direction throughout development.

Much of this development cycle was spent figuring out what needed to get done, who we were developing for, and what exactly we were making. While the lack of answers gave us a good deal of flexibility and autonomy, we think it led to inefficiencies and confusion throughout the entire process. That said, we learned a great deal along the way, which will let us structure our future work better. These lessons also leave us better positioned to set up the project for future teams, giving them an easier time learning, improving, and modifying it.

Plan of Action

With this knowledge in mind, we are using project management tools and methods to help keep the team on track and focused. We are also contacting experts in software engineering, cybersecurity, and project management to help us set up workflows that are efficient and effective. Finally, we are getting in touch with the CPTC staff now so that everyone has more time to prepare for the competition with tool integration in mind. This should also provide lasting connections with CPTC for future co-op teams working on the project.
