diff --git a/examples/timing_exam.rs b/examples/timing_exam.rs
index fe69fb2d4a4865faef4c2c85ddf4c3e63204c4af..682a9405569fccf6fa22749e187f3e3c33a96805 100644
--- a/examples/timing_exam.rs
+++ b/examples/timing_exam.rs
@@ -401,5 +401,25 @@ const APP: () = {
 // - How would an ideal tool for static analysis of RTIC models look like.
 //
 // [Your ideas and reflections here]
+// RTIC has very little overhead compared to solutions that use a thread-based approach.
+// In a real-time system it is important that the overhead is small and that the system's
+// behavior is easy to predict. RTIC does well on both counts, while a thread-based approach
+// is weaker in both respects. Because threads are subject to fewer restrictions, the task
+// execution order is much harder to follow, and the same lack of restrictions is also what
+// makes the overhead larger.
+//
+// From this exam it is clear that the theoretical model is very close to the measured results,
+// but there are small discrepancies, caused by measurement and hardware effects, that can still
+// make a noticeable difference. These inaccuracies are extremely hard to account for because so
+// many factors influence the result, and as this exam shows, a difference of just a couple of
+// cycles can change the outcome significantly.
+//
+// I think an ideal tool would have to take the overhead into account in some way. Since the
+// exact overhead is hard to know, the tool should produce two analyses. The first is the pure
+// theoretical analysis, which can be seen as a lower bound. The second assumes that the overhead
+// is large enough to change the outcome of the analysis, which can be seen as the worst case.
+// From the two we can also derive how much overhead is needed before the second analysis
+// applies. This gives a good understanding of the behavior of the system and of how much
+// overhead it tolerates before it performs worse than it theoretically should.
 //
 // Commit your thoughts, we will discuss further when we meet.
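
As an illustration of the "two analyses" idea in the last paragraph, a minimal sketch follows. The Task struct, the per-dispatch overhead of 10 cycles, and all of the numbers are made-up placeholders rather than values taken from timing_exam.rs; it only shows how a pure theoretical lower bound, an overhead-inflated worst case, and the tolerated overhead margin could be reported side by side.

// Hypothetical per-task figures, all in clock cycles. None of these names or
// values come from timing_exam.rs; they are placeholders for illustration.
struct Task {
    name: &'static str,
    response_time: u32, // theoretical response time (blocking + WCET + interference)
    deadline: u32,
    dispatches: u32, // number of dispatch/return points that each pay the overhead once
}

// Lower bound: the pure theoretical response time, ignoring overhead.
fn lower_bound(t: &Task) -> u32 {
    t.response_time
}

// Worst case: the same result with an assumed per-dispatch overhead added in.
fn worst_case(t: &Task, overhead_per_dispatch: u32) -> u32 {
    t.response_time + t.dispatches * overhead_per_dispatch
}

// Overhead margin: the per-dispatch overhead at which the task first misses its
// deadline, i.e. how much overhead the schedule tolerates before it is worse
// than the theoretical analysis says.
fn overhead_margin(t: &Task) -> u32 {
    (t.deadline - t.response_time) / t.dispatches
}

fn main() {
    let t = Task { name: "T1", response_time: 50, deadline: 100, dispatches: 2 };
    println!(
        "{}: lower bound {} cycles, worst case (10 cycles/dispatch) {} cycles, margin {} cycles",
        t.name,
        lower_bound(&t),
        worst_case(&t, 10),
        overhead_margin(&t),
    );
}

For the placeholder task this prints a lower bound of 50 cycles, a worst case of 70 cycles, and a margin of 25 cycles per dispatch, which is the kind of "how much overhead does the system tolerate" figure the paragraph above asks for.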