A colleague of mine (multiple years of hardware development experience) and I (a young student with little experience) are clashing over a certain topic, but I'm having trouble understanding his point of view from his wording alone.
I recently heard about Mikroe's Planet Debug, a product aimed at hardware developers that enables remote development on real hardware. There's a lot to it, but the key points for me are:
- MCU cards that are (IMO) independent of peripherals, thanks to a standardized socket (mikroBUS)
- Development boards that use those standardized sockets to connect the MCU cards to peripherals.
My point in the discussion is:
Having such a standardized socket would let us develop faster with less dependency on our production facility: we could build a single proprietary development board for our products that uses the standardized socket. With such a board we wouldn't be tied to very specific MCUs, but could test a whole range by "just swapping out the MCU card".
His counter-point is:
There is no point in that, because it ignores the dependency on peripherals. We would still need all the peripherals, and at that point we might as well use a specific MCU in a specific package. A development board with a "replaceable MCU" doesn't solve anything. That Mikroe stuff only works because they sell their own product line alongside it; for our proprietary products it's as good as garbage.
I do want to point out that I'm paraphrasing both his points and mine, because I can't reproduce and translate a verbal discussion word for word from memory.
His counter-point confuses me because he keeps repeating the "dependency on peripherals" part, as in: "Any benefit that only concerns the MCU is pointless; you always have to consider the full product."
At this point I should mention that we're in an industry where products have a minimum lifetime of 10 years, if not more. Our product line is "new" but already 10 years old.
I don't quite understand why it wouldn't be beneficial to build or use a proprietary platform. IMO it would potentially let us test earlier, or at least let us switch MCUs without a complete redesign of the PCB. In my opinion it would remove the dependency of having to produce a fully populated circuit board before you can begin any kind of testing of whether the firmware works on it.
But then again, I am young, naive, and inexperienced, yet in that regard also willing to be shown otherwise. It's just that my colleague isn't the type to give detailed explanations; instead he shuts the argument down with "It's just how it is done; it doesn't work your way."
EDIT:
When developing a product that is meant to stay in the field for a minimum of 10 years, the dependency on the lifetime of each part is crucial. With our product, we are well aware of the costs and implications of having to swap out discontinued parts.
With that in mind, I thought it would improve our product development process and our internal costs if we made crucial parts swappable for our development team, so that we don't have to produce a fully populated circuit board with MCU and peripherals before we can start developing firmware or do any testing.
My question is whether it is actually beneficial to develop two boards, one for development and one for production, or whether that process produces no benefit because it is "pointless", as my colleague put it. More specifically, I am asking WHY it is or is not beneficial, because my colleague would not explain it to me in detail.