
Rob's X16 Ecosystem Suggestions

Posted: Fri Aug 20, 2021 3:09 pm
by rje

I've thought about this and struggled with multiple priorities, but I found others' suggestions to be very helpful.  I feel clearer about the situation.  So here are my suggestions, submitted to y'all for constructive criticism.

 


$00. Make the REFERENCE version ** COTS **, and produce kits on a slow, deliberate schedule.

  • The core team is not allowed to burn out.

$01. Do what's right for the ECOSYSTEM.

$02. Make the PRODUCT as cheap and accessible as possible.  [Tramiel]

  • Tramiel was right with the C64.

$03. Release an "X8 card" for Commodore hardware.

  • This gives back to the Commodore community and promotes X16 concepts.


Rob's X16 Ecosystem Suggestions

Posted: Fri Aug 20, 2021 7:03 pm
by Wavicle

If I were to feed suggestions back to the dev team based on my professional experience shipping products, the changes I'd make would focus on:


  1. Reduce speed


  2. Reduce parts / cost / complexity


    1. But maybe increase parts / cost / complexity a little where it makes sense




  3. Make your first prototypes DFD (design for debug)


I understand the wish for a "dream computer", but unless one has experience shipping computers in this space with these parts, scaling the dream back can help bring things down to reality. The items under #2 are architecture-breaking changes for the current design, so maybe they are nothing more than a footnote -- it can be difficult to know when to abandon a sunk cost versus abandoning the whole project.

Some specific changes I would have made:

Reduce speed


  • The timing margins for 8MHz seem to be driving some decisions, but those margins are just too tight for a V1 design. Call it a day, abandon 8MHz and revisit any decisions or ideas that were made/changed/abandoned so that 8MHz was a possibility. Instead of considering 2/4/8 MHz as possibilities, consider 1/2/4 (or just 2/4) instead.


  • Same with other high performance items that seem to be hindering more than helping (e.g. SD card hardware on VERA). A slower implementation that can ship now is probably better than a faster one that can't.


Reduce parts / cost


  • I love the idea of 2MB of RAM on an 8 bit machine, but supporting that is driving up cost and complexity. Could 5 SRAM chips (1x 512kbit, 4x 4Mbit) be reduced to just 1 (1x 4Mbit)? Even if this requires losing a big chunk of the SRAM space because it overlaps with ROM, each SRAM costs $5-$6 in parts and takes ~2 square inches of board space, so dropping four of them saves roughly $20-$24 and ~8 square inches.


  • If not using a PAL/GAL like the C64 did for address decoding, simplify the address space. Nobody likes the idea of giving up 4K of low RAM for IO space, but doing so makes address decoding much simpler and reduces the number of discrete logic ICs that are needed (there's a rough sketch of the idea after this list). The dream was to make a board that a casual enthusiast / younger person could understand, so double down on that.


  • Remove the RTC but maybe reserve IO space for it. Let it be an add-on card. Spec the interface then put the implementation off until the future or let 3rd parties make an RTC board.


  • Reconsider the microcontroller that manages PWR_OK / PS_ON# sequencing. Is this something that could be done with an SR latch and edge detector? Is a PSU fault a sufficiently high risk that anything other than switch debouncing is needed? The temptation when adding a shiny new component is that once it is there, you'll creep features into it. If I recall, that microcontroller additionally monitors all buttons; asserts NMI; talks to the RTC; and manages front panel LEDs. It also sounds like it is one of the reasons a V4 prototype board will be needed.


  • Don't replace VERA's FPGA, but maybe pull the SD card out if it isn't working. As with the microcontroller, an FPGA is a shiny component that invites feature creep. Some deadlock issue is impacting the SD controller on VERA and a bigger FPGA is being looked at, but that FPGA has supply chain issues. Based on what has been said, I'm guessing it is probably an iCE40HX, which is difficult to source unless you can wait until 2022; VERA is a working graphics card and Mouser has 577 iCE40UP5K QFN48 parts in stock right now. Use what works and can be built now.


  • Possibly move ROM/RAM bank select back to the VIA. The performance benefit of having it mapped to $00/$01 is awesome, but it also looks like a heap of extra complexity, parts, and board space (see the second sketch after this list for why the mapping is tempting). If there were a bank select header shared with the VIA lines, then fast bank switching could move to a future expansion daughterboard that adds back the most-wanted features that were cut.


  • Remove extra stuff from ROM - maybe. If anything not required for shipping the product (e.g. GEOS) is distracting someone who would otherwise be working on issues that are ship-blockers, cut it. If not, leave it in but have a clear path to tear it out if it is ever holding the product up.


  • Reduce audio to one component. If the sound capabilities of VERA are too attractive to give up, ditch the YM2151. If the YM2151 stays (and VERA's audio goes) instead, maybe the FPGA LEs freed up on the existing part could be used to work around the SD deadlock issue.


  • Remove the S-Video and composite out from VERA. The still images of VERA that are available aren't high enough resolution for me to tell which video encoder it is using, but there aren't many parallel-to-composite encoders around, and the cheapest are $9-$15, which is 2-3x the cost of the FPGA at the heart of VERA. I am certain that if composite and S-Video are there, some people will use them, but I suspect that more than 90% of customers will use VGA 100% of the time. Maybe add an expansion header so the functionality can be added later. This could also allow a 3rd party to make an HDMI add-on board, and that 3rd party can figure out licensing.
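
To make the address-decoding point a few bullets up concrete, here is a rough C model of the simplification (a sketch only -- the addresses and window sizes are illustrative assumptions, not the actual X16 memory map). A 4K-aligned IO window means chip select only has to compare the top four address bits, roughly one gate's worth of logic; a narrower window carved out of RAM has to compare twice as many bits and needs correspondingly more glue.

#include <stdbool.h>
#include <stdint.h>

/* Hypothetical 4K IO window at $9000-$9FFF: select it by comparing only
 * A15..A12, i.e. a single 4-input gate (plus a couple of inverters). */
static bool io_selected_4k(uint16_t addr)
{
    return (addr & 0xF000u) == 0x9000u;
}

/* Hypothetical 256-byte window at $9F00-$9FFF: keeps more RAM available,
 * but the decode now has to look at A15..A8 -- twice the address bits,
 * and more discrete logic ICs on the board. */
static bool io_selected_256(uint16_t addr)
{
    return (addr & 0xFF00u) == 0x9F00u;
}

Either comparison is trivial in C; the point is how much board-level logic each one implies.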
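
And on the bank-select bullet: a minimal sketch of why the $00/$01 mapping is tempting on the software side. The VIA address below is a made-up placeholder, and the cycle counts in the comments are the standard 65(C)02 figures.

#include <stdint.h>

/* Sketch only -- the VIA address is a placeholder, not the real register map. */
#define RAM_BANK_ZP (*(volatile uint8_t *)0x0000u)  /* zero-page latch: STA zp is 2 bytes / 3 cycles  */
#define VIA_PORT    (*(volatile uint8_t *)0x9F01u)  /* hypothetical VIA port: STA abs is 3 bytes / 4 cycles */

static inline void set_bank_zeropage(uint8_t bank)
{
    RAM_BANK_ZP = bank;        /* one cheap store on every bank switch */
}

static inline void set_bank_via(uint8_t bank)
{
    VIA_PORT = bank;           /* and if the bank shares the port with other signals,
                                  this becomes a slower read-modify-write sequence */
}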


But maybe increase parts / cost / complexity a little where it makes sense


  • It isn't a computer anyone will buy for nostalgia without a keyboard, and it sounds like bit-banging the PS/2 port isn't working reliably. This is a place where a microcontroller might be called for, but only for IO-related tasks (a sketch of that job follows below). The temptation might be to move I2C functionality to that microcontroller, and maybe that makes sense - but if the RTC is removed, the temptation will be to add it back in using that IO microcontroller. That's the feature creep to be vigilant about. Add an I2C header that an RTC daughterboard could be attached to if necessary and solve that problem later.
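
To make the keyboard point concrete, here is a minimal sketch of the job that IO microcontroller would be doing: shifting in the 11-bit PS/2 frame the keyboard clocks out (start bit, 8 data bits LSB-first, odd parity, stop bit) and handing the finished byte to the 65C02 at its leisure. The pin helpers are hypothetical stand-ins for whatever the chosen part actually provides, and a real routine would add timeouts on the busy-waits.

#include <stdbool.h>
#include <stdint.h>

extern bool read_ps2_clock(void);   /* hypothetical: level of the PS/2 clock pin */
extern bool read_ps2_data(void);    /* hypothetical: level of the PS/2 data pin  */

/* Block until one scancode arrives; return -1 on a framing/parity error. */
int ps2_receive_byte(void)
{
    uint8_t data = 0;
    uint8_t ones = 0;

    for (int bit = 0; bit < 11; bit++) {
        while (read_ps2_clock())  { }          /* wait for the falling edge       */
        bool level = read_ps2_data();          /* device data is valid here       */
        while (!read_ps2_clock()) { }          /* wait for the clock to go high   */

        if (bit == 0) {                        /* start bit must be 0             */
            if (level) return -1;
        } else if (bit <= 8) {                 /* 8 data bits, LSB first          */
            if (level) { data |= (uint8_t)(1u << (bit - 1)); ones++; }
        } else if (bit == 9) {                 /* odd parity over the data bits   */
            if (((ones + (level ? 1u : 0u)) & 1u) == 0) return -1;
        } else {                               /* stop bit must be 1              */
            if (!level) return -1;
        }
    }
    return data;
}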


Make your first prototypes DFD


  • I can't share images of what prototype boards at work look like, but in general they are much larger than anything we would ship and are designed from the ground up for probing and patching. In many cases, whole feature blocks of functionality will be on a separate board so that, as issues are found and fixed, only that one board needs to be remanufactured; or, if a bad rework damages something, the whole board isn't written off. It's a higher upfront cost that always seems to pay for itself. The easier a signal is to probe, the more likely you are to put a probe on it "just in case". (To be fair, in my work environment the single largest expense is "engineer time", so the calculus in the X16's case may be dramatically different.)



Rob's X16 Ecosystem Suggestions

Posted: Fri Aug 20, 2021 11:50 pm
by Kalvan

My design for my own planned dream retro computer, intended as (rough) competition to the Commander X16, would have used an Intel 8048/51 for the keyboard interface.  It's both a period solution (used at least as far back as the original IBM PC 5150) and one with plenty of new old stock and even, allegedly (according to Wikipedia), new production.

Of course, maybe the dev team has already considered and eliminated this possibility...


Rob's X16 Ecosystem Suggestions

Posted: Sat Aug 21, 2021 6:09 am
by BruceMcF


11 hours ago, Wavicle said:




If I were to feed suggestions back to the dev team based on my professional experience shipping products, the changes I'd make would focus on: ...



Reduce speed




  • The timing margins for 8MHz seem to be driving some decisions, but those margins are just too tight for a V1 design. Call it a day, abandon 8MHz and revisit any decisions or ideas that were made/changed/abandoned so that 8MHz was a possibility. Instead of considering 2/4/8 MHz as possibilities, consider 1/2/4 (or just 2/4) instead.




Perhaps the 8MHz was problematic in some respects, but just because we can't see how they cracked those problems doesn't mean they haven't. I think they've already cracked the fundamental problems and are now on software issues, like the PS/2 keyboard timeout expiring too soon when the timeout loop runs at 8MHz (see the sketch below).
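
To illustrate that kind of software fix (a hypothetical sketch, not the actual KERNAL code): a timeout written as a fixed number of trips around a polling loop is really a wall-clock time that depends on the CPU clock, so a count that comfortably outlasts a PS/2 keyboard at 2MHz expires four times sooner at 8MHz. Deriving the count from the clock is a pure software change. The helper and constants below are assumptions for illustration.

#include <stdbool.h>
#include <stdint.h>

extern bool ps2_clock_low(void);                 /* hypothetical pin poll          */

#define CYCLES_PER_ITERATION 20u                 /* assumed cost of one poll pass  */
#define TIMEOUT_US           100u                /* how long we intend to wait     */

/* Derive the loop count from the CPU clock instead of hard-coding it. */
static uint32_t timeout_iterations(uint32_t cpu_hz)
{
    return (uint32_t)(((uint64_t)cpu_hz * TIMEOUT_US) / (1000000u * CYCLES_PER_ITERATION));
}

/* Wait for the keyboard to pull the clock line low, or give up. */
bool wait_for_ps2_start(uint32_t cpu_hz)
{
    for (uint32_t i = timeout_iterations(cpu_hz); i > 0; i--) {
        if (ps2_clock_low())
            return true;
    }
    return false;                                /* timed out                      */
}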

And if the ONLY problem with 8MHz is the PS/2 port, then just run the PS/2 port the way it was run on the original PS/2 machines in the late 80s: with an 8-bit microcontroller. The one they've already added only needs to be the version with more lines to take over that responsibility.

And fixing what ain't broke is not the way to bring the system to market soonest.


6 hours ago, Kalvan said:




My design for my planned dream retro computer to produce as (rough) competition to the Commander X16 would have used an Intel 8048/51 for the keyboard interface.  It's both a period solution (used at least as far back as the original IBM PC 5150) , and one with plenty of new old stock and even allegedly (according to Wikipedia) new production.



Of course, maybe the dev team has already considered and eliminated this possibility...



They already have an 8bit microcontroller available as new stock in quantity to handle power up ... the hardware designer has already set it up to use the version of that microcontroller with more lines to handle the PS/2 interface if the VIA approach doesn't work out. So I think that is already plan B, except not with the 8048/51.