
New Atari Console that Ataribox?


Goochman


Yes, no headphone jack, no USB-A port (what if you want to use a Square reader with the new iPad, or charge and listen to headphones simultaneously?), because adding the ports in would break the contour lines of the device. That's form over function if there ever was such a thing.

Are you talking about last year's iPhone 7? In the unlikely event you need to charge, use wired headphones, AND run credit cards at the same time, you pull one thing out of the lightning port and put the other thing in.

 

Just like if you "needed" to play Frogger AND Super Cobra AND Word Zapper on your Atari VCS at the same time, but only have the one cartridge port. Adapt or die.

 

Square can use the audio jack adapter, too. https://squareup.com/townsquare/go-wireless-with-the-new-square-reader-and-your-iphone-7


Are you talking about last year's iPhone 7? In the unlikely event you need to charge, use wired headphones, AND run credit cards at the same time, you pull one thing out of the lightning port and put the other thing in.

No, no, no. It needs to do all connectors at once. You can't have USB, audio, and power at the same time. It is a flawed design. And their laptops are designed the same way. What good is a PC that can multitask but can't connect to multiple devices simultaneously?

 

This from the company that first removed the floppy, then the CD-ROM, then the headphone jack, then USB, and finally stripped all I/O down to a single port. And then sells you a dozen adapters. Want HDMI? Buy an adapter. Want wired headphones? Buy an adapter. Ethernet? Buy an adapter. USB? Buy an adapter. Wall charging? Disconnect everything else. Want all of those simultaneously? You're out of luck.

 

Meanwhile my old Windoze laptop had four USB-A ports (one of them 3.0), HDMI, VGA, Ethernet, a dedicated barrel jack for power, a DVD-RW bay, headphone and microphone jacks, and if I wanted to stick a peripheral into every port on the system and use them all simultaneously, I could, without spending a dime on accessory adapters or dongles. Apple's minimalist approach to design and proprietary ports will be their downfall. Designers who use their iMacs for work are already beyond frustrated by the endless daisy chain of expensive dongles necessary to get work done.

 

Still not as bad as Microsoft's "I'll restart your PC and lose all current work whether you want it to or not" approach to Windows 10 and the endless update buffet. :roll:


 

That's right.

 

I'm not sure exactly when the 1 programmer = 1 game model went away. Maybe it was sometime around when franchises like Mario or Sonic were popular.

 

Maybe it was the 16-bit era? Anything more and the complexity of the machines demanded separate teams for sound, graphics, art, gameplay logic, and god knows what else.

 

---

 

On the other hand, with the 32- and 64-bit machines and the PC, one could say a lot of avenues opened up. There was enough "machine" to support work in many disciplines. The programmer turned into a director who orchestrated art and logic, and soon became known as a developer. Eventually the term "developer" expanded yet again to mean a development house consisting of many kinds of talent.

 

Probably when Imagic started paying artists to draw on grid paper so that the graphics would look better than when the programmers were just coloring in the grid paper with solid blocks?


I suppose there's nothing wrong with big teams working on games; I just wish they were run democratically. Frequently game direction is a dictatorship by marketing or corporate types. Big business got more control over video game production and has turned it into a bland landscape.


Any team, when it reaches a certain size, needs intermediaries to make everything work. When that happens I think it starts to lose focus on the purpose of the end product - the mechanics of how something is done start to use up resources, and if the focus isn't kept on the tasks associated with the ultimate purpose, then with something as delicate as game balance that can have a negative impact.

 

Having a graphics specialist and a sound specialist is good. A good level designer can be beneficial too - so add in a developer or two to glue it all together and you probably have an optimal team size. I'm not sure it should be entirely democratic though. Building a game by committee results in too many compromises and an unfocussed product no-one actually believes in. Everyone needs to have input, but you need someone with an overall vision who makes the call - as long as everyone agrees who that is, there shouldn't be any problems.

 

Putting a deadline in can be problematic too, as the creative process can be iterative - and forcing a deadline can compromise that. Of course you need some sort of target date, otherwise nothing ever gets finished and out the door. Even when Atari were at their wackiest, if someone hadn't got something promising within six months or so, there was a problem.

 

I don't think Nintendo have large teams, for example; they certainly aren't a very large company, which is the main reason they are so good at making a profit. Their overheads just aren't the same as some of the larger corporations'. Apparently Mario 64 was actually the result of scrapping the first version of the game. It had been changed so much during the creative process that it was architecturally a mess. But importantly, they had got to the point where they knew exactly what the game should be like - so they were able to build the game again from scratch knowing that. Only someone with the gravitas of Miyamoto can get away with that, though. Management too often step in and force something out the door. 2600 Pac-Man is an example of that.

 

Now the big problem with 2600 Pac-Man is that it sold a bucketload - so management could justify their decision because of the short-term gain (though I don't know whether it was ultimately profitable, given the large licence fee they paid) - but when consumers start to get stung with expensive low-quality products, then longer term you are on a losing strategy. Pac-Man in the UK was a £30 game - that's £90 adjusted for inflation.
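The inflation figure above can be sanity-checked with a quick compound-interest calculation. A minimal Python sketch, assuming a flat average UK inflation rate of about 3.2% per year over the roughly 35 years between the 2600 Pac-Man era and this thread - both the rate and the span are illustrative assumptions, not sourced figures:

```python
def adjust_for_inflation(amount, years, annual_rate=0.032):
    """Compound a historical price forward by a flat annual inflation rate."""
    return amount * (1 + annual_rate) ** years

# £30 in ~1982, carried forward ~35 years at ~3.2% per year:
print(round(adjust_for_inflation(30, 35)))  # -> 90
```

So the "£30 then is £90 now" claim corresponds to prices roughly tripling, which a ~3.2% average rate over 35 years produces.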

Edited by davyK

That's right.

 

I'm not sure exactly when the 1 programmer = 1 game model went away. Maybe it was sometime around when franchises like Mario or Sonic were popular.

 

Maybe it was the 16-bit era? Anything more and the complexity of the machines demanded separate teams for sound, graphics, art, gameplay logic, and god knows what else.

When it became unsustainable. Even in the 8-bit computer era, it wasn't uncommon to see games developed by two, three or more people. But in the 16-bit era you could still find games developed by a single person. So it didn't change all at once.


And all the Apple Fanbois clamor over the castle walls to get it.

With respect to that, Apple has essentially two types of hardware customers:

 

- People who camp out in front of the Apple store for a week so that they can be the first to have the latest iPhone / iPad / iWhatever as soon as it goes on sale

 

- People who use Apple's computers in a professional capacity

 

I'm in the latter camp, and it's a much smaller one than the crack-addled mess that is the consumer electronics side of things.

 

Apple was very clever in reinventing themselves as something other than a computer company. But that's the problem: the traditional Apple customer, who needs a laptop or desktop machine, is a much smaller part of their business plan than the person buying music on iTunes - or the iPhone addict. That customer is almost a footnote in their business plan these days, close to being a token nod to the fact that they were once a company that just made computers.

 

As this relates to the Ataribox: from what we've seen so far, this is a device that appears to be being pitched largely on nostalgia to a type of buyer who will clamour to be amongst the first to own it. The problem is that while it's generally known in advance what an iPhone will do (because we've had a decade to become familiar with it and the smartphone market is essentially stagnant in terms of actual innovation right now), the lack of any real information surrounding the Ataribox (or the plans for it of the company behind it) makes it all hype and no substance at this time.

 

The thing about selling into this market is that, historically speaking, you'll see an initial sales rush, then... Crickets, or pretty close, once everyone who wanted one has one. My prediction is that if it does actually make it to market it'll follow the pattern established by other devices of this type: Christmas-ish release, strong sales, sudden tapering-off of sales, small trickle of sales over the remainder of its lifetime, EOL.

 

Granted, the same can be said for almost any consumer electronics device - and my prediction may even be wrong. But the only way I see that prediction being wrong is if there is a killer software library offering something that you can't get on the Big Three platforms. Software is what sustains their hardware over the long term, and Atari will need to follow a similar pattern not to compete with them (which would be an unrealistic expectation at this time), but just to carve out its own niche. However, my faith in Atari's ability to deliver that software library is severely diminished when I see them going begging on Kickstarter just to get the hardware platform finished.

 

Coming back to Apple for a moment: there's nothing wrong with using commodity hardware. I'm typing this on an x64 Macbook Pro; sitting on my desk are a couple of Raspberry Pis and other devices all assembled from off-the-shelf components from other manufacturers. All of these devices have been specifically designed by their manufacturers to make the most appropriate use of those components for the intended application of the device - opening up the laptop won't reveal, say, an MSI motherboard that can be bought at the local Fry's Electronics - but even that MSI motherboard will itself contain commodity components, because reinventing the wheel when it comes to USB controllers and such other devices is stupid when they can be bought cheaply and in quantity.

 

Where this relates to the Ataribox is that it doesn't matter if it's built from commodity components. Realistically, for the vast majority of the end-user gaming world (and consumer electronics in general), the hardware in the device is totally irrelevant: it's mostly just fodder for endless ill-informed YouTube videos talking about why Console A is better than Console B because Console A has more Gigabeeyotches than Console B. The end user typically doesn't have a clue as to how any of this actually works, or understand that just because it's x64 on the inside comparisons with the Packard-Bell sitting on their desk are totally irrelevant because the hardware designs are fundamentally different due to the intended applications of the devices.

 

Those commodity components are going to make the Ataribox (assuming it makes it to market) more affordable and, possibly, more accessible to developers. These are things that it needs in order to gain market share; sustaining that market share is going to be the bigger concern over the long term. What hardware platform they choose to do this on is really not important, with the caveat that using something esoteric makes it harder to attract developer talent. x64 has a lot to recommend itself in this regard, and theoretically makes it possible for someone to write software for the device without having to resort to traditionally-expensive hardware devkits (though much of that will also depend on the underlying OS platform).

 

This appears to have turned into more of a rant than I expected it to, especially for a device that doesn't yet exist. For that, I'll give Atari full credit for at least getting people talking about it, even if there's no clear direction for the Ataribox yet. And as much as I'd like to see it succeed, looking at it realistically it currently appears to fall into the category of a nice idea with piss-poor execution from the start. Then again, who knows - maybe the hype will bring substance. But I'm not going to hold my breath on that happening; at this time there are too many up-front historical parallels with other devices that failed to make it to market to give it much more notice than to say, "that's nice, call me when they're on the shelves."


I suppose there's nothing wrong with big teams working on games; I just wish they were run democratically. Frequently game direction is a dictatorship by marketing or corporate types. Big business got more control over video game production and has turned it into a bland landscape.

That's what the indie scene is for.

 

Big AAA games can easily have budgets in the tens of millions of dollars. With that much money on the line, the investors/publishers are averse to taking big risks.


Yes, no headphone jack, no USB-A port (what if you want to use a Square reader with the new iPad, or charge and listen to headphones simultaneously?), because adding the ports in would break the contour lines of the device. That's form over function if there ever was such a thing.

 

I have no real interest in defending Apple. The only current Apple products I have are an iPhone, Apple Watch, and iPad Pro. While I agree that dropping the headphone jack is not a particularly consumer friendly move, there are some advantages to not having one, and certainly some high profile competition has followed suit. Certainly I'd prefer the headphone jack, but as a technology enthusiast (i.e., not an average consumer), I can't lie and say I don't like when technology sometimes ruthlessly marches on like that, even with the downsides.

 

As a recent example, I just replaced my perfectly good Plasma TV and sound bar setup in our family room with a completely unnecessary much larger 4K TV and surround sound system. Now I just want to watch native 4K HDR content and plain old HD content kind of disappoints me similar to how SD content disappointed me when I first transitioned to an HD display. It's a very different mentality some of us have, and I fully acknowledge that.


I suppose there's nothing wrong with big teams working on games; I just wish they were run democratically. Frequently game direction is a dictatorship by marketing or corporate types. Big business got more control over video game production and has turned it into a bland landscape.

 

Possibly. The potential problem with a more democratic approach, obviously, is that the more democratic the process, the harder it is to make decisions, and the more "averaged out" the end result potentially becomes, since you're doing it by consensus. There's something to be said for a single person overseeing a project, keeping it on track, and holding the primary vision. It's of course also up to that person to allow the rest of the team to be creative and provide input.

 

I also really don't buy the premise that videogames have turned into a bland landscape. We can't discount how massive indie gaming is - almost certainly the biggest it's ever been, and growing - and even with the oft-maligned AAA titles, there are plenty of examples of innovative thinking. Everyone likes to pick on the FPS games and sequels that always litter the top of the charts, but the reality is that games along those lines (relative to their era) have always been the mainstays. The difference between now and then, though, is that the potential audience is no longer niche, so of course you're going to have things targeted to appeal to the most people, much like blockbuster movies. And just like with blockbuster movies, just because something is designed that way doesn't mean it can't be a blast, and ultimately just as worthwhile as something generally deemed "artistic."


I just saw the new pictures. It looks VERY attractive. However, I really hope that this console will be something more than a nostalgia machine. I want to see a new Tempest. I want to see a new Centipede! I imagine it will have a store to buy games (like PS & Xbox), with a category for each Atari system, while also releasing new games on it.

 

I think that would be cool.

Edited by Atari PAC-MAN Fan

 

I have no real interest in defending Apple. The only current Apple products I have are an iPhone, Apple Watch, and iPad Pro. While I agree that dropping the headphone jack is not a particularly consumer friendly move, there are some advantages to not having one, and certainly some high profile competition has followed suit. Certainly I'd prefer the headphone jack, but as a technology enthusiast (i.e., not an average consumer), I can't lie and say I don't like when technology sometimes ruthlessly marches on like that, even with the downsides.

 

I'd tend to think the loss of the headset jack has nothing to do with technology marching on. More of an exertion of "form over function" just because.

 

What is the advantage of not having a headphone jack? And if there is one, does it outweigh having one?


 

I'd tend to think the loss of the headset jack has nothing to do with technology marching on. More of an exertion of "form over function" just because.

 

What is the advantage of not having a headphone jack? And if there is one, does it outweigh having one?

 

Waterproofing, better wired sound and other interfacing, etc. Again, I'd much rather they kept it, but can't argue with obsoleting certain technologies when I'm so ready to do it myself.


 

Waterproofing, better wired sound and other interfacing, etc. Again, I'd much rather they kept it, but can't argue with obsoleting certain technologies when I'm so ready to do it myself.

 

Samsung disproved the Waterproofing one. Granted, I don't see the feature of being able to draw on your screen underwater as all that useful, but it worked, even with the stylus hole and headphone jack. Then again, the Note 7 probably worked better under water, in case it caught fire.

 

To be blunt, Apple killed off the headphone jack so they could sell their very own headphones, plus they can now insert DRM into the audio output.


To be blunt, Apple killed off the headphone jack so they could sell their very own headphones, plus they can now insert DRM into the audio output.

 

...and people love this sort of thing. No doubt the latest headphone-jack-less model is outselling all others that came before.


Ha, I love that audiophiles flipped their shit at the "just buy Bluetooth ones" line. I think it was John Oliver who said "And Apple invents a new way to lose your headphones" when they came out with the little pod ones.

 

I really have no love for Apple, but I do find it funny that a guy I worked with was a huge Apple fan and kept saying that VR was just a niche thing and would fail like 3D TVs. Yet now Apple is working on some VR stuff.


I always dreamed of having that video game system advertised in West German catalogues even into the early '90s. Now I have one - and have had it for years, with a Harmony cartridge. So what are they trying to convince me of? A barely-working one of the dozens of 2600 emulators out there? Are you serious? I have the REAL Atari 2600, I have a REAL phosphor CRT TV (yes, even your IPS/LCD display - even with "advanced display emulation" - sucks compared to a real old-school CRT), and I have the Harmony. What are they doing here? The world needs to get rid of the "-philes", whether they're audiophiles or console-philes. The ones I see as the biggest danger are the Sony PVM-philes - everything is "absolutely gorgeous" if you read their posts, as if we'd all been waiting for them to come and save the 15 kHz world for us...


 

Probably when Imagic started paying artists to draw on grid paper so that the graphics would look better than when the programmers were just coloring in the grid paper with solid blocks?

Ditto. Not all programmers are graphic designers or composers.

 

Did David Crane also compose the soundtrack for Pitfall II or did he have help?


As a Mac user who switched because I simply got tired of all the crap I had to do to keep my Windows PC "safe" and "clean" - it made the machine feel like a 33 MHz PC running Win95 - I am in X=usr's latter camp as well. I use it professionally and need something that "just works". My old MacBook Pro lasted six years and still works great, but I decided to get a nice new quad-core i7. I opted for last year's model with the "normal" ports - it even has HDMI - so no more dongles, unless I need Ethernet, but I don't.

 

I think many folks here also forget that cell phones used to come without a headphone jack; they all came with a crappy headset with a proprietary connector that required the removal of the power cord to use... Still have a few of those shitty things tied in knots in a drawer somewhere...

Hopefully the Atari Box will be cool. Hopefully it is real.


I think many folks here also forget that cell phones used to come without a headphone jack; they all came with a crappy headset with a proprietary connector that required the removal of the power cord to use... Still have a few of those shitty things tied in knots in a drawer somewhere...

 

And those phones didn't come with MP3 players and iTunes and playlists either, so there was no NEED for a conventional headphone jack.

