Thursday, August 11, 2011

As you can probably guess from the title, this one's going to be about Apple, and why I don't get it.  What is "it", by the way?  For me, "it" is Apple's somehow miraculous climb to pop-culture phenom, a climb that, at least to me, makes no sense.

Let me be fair here: I'm not biased against Apple because they are "popular".  I'm not one of those people who hates something because everyone else loves it.  I simply can't see anything special about Apple's offerings.

Let me air out my laundry list of complaints.

  1. Proprietary boot chip - You can't install OS X on a non-Apple product without a ton of work, hence the name for such a beast: the "Hackintosh".
  2. Numerous lawsuits by Apple over technology they stole and then patented - Yes, this is a very true statement; multi-touch comes to mind quite readily.
  3. Rigid control over 3rd-party development and compatibility.
  4. The extreme lengths Apple will go to in order to maintain appearances that aren't true.  MacDefender, anyone?  Read up on it: Apple instructed their tech support to deny its existence.
  5. Sub-par hardware configurations sold at premium prices.
Now to explain these points, and why I feel they make Apple's computing systems not worth the money.


1. Proprietary Boot Chip.  What's the deal?  They keep the boot record not as a section of the hard drive, but on a separate chip.  This limits me to Apple hardware unless I want to build a "Hackintosh", which, while doable, is a total pain to set up by comparison.  I also like the option of not having to rely on one manufacturer's hardware: I can do my own custom build to suit my needs, not what HP or Dell decides is "best for me".


2.  Numerous Lawsuits over Technology Apple Stole and Then Patented.  What are the big "perks" for Apple owners?  The App Store?  Apple didn't have it first; Linux users had software repositories well before Apple.  The Dock?  Nope, borrowed from Arthur OS.  Ah, multi-touch for the iPhone!  Wait... Apple patented that in 2009, yet with the power of the internet we can find it being demonstrated at a TED conference in 2006 (Jeff Han's famous demo), and Microsoft showed off its "Milan" computer, which supported multi-touch, in 2007 and admitted it had been five years in the works.  It doesn't look to me like Apple created this technology, yet they claimed patents on it and have sued Motorola over it.  But you don't hear nearly as much about the suits filed against Apple over patents.


3. Rigid Control over 3rd-Party Development and Compatibility.  Yes, I can hear it now: "But that helps prevent viruses from infecting Apple products!"  Guess what, skippy: no, it doesn't.  The only thing that can prevent virus intrusions is the end user.  If you get a virus on your computer, it is 100% your fault for not having a good anti-virus program and not practicing safe browsing habits.  The truth is, the only thing keeping Apple products from being plagued by malicious software is market share.  Virus writers want to do the most damage with the least amount of code; that's the entire philosophy.  So why target around 10% of the market when Windows has around 88%?  Truth be told, Mac OS X is one of the least secure operating systems on the market, coming in last place in several security competitions.  In the last one I read up on (the Pwn2Own contest), a remote "hacker" took control of an Apple MacBook in under 3 minutes, using the same method he'd used to win the competition the year before.  Apple hadn't patched the hole even though they knew about it.


4. Apple's Desperate Desire to Save Face.  MacDefender put them in the spotlight: it showed that OS X wasn't as virus-proof as they'd tried to convince everyone.  What does Apple do?  Instruct their tech support not to assist customers with the issue, and to deny it existed.  Seriously, is that what you expect for paying the prices Apple wants for their hardware?


5.  Sub-par Configurations.  OK, you've got two choices in laptops laid out in front of you.  Both have equal RAM and processor speeds, the same video technology, the same size screen, etc.  One is a MacBook; the other is a laptop from Asus.  The Asus comes with a Blu-ray player that doubles as a DVD/CD reader/writer, a larger hard drive, and Windows.  The MacBook has OS X, Thunderbolt, and a $1000 higher price tag.  Seriously, $2500 for a laptop that doesn't even play Blu-ray discs?  Oh, but it's got Thunderbolt, a technology no one else has really adopted yet, which makes it practically useless for the end user.  Where does Apple get off charging these prices for the little Apple logo on the box?  Their equipment configurations are sub-par at best, and modifying their equipment is almost impossible, so what gives?  And how come people will continually pay twice the price for half the product?


I'm sorry, but for me Apple's only good product is the iPod, and not the fancy touch-screen ones; I'll stick to my iPod Nano, thank you very much.  For raw computing I'll stick with Linux and Windows: Windows for my gaming and Linux for just about everything else.  I get the best of both worlds this way.  Linux provides me with a more secure operating system that's rarely the target of the script kiddies pumping out viruses (and yes, I have an anti-virus program even for it), and Windows lets me enjoy all my favorite games on their native platform.

Wednesday, August 10, 2011

I've said this before, but where's the new in new games?  It seems Hollywood isn't the only group running out of ideas; we are seeing a constant rehashing of the same old same old in the gaming industry.  This isn't a new trend, and series have been around for ages: Final Fantasy, Mortal Kombat, Street Fighter, etc.  But there's another side to it.  EQ, EQII, WoW, LotRO, Rift, and countless other games consistently rehash the same old level-and-grind game model.  When are we going to see something new in terms of gameplay?  Is it that the technology isn't available, or are companies trying to milk what's been a popular formula for everything it's worth?  I don't have the answer, but I do have the desire to see something new.  I want to see more dynamic gameplay, a situation where one person can affect the game world.  There are tons of ways this could be done within the existing style of games people are playing.  Have a chance that killing a raid boss triggers a counter-attack by his minions on a town; make contagious diseases happen on purpose instead of as bugs.  Heck, start the zombie apocalypse.

Speaking of the zombie apocalypse, that reminds me of something.  Many years ago EA introduced a new dungeon to Ultima Online called Khaldun.  One of the unique features of this dungeon was that if someone PKed another player, a revenant was spawned that could only die once the person who did the PKing was killed.  Why can't we see a game that does something like this in its PvP areas?

It's an interesting idea for games like LotRO, where the PvMP area has regular spawn in addition to the Monster Players and regular characters patrolling the zone.  Wipe all the regular spawn and replace it with revenants; it would definitely make things more interesting.  Even in UO, where the idea originated: wipe all spawn off the Felucca facet (the only PvP area in the game) and have it slowly be populated by revenants.  The higher the population of revenants, the better certain players obviously are at PvP.  They could even be coded to hold the same notoriety as the player they were based on: when a Blue (an innocent) dies, a Blue revenant spawns, and it would only attack players of the opposing notoriety.  Eventually this would make PvP more difficult for the "evil" people, but it could also work against the PK hunters, since evil-notoriety revenants could be spawned as well.  This is the type of thing that would make playing a game more interesting, if you ask me... what are your thoughts?
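To make the idea concrete, here's a minimal sketch of how the spawn rule could work.  To be clear, this is my own illustration, not UO's or LotRO's actual code; the Player and Revenant classes and the on_player_killed hook are all hypothetical names.

    # Minimal sketch of the revenant mechanic (hypothetical names throughout).
    BLUE, RED = "blue", "red"  # innocent vs. murderer notoriety

    class Player:
        def __init__(self, name, notoriety):
            self.name = name
            self.notoriety = notoriety
            self.alive = True

    class Revenant:
        def __init__(self, victim, killer):
            # The revenant inherits its victim's notoriety and is bound to
            # the killer: it can't be killed until the killer dies.
            self.notoriety = victim.notoriety
            self.bound_to = killer

        def is_vulnerable(self):
            return not self.bound_to.alive

        def is_hostile_to(self, player):
            # Hunt only players of the opposing notoriety.
            return player.notoriety != self.notoriety

    revenants = []

    def on_player_killed(victim, killer):
        # Combat hook: a PK of the opposing notoriety spawns a revenant.
        victim.alive = False
        if killer is not None and killer.notoriety != victim.notoriety:
            revenants.append(Revenant(victim, killer))

    # A red murderer PKs a blue innocent: a blue revenant spawns that hunts
    # reds and stays unkillable until the murderer himself dies.
    pk = Player("Dread Lord", RED)
    miner = Player("Innocent Miner", BLUE)
    on_player_killed(miner, pk)
    assert revenants[0].is_hostile_to(pk) and not revenants[0].is_vulnerable()

The nice property of binding the revenant's lifetime to the PKer is that every kill raises the cost of the next one, which is exactly the self-balancing pressure the Khaldun version had.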

Monday, August 8, 2011

Technology for the PC gamer: what really defines technology for the PC gamer?  Is it a cutting-edge graphics card, a fast multi-core CPU, tons of RAM?  What really makes up a "gaming rig"?  I don't have an answer.  The problem is that technology is moving faster than game developers can keep up.  This whole "faster is better" idea has caught hold of the tech industry, where at one point improvements came out as software pushed the limits of hardware.  Do you really need an overclocked hex-core CPU with 32 GB of RAM to play WoW?  When is enough, enough?  Or are we just waiting until the tech industry hits a wall where we simply haven't invented the technology to make the technology faster?

Moore's Law states:

The number of transistors that can be placed inexpensively on an integrated circuit will double approximately every two years.

This trend has held true since the 1960s and continues today.  While hex-core (6-core) processors only reached affordability in the past year, with AMD's offerings available for as little as $160 or so, they had been around quite a while; in fact, dodeca-core (12-core) processors were being reviewed back in 2008.  And if you really want to dig, you can find references to AsAP 2, a custom-built processor residing somewhere on the University of California, Davis campus, with a whopping 167 cores each running at 1.2 GHz, and that information was released in 2009.  So here in 2011, does that mean 334-core units are either being worked on or already in testing somewhere?
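That 334-core guess is just the doubling rule applied to AsAP 2's 2009 figure.  A quick back-of-the-envelope sketch (the function name is mine, obviously):

    # Moore's Law as arithmetic: the count doubles every ~2 years.
    def moores_law(count, start_year, target_year, doubling_period=2.0):
        periods = (target_year - start_year) / doubling_period
        return count * 2 ** periods

    # AsAP 2's 167 cores in 2009, projected two years forward:
    print(moores_law(167, 2009, 2011))  # 334.0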

While this is amazing to think about in and of itself, why do we as gamers need even hex-core chips?  With the vast spectrum of PCs on the market, it's difficult enough for game designers to decide whether to optimize their software for dual-core at all; try sitting down at a design meeting and deciding to make a game that can't run efficiently on anything less than a quad-core.  From what I've seen, developers usually try to make a game run well on as broad a spectrum of hardware as possible.

Graphics cards these days seem to be leading the way, though.  Deca-core GPUs aren't uncommon now, and their memory architecture allows for better data rates, though both are optimized for graphics only; we tend to see other system components catch up a few years later.  DDR3, which has become the new standard for system memory, used to be found only on graphics cards (in its graphics flavor, GDDR3).  How long before we're slapping DDR5 into slots on our motherboards and cranking up to 100+ GB of RAM for under $100?  I'm looking forward to it, to be quite honest.
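To put rough numbers on "better data rates": peak memory bandwidth is basically the effective transfer rate times the bus width.  The figures below are illustrative 2011-era numbers, not any particular product:

    # Peak bandwidth = transfers per second * bus width in bytes.
    def peak_bandwidth_gbs(megatransfers_per_sec, bus_width_bits):
        return megatransfers_per_sec * 1e6 * (bus_width_bits / 8) / 1e9

    print(peak_bandwidth_gbs(1600, 64))   # DDR3-1600 system DIMM: ~12.8 GB/s
    print(peak_bandwidth_gbs(4000, 256))  # GDDR5 on a 256-bit card: ~128 GB/s

The wide buses and higher clocks on graphics cards are most of why GPU memory runs years ahead of what's in our DIMM slots.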


While I don't think we as gamers necessarily need cutting-edge technology to indulge in our hobbies, it's not a bad thing either; one thing we can count on is that when the next big game hits, we'll be ready for it.  I also think game developers are doing the right thing by keeping system requirements fairly low as a general rule.  I know I'm using a rather low-end machine for my gaming (by today's standards), and yes, at times it shows during my gameplay.  But I can't quite justify wanting cutting-edge hardware either, especially since there is very little backwards compatibility between the components I have and the ones I'd want.  A real upgrade to get close to "cutting edge" wouldn't be extremely expensive if I cut a few corners and left out "optional" components like Blu-ray; as a matter of fact, I could build a more-than-capable PC for about half of what a comparable one costs from a custom PC builder.  But that doesn't mean I should.