Sunday, November 24, 2019

Tesla Cybertruck, load, and wind-resistance. A lesson in energy.

This week, Tesla unveiled the Cybertruck, its long-anticipated all-electric pickup truck.

Some are calling this an F-150 competitor, thinking this is a workhorse truck. However, I think this car is more of a bad-ass vanity truck, competitive with the F-150 Raptor, for guys and gals who want to feel like they're driving an awesome machine but don't actually do that much work with it. Why?

LOAD and WIND RESISTANCE.

Tesla claims a 500 mile range on the Cybertruck, but let me tell you two little stories.

I routinely make the 200-mile trip from San Francisco to Lake Tahoe, California -- in a Lincoln Navigator. We have a Tesla Model S, but we never take it to Tahoe, because the 75-mile uphill climb into the Sierra Nevada consumes over 150 miles of range, putting the trip at the uncomfortable limit even for the highest-capacity cars. Our friends with Teslas stop at the supercharger, every time.

Why? It turns out that Tesla's cars earn their range as much by lowering consumption as by increasing capacity, and there just isn't any way to make hauling a 5000 lb car uphill more efficient. Load matters.
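To put rough numbers on the load problem: just lifting the car up the grade takes a serious bite out of the battery. Here's a back-of-envelope sketch in JavaScript (the mass and elevation figures are my ballpark assumptions, not measured values):

```javascript
// Back-of-envelope: energy needed just to lift a car over the Sierra grade.
// All numbers are ballpark assumptions for illustration.
function climbEnergyKwh(massKg, elevationGainM) {
  var g = 9.81;                  // gravity, m/s^2
  var joules = massKg * g * elevationGainM;
  return joules / 3.6e6;        // 1 kWh = 3.6 million joules
}

var carKg = 2268;    // ~5000 lb
var climbM = 2200;   // roughly sea level to ~7200 ft at the summit
var kwh = climbEnergyKwh(carKg, climbM);
// ~13.6 kWh just for the climb, before rolling resistance, drag,
// and drivetrain losses -- out of a 75-100 kWh pack.
```

That climbing energy is paid on top of the normal per-mile consumption, which is why the uphill leg burns range so much faster than flat highway miles.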

However, that's only half the story. A friend of mine tried to bring his Tesla Model S to Tahoe with three bicycles on the roof. Wind resistance is another big part of efficiency, and we've heard all about how the Model S and X are designed for minimum drag. Well, not so when you stick three bicycles on the roof. He could barely make 100 miles on a charge, and couldn't even reach the first supercharger. He had to stop at a 110v outlet to add some extra juice just to get there. Then, realizing this wasn't going to work all the way to Tahoe, they somehow managed to squeeze three bicycles into the back of the Model S with three adults inside. After all these shenanigans, a second supercharger stop (for juice to make it up the Sierra grade), and some traffic, it took them 10 hours to reach Tahoe. Not fun for anyone. My Navigator drive was 3.5 hours non-stop.

What do pickup truck owners do? They drive around with open beds, which have massive drag, often with stuff inside, sometimes hauling trailers. It's good Tesla juiced the Cybertruck up with 500 miles of range, because when the open bed is creating drag, you're going to get a lot less range. If you pull a trailer, I'd be shocked to see it go 200 miles, and if you're dragging it all up the Sierra grade, good luck.
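The wind-resistance half is just as easy to ballpark. Aerodynamic drag power grows with the cube of speed (P = ½·ρ·CdA·v³), and the CdA values below are pure guesses for illustration, not measured truck figures:

```javascript
// Back-of-envelope: aerodynamic drag power at highway speed.
// CdA (drag coefficient times frontal area) values are guesses.
function dragPowerKw(cdA, speedMph) {
  var rho = 1.2;                     // air density, kg/m^3
  var v = speedMph * 0.44704;        // mph -> m/s
  return 0.5 * rho * cdA * v * v * v / 1000;
}

var closedBed = dragPowerKw(0.6, 70);    // ~11 kW for a slick closed truck
var withTrailer = dragPowerKw(1.2, 70);  // ~22 kW with open bed + trailer
// Doubling CdA doubles the drag power at any given speed, which
// roughly halves the highway range attributable to drag.
```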

I love my Tesla. I love electric cars. However, it's really important to understand that not every tool is right for every job.

The Tesla Cybertruck has polarizing styling, and is sure to have amazing performance. I think it may play well in the vanity semi-luxury truck market, such as consumers buying F-150 Raptors and Jeep Rubicons to do their suburban and city driving in. However, if you want a working truck to carry big hauls and pull big loads even medium distances, I don't think an electric pickup truck is your cup of tea - yet.


Ideas for battling FPS cheaters

A constant plague on First Person Shooter games is cheaters. In particular, ESP cheats, which allow players to see enemies through walls, and aimhacks, which allow players to auto-aim or auto-headshot enemies.

These cheats make these games very unpleasant to play, because over time a significant fraction of online players -- possibly as many as 30% -- end up cheating.

I've spent lots of time thinking about how to combat cheaters, and it's a very tough problem. Some have talked about game-streaming as a possible solution, but the latency and latency variability of game-streaming is currently horrible.

Another possibility is some kind of protected execution, but this would require not only support from the operating system, but also potentially support from the GPU, to prevent nefarious hacks from being injected into the process and/or GPU shaders.

Here are two ideas which could be deployed today to fight FPS cheaters...
  • Aim-Captcha challenge - Have the game challenge players with very good aim with some kind of server-rendered video "aim test". If they can't score a decent level on it, put their recent games into an aimhack review queue.
  • ESP "ghosts" - Have the game server inject a decent number of "fake" but (mostly) fully-occluded players, so ESP hackers end up chasing ghosts.
Here is some more detail about each idea. They are published here, free for all to use.

Aim-Captcha challenge - If a player registers above a certain percentage of headshots and/or hits when shooting generally towards targets (ignoring random shots into nowhere), then after the game is over, give them some kind of captcha-inspired aim-challenge which is somehow not bottable. Perhaps by sending a 360 VR video where they have to dispatch targets with reasonable precision. If they don't hit some reasonable precision target, they're put into a queue for recent game replays to be evaluated for aim-hacking.

This is inspired by how https://play.typeracer.com (an online typing speed test) fights OCR cheaters. On that site, if you score above 100wpm, they give you another typing test which is a full paragraph of captcha text, asking you to type it that fast. This tries to assure a human is reading and doing the typing, instead of an OCR bot. (I know it seems hard to believe someone would cheat an online typing test with OCR, but if there is a game online, people will cheat.)
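To make the idea concrete, here's a minimal sketch of the server-side trigger. Every function name and threshold below is invented for illustration; a real game would tune these against its own stat distributions:

```javascript
// Sketch: decide whether a player's aim stats earn them a captcha.
// Names and thresholds are hypothetical.
function shouldAimChallenge(stats) {
  if (stats.aimedShots < 50) return false;  // not enough aimed shots to judge
  var hitRate = stats.aimedHits / stats.aimedShots;
  var headshotRate = stats.headshots / Math.max(stats.aimedHits, 1);
  return hitRate > 0.6 || headshotRate > 0.5;  // suspiciously good
}

function afterMatch(player) {
  if (shouldAimChallenge(player.stats)) {
    // Serve the server-rendered aim test; failing it queues the
    // player's recent replays for human aimhack review.
    player.pendingChallenge = true;
  }
}
```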


ESP "ghosts" - Basically, the game could inject a bunch of noisy fake players, fully occluded by walls, that only ESP hackers would see. Effectively, this would make them chase ghosts. At worst, this would make their ESP unreliable, because they would never know which markers were real enemies and which were fake. At best, an algorithm might be able to determine when players consistently chase after fake occluded enemies, and automatically ban them.

Of course, it's important that the fake enemies not confuse non-cheating players, so they should not make sounds. The only sign of their existence should be the draw commands sent to the GPU to render a character, and in this respect they should be indistinguishable from real players. It's also important not to give them easily detectable patterns. For example, they shouldn't be in the same exact spots, and they shouldn't be exactly stationary.

Perhaps a decent method would be to mark indoor locations that are invisible or nearly invisible from ranges of 10-1000m, and record real player behavior in these locations. In later games, that player behavior could be "replayed" as an ESP ghost.

In order for this technology to work, the game must have some kind of partial visibility from the server. Otherwise the "sudden appearance" of an ESP ghost would be a telltale sign it's not a real player. This could be simply range-based partial visibility, or it could be something more complex.
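Here's a sketch of how the ghost injection and detection could fit together. All names are hypothetical; a real implementation would live in the server's entity replication layer:

```javascript
// Sketch: replay recorded real-player movement as occluded "ghosts",
// then score how often a player's aim tracks those ghosts.
// All names are hypothetical.
function spawnGhosts(recordedTraces, count) {
  var ghosts = [];
  for (var i = 0; i < count && i < recordedTraces.length; i++) {
    // Each trace is real movement captured in a spot occluded from 10-1000m.
    ghosts.push({ trace: recordedTraces[i], isGhost: true });
  }
  return ghosts;
}

function ghostTrackingScore(aimSamples) {
  // Fraction of aim samples locked onto a fully-occluded ghost --
  // something a non-cheater has no reliable way to do.
  var tracking = aimSamples.filter(function (s) {
    return s.targetIsGhost && s.targetOccluded;
  });
  return tracking.length / Math.max(aimSamples.length, 1);
}
```

A consistently high score across many samples would feed the auto-ban (or human review) pipeline.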


Wednesday, October 23, 2019

Google Apps Script for Advanced Gmail Filtering

TLDR - I just found and used Google Apps Script to write a really cool Gmail filter that stars any thread (into a separate star-section) from someone I've sent email to before (i.e. senders I actually know).

Google Apps Script is an online TypeScript IDE that lets you script together apps at Google. It's kind of terribly clunky and buggy, but it's also pretty amazing that there is some Intellisense in a browser, and this thing is now seamlessly running in the cloud every 30 minutes.

Here are some instructions for setting up Google Apps Script. My filtering script is below.

Why did I do this?

I finally reached a threshold with all the non-spam notification mail cluttering up my inbox.

I already have multiple-inbox sections configured, with starred messages in a separate section on top. I used to manually create filters to star messages from my boss or my wife, but this doesn't really scale.



What I've always wanted was to automatically star threads from people in my contacts, or people I've sent email to before. However, gmail filters can't do this.

Enter Google Apps Script. It's a generalized cloud-hosted TypeScript programming environment where you can bind together scripted actions on Google products. Some of it is a bit laggy and clunky, a bit like dealing with the world's slowest compiler and a broken debugger. However, now that the script is working, it's pretty neat that this is all magically hosted in the cloud and running every 30 minutes.
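For reference, the every-30-minutes part is just a time-driven trigger. You can create it through the editor's Triggers UI, or with a one-time setup function like this (this only runs inside the Apps Script environment, and assumes the filtering function is named `prioritizeInbox` as in my script):

```javascript
// Run this once from the Apps Script editor to schedule the filter
// every 30 minutes. Only works inside the Apps Script environment.
function installTrigger() {
  ScriptApp.newTrigger("prioritizeInbox")
    .timeBased()
    .everyMinutes(30)
    .create();
}
```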

Here is my script:



function prioritizeInbox() {
    
  console.log("code start...");
  
   
  // Wrap the entire function in a try / catch, in case there is an error, log it.
  try {
   
    // Get the most recent threads in your inbox
    var NUM_THREADS = 50;
    var threads = GmailApp.search("in:inbox -label:starred", 0, NUM_THREADS);   // "-label:" excludes already-starred threads
    
    console.log("fetched threads ... " + threads.length);
    // For each thread
    for (var t=0; t<threads.length; t++) {
      var thread = threads[t];
      console.log({msg:"thread", n:t, thread:thread});
      
      // Get the sender of the thread
      var senders_map = {};
      var messages = thread.getMessages(); 
      for (var msg_n in messages) {          
        var msg = messages[msg_n];
        var msg_sender = extractEmail(msg.getFrom());
        senders_map[msg_sender] = 1;
      }
      
      var senders = Object.keys(senders_map);
      
      console.log("thread senders : " + senders);
      
      for (var n in senders) {
        var sender = senders[n];
        console.log("checking sender : " + sender);
        if (isImportantSender(sender)) {
          console.log("*****   Starring message from .. " + sender + "     ******");
          
          GmailApp.starMessage(messages[0]);       // star the first message in thread   
          break;
        }
      }       

      
    }
  } catch (e) {
    console.log("Error..." + e.toString());
  }
}


function isImportantSender(sender) {
      
  // check user email
  
  var user_email = Session.getActiveUser().getEmail();
  // console.log("checking user email ... " + user_email);
  if (sender == user_email) {
    return true;
  } 
  
  // check gmail aliases
  var aliases = GmailApp.getAliases();
  // console.log("aliases... " + aliases);
  for (var n in aliases) {
    // console.log("checking alias: " + aliases[n]);                     
    if (sender == aliases[n]) {
      return true;
    }          
  }
  
  if (false) {    // disabled experiment
    // check if they are in my starred folder
    var search_spec = "label:starred from:" + sender;
    console.log("starred search_spec = " + search_spec);
    var starred_threads = GmailApp.search(search_spec,0,5);
    if (starred_threads.length > 4) {
      return true;
    } 
  }
    
  // check if i've sent them mail twice before
  var search_spec = "from:me to:" + sender;
  console.log("convo search_spec = " + search_spec);
  var chats_with_contact = GmailApp.search(search_spec,0,5);
  console.log("chats_with_contact .... " + chats_with_contact.length);
  if (chats_with_contact.length > 1) {
    return true;
  }   
  
  return false;
}

function extractEmail(addr) {
  // Bare address form: "user@example.com"
  var regex = new RegExp("^[^<> '\"]+@[^<> '\"]+$");
  if (regex.test(addr)) {
    return addr;
  }

  // Display-name form: "Some Name <user@example.com>"
  var regex2 = new RegExp("^.+<([^ ]+@.+)>$");
  var match = regex2.exec(addr);
  if (match) {
    return match[1];
  }

  return addr;
}

Sunday, September 29, 2019

DisplayFusion to clean up Windows 10 multi-monitor

Before I start: this write-up was not in any way sponsored. I paid for a DisplayFusion license because it's awesome, and you should too.

I run a two-monitor desktop setup, and I recently decided that having my very wide 4k side-car monitor landscape-oriented was silly. The far side of it was too far off center to be useful. So I turned it vertical, which seemed peculiar at first, but I'm now in love with it.

For reference, my primary monitor is an ASUS 27" 1440p 144hz G-Sync PG279Q, and my sidecar is an LG 31" Cinema 4k. Here is a picture of the setup.

[photo of the two-monitor setup]

The super-tall virtual display has created some challenges. The first is that Windows 10 doesn't include any way to make desktop wallpaper look reasonable on a setup like this. It can span an image across the monitors, but it doesn't let you line the image up or shift it around. Another problem is that when Windows decides one of the monitors is gone, what it does to the desktop icons is even more terrible, presumably because the resolutions of the two monitors don't match up in any way.

So I went searching and found this cool tool, DisplayFusion, which fixes these problems and so much more.

DisplayFusion allows me to span an image across monitors, and align it however I like on either monitor. So I was able to line up this cool background image to actually look like it spans the monitors.

[photo of the wallpaper spanning both monitors]

However... it also does something *much more important*: it can "prevent mouse cursor from snagging on unaligned monitor edges". This snagging occurs because the monitors are not the same height, and if you miss the "opening", the cursor gets stuck on the edge, actually invisibly off screen. It's even more of a problem because my vertical 4k monitor is higher-DPI, so the opening is smaller on the 4k side than it appears in reality. DisplayFusion fixes this by nudging the cursor onto the other monitor even if you are too low or too high for the connecting area.

It also has other features I never knew I needed, like:
  • It'll force new windows to pop up on the "current" monitor, and force child windows to open on the same monitor, centered around the parent window.
  • It'll put the alt-tab overview on the "current" monitor instead of the primary monitor.
  • It'll save icon and window positions so I can restore them if Windows butchers them.

Sunday, May 26, 2019

The Chromebook Fixed-Function Pricing Conundrum

The Chromebook holds the promise of a simpler laptop for $250-500. Even though they run only the Chrome browser plus a clunky Android compatibility layer, for many users that's enough.

Are Chromebooks something home users should consider? Maybe, if the device is under $300. 

Chromebooks are stuck in a conundrum. They are built from the same hardware components as laptops and tablets, so they end up costing the same as comparable general-purpose computers that run lots of software. But why would you pay the same price for a fixed-function computer, losing the ability to run arbitrary desktop software?

I picked up a refurbished HP x360 14" Chromebook for $250. At this price, it's a great device. It's simple, and it has good battery life. I don't have to think about software problems or upgrades. It just works. It's like a simple web-browser appliance. However, if I had to buy this machine at the full retail price of $520-600 on Amazon, I would not be very happy with it. Especially when I can buy the 2019 version of the same x360 hardware for $500 running full-fledged Windows 10.

This comparison gets even worse when looking at Google's flagship Chromebook Pixel at a full $1000. For that amount of moolah, you can buy a non-retina MacBook Air or dozens of different excellent Windows 10 laptops -- full-fledged computers that can run actual desktop software. These expensive Chromebooks remind me of the early "fixed-function word processors", which were destroyed by the introduction of general-purpose personal computers.

I'll admit, there are some situations where the simplicity of a Chromebook might be worth the loss of functionality. For a less savvy computer user, a Chromebook is much less likely to get a virus, get corrupted or broken, or require software repair. Yes, you can install Android apps, but they are pretty bizarre and clunky to use with a mouse and keyboard, because they are designed for finger-touch mobile devices.

However, is the simplicity of ChromeOS really worth giving up all native desktop applications at the same price? My answer to this question is a resounding N.O.

Which means, if you're going to dig into Chromebooks, look for lower priced, refurbished, or used models. 

The $260 Acer Chromebook 11 CB311-8H-C5DV has USB-C charging, a decent screen, and is a good value, especially for kids.

However, there are far more cheap Chromebooks you should stay away from. The $250 ASUS Chromebook C523NA-DH02 15.6" has an absolutely horrific display that causes unreasonable eyestrain. The ASUS C100PA-DB02 Flip 10" and Acer C720 11" have reasonable screens, but they don't charge via USB-C, which means keeping track of annoying proprietary chargers.


NOTE: It is entirely possible for a simpler ChromeOS-like system to allow native desktop applications, much as it allows native Android applications. ChromeOS just chooses not to.



Wednesday, May 1, 2019

Chromebooks - hobbled by lack of keymap configuration

For years I've been watching the Chromebook trend, waiting for it to deliver on the promise of lower-priced, simpler computers, but unwilling to use those slow, budget 10-11" toys. Well, I recently found a piece of Chromebook hardware I actually like. The HP x360 Chromebook has USB-C charging, a 14" screen, and a decent enough Intel processor to feel like a real laptop (at least in Chrome).

Update: I've been able to mitigate these issues by swapping the Control and Alt keys in the Chromebook keyboard settings. As a side effect, this produces PC-like cut-copy-paste physical motion instead of Mac-like. That's fine for me, but YMMV.

However, beyond trivial browsing, I find it virtually unusable, because I keep instinctively hitting alt-left to go to the beginning of the line (because that's how it is on the Mac), and it keeps doing browser-back and eating my typing into web pages. It's happened three times tonight already.

I want to throw this thing into the wall so hard it will break into a zillion pieces so it will never steal any of my work ever again. The only thing stopping me is the motivation to write this post so I can make sure it never eats any of your work.

Apparently Google has decided to map browser-back to alt-left and beginning-of-line to search-left, and there is no way to remap them, because, well, I have no idea. People hack around Chromebook keymap issues with custom keyboards and extensions, but it's not possible to remap this, because it's not a key -- it's a key sequence that Chrome interprets. I could swap the Alt and Search keys entirely, but then any time I pressed Alt I would get Search, which seems even worse.

The irony is, I don't want a browser-back shortcut at all. I always disable it on Windows and Mac laptops. I can handle actually clicking the browser back button when I want to use it.

If you are a fast typist used to Command-Left on the Mac, seriously consider whether you will be able to control yourself before you buy a Chromebook.


Monday, April 1, 2019

Windows Ink screws up Wacom tablets?

I just bought and installed a Wacom Intuos Pro (medium) to see what kind of experience drawing with a pen on a computer provides.

Installation was simple enough, but as soon as I had everything working, I quickly realized I couldn't click and drag with the pen -- at all. I couldn't drag windows around; I couldn't drag inside a webpage. Instead, every time I tried to drag, it performed a grab/scroll operation.

It took me thirty minutes of digging to figure out that this was a "Windows Ink" feature/bug. I spent some time unsuccessfully trying to turn Windows Ink off completely, but the tutorials telling me to use the registry or mess with group policy settings just didn't work.

Finally I found a YouTube video explaining the problem and directing me to the hidden spot in the Wacom control panel where you can turn off Windows Ink.

Here it is for my Intuos Pro. However, if you want to use Photoshop or other apps that rely on Windows Ink for pen pressure, you'll need to watch the video and learn how to tell Photoshop to fall back to the legacy pen-pressure APIs.


Why would Windows ship a pen Ink feature that screws up Wacom, the longest-running and leading manufacturer of pen input devices?

Why would Wacom hide the setting to turn off Windows Ink deep in the bowels of their control panel, especially when it's now essential to making Wacom tablets function at all?

Mysteries.

Tuesday, March 19, 2019

Google preparing to be the latest failure in streaming video games....

On March 19th, 2019, Google gave a GDC Keynote to announce Stadia, their game-streaming platform.

Stadia will turn out to be the industry's eighth major attempt at cloud game streaming; the previous seven either failed or were put on indefinite hold.

Stadia imagines a user who cares enough about gaming to want something better than the low-end GPUs available in inexpensive devices, but doesn't care about video compression artifacts, the variable latency and reliability of the Internet, or ongoing subscription fees. Do these users exist?

Let's look at some trends

While the Stadia presentation talks very generally about gaming, it seems like a console competitor. Mobile gaming is growing like gangbusters on cheap smartphone GPUs and doesn't seem to need any help from the cloud -- not to mention that streaming games over a cellular connection doesn't seem viable, at least not for very long. PC gaming is defined by mouse-and-keyboard control, and Stadia isn't doing that; they have a console controller.

Industry trends say mobile gaming is huge and growing, PC gaming is smaller and growing, and console gaming has been in decline for a decade. While 2017 was a comeback for console units shipped, the bulk of that comeback was the Nintendo Switch, which is arguably a mobile gaming device.

So Stadia seems positioned to take over the lagging console market, maybe even revitalize it?

Let's talk about Latency

One of the biggest issues for cloud game streaming is Internet latency. Realistic internet round-trip latencies of 30-150ms are a nuisance for client-server games, but they make game streaming services somewhere between unpleasant and unviable.

Yet Google rambled on for over an hour about Stadia without ever mentioning latency. No round-trip latency goals, no measured results from their tests, no strategies for improving it beyond assuming amazing connectivity. The only mention of latency in the entire presentation was a short reference one user made about Project Stream having "no perceived input lag". Highly scientific and confidence-inspiring. Instead, Google talked on and on about framerates, visual quality, and resolution -- including not only 4k, but 8k!

In the real world, passionate gamers (the ones who spend lots of money on gaming software and hardware) have long accepted that better visual quality beyond a point is inferior to better response latency. In the real world, gaming luminaries like John Carmack rant about the importance of tackling latency, back in 2013 saying "about the only thing that's not going well in displays is latency"... "in our limited bandwidth budget, I'm afraid it's going to come down to doubling resolution, whereas doubling the resolution or doubling the frame-rate, at this point I'd take doubling the frame-rate." He was talking about moving to 120hz. Fortunately, in PC gaming, we've gotten what we actually needed, which is G-Sync and FreeSync, and more and more gamers are playing on 1080p displays at 144hz, with less than 6% of Steam users playing above 1080p according to the Steam Hardware Survey.

With 144hz displays and G-Sync, PC gamers are approaching 15-35ms of latency between their fingers moving and their eyes perceiving the change. What kind of latency can we expect from Stadia? Some PC Gamer tests suggest we're looking at 160ms. That's 5 to 10 times worse than PC gaming latency.
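To see where a number like 160ms could come from, here's an illustrative end-to-end latency budget. Every stage time below is my assumption for the sake of arithmetic, not a measured Stadia figure:

```javascript
// Illustrative finger-to-eye latency budgets.
// All stage times are assumptions, not measurements.
function totalLatencyMs(stages) {
  return stages.reduce(function (sum, s) { return sum + s.ms; }, 0);
}

var localPc = totalLatencyMs([
  { name: "input sampling", ms: 4 },
  { name: "game sim + render @ 144hz", ms: 12 },
  { name: "display scanout (g-sync)", ms: 8 },
]); // 24 ms -- inside the 15-35ms range above

var streamed = totalLatencyMs([
  { name: "input sampling", ms: 4 },
  { name: "uplink to datacenter", ms: 20 },
  { name: "game sim + render", ms: 16 },
  { name: "video encode", ms: 10 },
  { name: "downlink + jitter buffer", ms: 60 },
  { name: "video decode", ms: 10 },
  { name: "display scanout", ms: 10 },
]); // 130 ms -- and that's a best case with no congestion
```

Even generous stage estimates land streaming well past 100ms, which is why a measured ~160ms isn't surprising.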

As a PC gamer, after watching reviews of GeForce Now, I wouldn't consider a game streaming service unless it had an end-to-end low-latency guarantee (where I don't have to pay if the latency doesn't meet an expected target) and an exception to my provider's data caps, so constantly streamed game data doesn't count against me. And even then I'm unlikely to be impressed, because I'm used to gaming on 144hz G-Sync with 15-35ms latency. Oh, and I don't use controllers. I only want to play mouse-keyboard. In the end, I don't think PC gamers are the Stadia target market.

Stadia wasn't that far off console latency, though, so maybe Stadia could be an end-run on the console market... the shrinking console market. That is, if it weren't for latency variability. Any good-looking latency test of Stadia is only a best-case result. Anything between you and Google's servers can add response latency at any time, including the family member who just booted up Netflix in another room of your house.

And let's talk about Cost

Another thing Google's Stadia announcement ignored is cost.

Fans of cloud gaming like to point out that gaming rigs, whether a $300 console or a $2000 PC, cost a good chunk of change, while cloud gaming lets you just click and play on virtually any computing device. While this is true, there *is* a cost to letting you use that GPU in the cloud.

In 2019, we have huge headlines about Fortnite, and how free-to-play games are becoming massive franchises, including League of Legends, Warframe, World of Tanks, PUBG Mobile, and EA's APEX Legends. Some might say we're on the eve of a free-to-play gaming revolution, where the most popular, most played, most watched, and most monetized games are all free to play.

Which raises the question: how do Stadia GPUs fit into a free-to-play world? Will Google be subsidizing F2P gaming on Stadia GPUs and taking a backend cut of gaming revenue? Will Google make F2P developers pay Stadia fees, subsidizing their free users in a gamble to unlock their paid-user revenue? Or will Google try to make users pay usage fees directly to Stadia, positioned as an alternative to the cost of buying dedicated gaming hardware? All the promise of "play now" in their Stadia launch announcement suggests they won't do the latter, but someone has to pay for these GPUs. Who is it?

Unique solutions to problems

Stadia does have a couple of interesting ideas. In particular, the quick and under-emphasized launch of a Stadia controller that talks directly to the cloud over wifi -- cutting out the latency introduced by sending controller inputs through a local computing device (which I estimate at 10-50ms). And the legitimate and seriously cool benefit that Stadia prevents the client-side cheats that plague big free-to-play titles like Fortnite.

However, they also made some serious gaffes, like claiming that Stadia multiplayer is going to be better than traditional lagged Internet multiplayer. When client-server games lag by 50-150ms, it creates occasional discontinuities in game events. If Stadia is lagged by 50-150ms, it will be unplayable. So what are they going to do? Bring a Google/Stadia fiber to every house? Give me a fancy quality-of-service router to make sure other users on my home network don't impact gaming? I don't understand how they are going to magically make IP latency variability go away.

In the end, Google announced their new Stadia gaming platform in a fashion that thematically resembled Apple's big keynotes, but it was sadly missing the most important part: the traditional Jobsian move of staring into the eyes of the beast and telling us how they conquered it.

We want to know how Google solved the latency and cost issues that have plagued cloud gaming thus far.

-----

If Google were a bit more in touch with the industry, this presentation could have been: "Everyone knows we've been experimenting with game-streaming. What they don't know is that we realize Internet streaming latency means game-streaming is the on-ramp for our new gaming platform... where players can buy, share, and play games on the best hardware and software they can afford, whether that hardware is located in the cloud, or under their desk."

What this presentation was... "We've built the latest and greatest game-streaming platform on the planet. Despite seven other game streaming platforms failing because of latency and cost issues, we're not going to talk about latency and cost. We're just going to stand up here and tell you we're all on the eve of the cloud gaming revolution. Because like Alice trying to make it home from Wonderland, if we say it enough times, maybe that'll make it so."

Good luck Google.

...I'm going back to playing APEX Legends, the latest free-to-play PC battle royale, on my local-computing gaming rig at 1440p 144hz G-sync on a 4ms LCD panel, with incredibly low (sub 50ms?) finger-to-eye response latency.



Tuesday, March 12, 2019

PS4 Pro is a waste of money (aka, why PC Gaming is not dead)

I recently got into playing APEX Legends on the PC, which is a fantastic game. I'm a first-person player, and I didn't like Fortnite's third-person view or its cheesy crafting system, so APEX is a really refreshing battle royale to pull me away from playing PUBG.

Several weeks after the game's release, we're seeing more and more "suspiciously good players". Not all of them are cheaters, but seeing as the developers just banned 350k cheaters, I'm sure some of them are. A friend of mine pointed out that cheaters are much less common on console... which got me curious.

What is the gaming experience on consoles like these days? Are there fewer cheaters? What is the rendering like? I couldn't aim at the broad side of a barn with a console controller, but then I found this XIM APEX device, which secretly maps your mouse and keyboard to controller inputs, so your game doesn't even know you're using mouse and keyboard (which means it can't stop you either).

So I tossed some money to the wind, and picked up a PS4 Pro and a XIM APEX to find out.

I went the PS4 route because you can play PS4 F2P online games like APEX Legends and Fortnite without a PlayStation Plus subscription, while on the XBone you have to pay Microsoft a monthly fee just to play a free-to-play game. I hardly ever play consoles, so that seems really dumb, but it's not strictly the cost that bothers me -- it's my moral objection to being taken for a ride by the monopoly.

Back to the PS4, what did I learn from all this?

Compared to a gaming PC, a PS4 Pro is a waste of money, and PS4 Pro plus XIM APEX is an even bigger waste of money. 

If you really prefer to game on your living room couch, with a controller, 4 to 8 feet from your big-screen TV, maybe a PS4 slim is worth $300 to you, but if you want to play games that need better gaming horsepower, and you're going to play them 24" from a monitor at a desk, especially with mouse-keyboard, then have some self-respect and get a gaming PC instead.

NOTE: Some people are confused by the claim of 4k gaming on a PS4 Pro. The PS4 Pro never renders above 1080p, and sometimes it renders at even lower resolutions. It puts out 4k by using blurry upscaling techniques, which are aptly termed "faker 4k". I can plug any 1080p signal into a 4k monitor and get "faker 4k" so please just ignore this marketing garbage. Here we're only concerned with pixels actually rendered by the 3d GPU, at 1080p.

GTX 1060 Gaming PC trounces PS4 Pro

I think this simple cost comparison explains it all...

PS4 Pro + XIM APEX + 1 year of PlayStation Plus = $400 + $125 + $120 = $645.
GTX 1060 Gaming PC (~30% faster and 10x better than a PS4 Pro) = $750.

Of course the PS4 Pro itself only costs $400, so you can get started for cheaper. However, as you try to crawl out of the hole you've put yourself in with that lower initial cost, you will spend more and more and in the end you still won't have something that holds a candle to a gaming PC.

It's hard to find side-by-side performance comparisons of these two setups, because it's a bit of apples-to-oranges. However, here are some numbers:

PS4 Pro - 8GB shared RAM/VRAM total
GTX 1060 PC - 8GB system RAM + 6GB dedicated VRAM

That GTX 1060 is notably faster than the PS4 Pro, in both GPU and CPU, and the memory comparison above isn't close either.

What is XIM APEX?



What the XIM APEX does is operate as a "man in the middle" for USB-connected console controllers, pretending to be your controller in a way the console can't detect. My limited experience with it so far is that it works pretty well, but I did have some weird issues with simultaneous key-presses not registering on my USB-connected keyboard. I'm not yet sure if it's an issue with the XIM, the keyboard itself, or what the console will accept. I'm going to do some testing.

The Good:

The XIM APEX smartphone configuration UI, especially in advanced mode, is very powerful. It lets you map mouse/keyboard to controller actions, and you can make several different mode pages per game. These can be activated either by in-game keybinds or by custom toggle/hold mode keys. For example, when I hold my in-game ping key to bring up the APEX Legends ping spin-menu, my XIM APEX config automatically switches to a different mode, which has a custom mouse-response curve to make operating the spin menu feel "natural" on the mouse, and binds the mouse buttons to menu confirm/cancel. It's pretty amazing that this just works seamlessly, whenever I hold my ping key.

Their advanced configuration UI is much more confusing than it could be, which means they can't ship "default" configs that use all these fancy features. For example, when I have a game-button-related config, such as making the mouse work differently while I hold the ping hotkey, XIM doesn't know the two are connected. One is a button mapping; the other is just a custom mode page. If XIM knew the custom mode page was related to the button mapping, it could ship with that custom page already set up in the default config, and whatever I changed the button mapping to would automatically activate that mode page. This isn't a huge deal; it just makes the out-of-the-box experience more complicated than it could be.

The bad:

The big problem is, why are we using a XIM APEX to begin with? Using a mouse and keyboard requires a desk, which raises the question: why are you not just putting this money into a PC gaming setup instead?

$400 for a PS4 Pro, $125 for a XIM APEX, and unless you're only playing F2P games like Fortnite and APEX Legends, you get to pay $120 a year in idiotic console online multiplayer fees on top of that... and for what? To get super-pixelated 1080p on a device that can't run a real web browser, can't run PC games, can't alt-tab during the game, and where the translated mouse feels like a sluggish mess compared to the PC version.

The PS4 Pro GPU is said to be a bit slower than a GTX 1060, with a CPU that is underpowered compared to a PC equivalent. Two minutes of searching turned up a $750 GTX 1060 gaming PC at Best Buy, and I bet with a bit more searching I could do better. This PC will have at least 30% better gaming performance, have none of the weird XIM APEX mouse-to-controller translation issues, and be massively more usable than a console.

So who is this consumer that wants a XIM APEX?

A kid who's going to commandeer the family console in their bedroom and try to "get more serious" with mouse and keyboard? A streamer being paid to promote console titles who wants every advantage? Someone who really wants to play console-exclusive titles, but refuses to pick up a controller?