unFocus Projects

Category: Uncategorized

  • Update from the future!

    I resurrected this blog! I intend to make some updates here eventually; hopefully this isn’t a false start. This blog has been dormant, and barely even functioning, for years. In that time, I did post a bit on Medium. Those posts are mostly political commentary in nature, written as I watch my civilization implode, and I’ve back-ported them here. But I’m actually looking to pivot away from that a little, so we’ll see if I keep them. I want to get back to my technical roots, and to fun: I have updates coming for Quint Dice, and some other thoughts about technology (AI!). The intention is to use unfocus.com as a central location for a bunch of things: on Medium (maybe), YouTube, and elsewhere. The focus will be on tech, with maybe an occasional political rant. Longer-form thoughts coming soon.

  • Windows Color Management, a Rant

    Microsoft is uniquely unable to solve a problem Apple solved ages ago.

    Photo by Fotis Fotopoulos

    I just got a new monitor, and it supports 98% of the P3 color space, making it a wide-gamut screen. I had a wide-gamut screen some 10 years ago, and just as then, Windows is still unable to properly support it. The result: on Windows, everything is oversaturated.

    Now, the monitor maker shares some blame here. The bundled profile seems to resolve to the sRGB color space, and Windows applies that. But the problem is, there’s no easy or obvious way to simply select a better color profile (like a generic P3 profile). And even if I do dig through the ancient, incomprehensible ICC screens (I’d bet $100 the developers who built that thing don’t even know how it works), the profile doesn’t actually apply to most apps! Each app has to be aware of color management and tag its content appropriately. But even apps that are aware, like Photoshop, only tag the content you might be working on, so their own UIs are often oversaturated. Even color-aware apps on Windows aren’t fully color managed…

    The problem is breathtakingly easy to describe, simply by describing the way it works on macOS. On macOS, I can choose a different color profile, including one I produce myself with a color calibrator (a Spyder or similar), and all the windows that aren’t color aware are adjusted so they no longer look oversaturated. It’s like magic! It’s also easy to describe why: macOS simply treats the unmanaged content as if it were sRGB, and adjusts it accordingly.

    So why is this so hard for Windows to photocopy? All I want to do is apply a more appropriate color profile, and not have red icons stab me in the eye. It already works on macOS. So what’s the problem? Get out your photocopiers, Redmond, and make it work.

    Maybe if this stuff worked right, we wouldn’t need the “sRGB” mode this new monitor supports. What a crock. It should be called “Windows backwards compatibility mode,” because that’s what it is: support for a hopeless platform. Maybe if this worked right, the bundled profile could accurately reflect the capabilities of the monitor, instead of assuming broken, weak support in Windows and falling back to an insufficient sRGB profile.

    And don’t get me started on HDR…
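    The macOS behavior described above can be sketched in a few lines. This is a minimal, illustrative model of what a color-managed compositor does for unmanaged (assumed-sRGB) content on a Display P3 panel, using the standard published sRGB and Display P3 matrices; it is not Apple’s or Microsoft’s actual code.

```python
# Sketch: decode sRGB, convert through CIE XYZ into Display P3, re-encode.
# Without this step, raw sRGB values drive the panel's wider primaries
# directly, which is exactly the oversaturation described above.

def srgb_decode(c):
    # sRGB transfer function (IEC 61966-2-1); Display P3 uses the same curve
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def srgb_encode(c):
    return 12.92 * c if c <= 0.0031308 else 1.055 * c ** (1 / 2.4) - 0.055

# sRGB (D65) linear RGB -> CIE XYZ
SRGB_TO_XYZ = [
    [0.4124, 0.3576, 0.1805],
    [0.2126, 0.7152, 0.0722],
    [0.0193, 0.1192, 0.9505],
]
# CIE XYZ -> Display P3 (D65) linear RGB
XYZ_TO_P3 = [
    [ 2.4935, -0.9314, -0.4027],
    [-0.8295,  1.7627,  0.0236],
    [ 0.0358, -0.0762,  0.9569],
]

def matmul(m, v):
    return [sum(m[i][j] * v[j] for j in range(3)) for i in range(3)]

def srgb_to_display_p3(rgb):
    linear = [srgb_decode(c) for c in rgb]
    p3_linear = matmul(XYZ_TO_P3, matmul(SRGB_TO_XYZ, linear))
    return [srgb_encode(max(0.0, min(1.0, c))) for c in p3_linear]

# Pure sRGB red lands well inside the P3 gamut, roughly (0.92, 0.20, 0.14),
# which is the correction Windows skips for unmanaged windows.
print(srgb_to_display_p3([1.0, 0.0, 0.0]))
```

Notice that white maps to white; only the saturated primaries get pulled in, which is why uncorrected content looks garish rather than uniformly wrong.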

  • Facebook is the Last Mile of Media Publishing, and Should Act Like it

    Mark Zuckerberg is a smart guy and, by most accounts, tries to do the right thing with his influential platform. Given the newness of social media, it must be challenging to figure out what that right thing is, especially when billions of dollars and the demands of shareholders are on the line.

    Recently Mr. Zuckerberg was quoted saying that he doesn’t think it’s Facebook’s job to fact-check political speeches. I think he’s probably right, but only in one specific way. Facebook is not in the news production business; they don’t write the news. But they are an intermediary (the root of the word “media”) just the same. Conceptually, they are at least as influential as any publisher in choosing which content to show users. Practically, given current trends, they are more influential than any news outlet’s front page, or all of them combined. I’ve written previously about an important (and easy to fix) mistake in their curation algorithm, which leads to two information silos modeled largely on a reductionist idea of America’s two parties: left vs. right, Democratic Party vs. Republican Party. That’s one half of the problem, and they should address it. But they also have some responsibility to promote fact-based reporting and thwart misinformation, just like any other media publisher. They don’t write content, but they certainly do promote it, selectively, to each and every user.

    Fact-based news coverage is not sexy, but it is essential for a functioning democracy, and it requires credible, authoritative, expert validation. By many accounts, people “like” negative, bias-confirming, often factually inaccurate posts more often than they “like” positive, moderate, factual posts. This is not a problem created by social media. Over the last 40 years, fact reporting and investigative journalism have declined, as bombastic opinion writing and increasingly partisan infotainment and editorial have taken up the slack, in an attempt to capture both political influence and profit. It turns out people are more motivated by in-group dynamics than by reason, and catering to that is more profitable than not, for all media. It’s in this context that social media has the power to either enhance or blunt the power of misinformation. Facebook’s current curation model, and its decision to keep driving forward, enhances the problem rather than curtailing it. Doing nothing is not doing nothing in Facebook’s case.

    Facebook, whether Mr. Zuckerberg likes it or not, has embedded itself in the last mile of the publishing pipeline for much of the news media citizens consume. Facebook should do the work to balance its own profit needs against the impact it has on its users’ perception of the facts. Social media is media.

  • Facebook’s Political Algorithm Error and Tribal Extremism


    Facebook’s Political Algorithm and Extremism Silos

    The social media echo chamber problem is strengthened by a simple algorithmic mistake that Facebook, and seemingly all of the social media platforms, have baked into their cores. The Wall Street Journal recently shed light on part of the problem: Facebook chases “engagement” by feeding folks more and more “sticky,” extreme, and enraging content to keep them glued to the platform, and to Facebook’s ads, longer. Facebook categorically can’t fix this problem, because it’s at the core of their entire business model, which is likely why they chose to do nothing.

    But there is a second side of the problem, and I’d argue a more important one: the “silo” problem. Facebook (and other social media) ranks every post on a political spectrum from left to right on a 5-step scale. They rank each user that way too. You can see how they rank you in your profile’s “Ad Preferences,” if you’re curious. They use these values to feed us posts that fall within those parameters. Before social media, polls showed folks often had strong opinions on single issues, and often held points of view that didn’t align completely with a party. One might be anti-abortion yet also anti-gun, for example, even if only one or the other motivated them to the polls (so-called single-issue voters).

    That happens less and less now, and I believe social media plays a role in this siloing effect, because these platforms are essentially sorting issue propaganda into two consistent “silos” on a simple 5-step spectrum. This drives partisan polarization on groups of issues together, rather than on single issues. Where we used to have a set of different single-issue groups that parties would gather into a single coalition, now there are two large groups of people who all think about and understand an entire set of issues the same way, often with a whole vocabulary of their own to describe their set.

    Together with the propensity to promote partisan extremism on social media platforms, we have a dangerous mix. This is society-breaking, and is ripe for foreign and domestic actors to manipulate and exploit. Perhaps unlike the business model of promoting extremism, Facebook, Twitter, and the others CAN address THIS issue; they just have to be less lazy about how they categorize users and content. It’s simple: instead of following the largely illusory distinction between “left” and “right” for everything (effectively making that once-fake distinction real), rank content and users on a scale for each individual issue.

    The overly simplistic political curation mistake at the heart of Facebook and other social media platforms is driving us into two increasingly extreme partisan camps. Fixing it is relatively easy, and could help steer us back to moderate sanity, even if it would not address the extremism problem. Then again, maybe if users aren’t constantly propagandized in echo chamber silos, it’ll be easier to find alternatives to promoting ever more extreme, enraging content to keep users engaged. Or maybe a turn to a different business model is called for. I’d be happy to opt out of highly targeted political propaganda for $12 a year. More importantly, I’d happily pay again to opt my grandparents out…
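    The per-issue fix proposed above can be sketched in a few lines. This is purely illustrative: the issue names, scales, and matching rule are my own assumptions, not Facebook’s actual model, and it exists only to contrast one left/right axis against issue-level scoring.

```python
# Hypothetical sketch: score users and posts per issue instead of on a
# single 5-step left/right axis, and match on the issues a post touches.

def single_axis_match(user_score, post_score):
    # Current-style matching: one left/right bucket (1-5) for everything.
    return abs(user_score - post_score) <= 1

def per_issue_match(user_profile, post_profile, threshold=1):
    # Proposed: compare only on the issues the post actually covers, so a
    # user can align with a post on guns while disagreeing on abortion.
    # Unrated issues default to the midpoint (3) of the 1-5 scale.
    return all(
        abs(user_profile.get(issue, 3) - score) <= threshold
        for issue, score in post_profile.items()
    )

user = {"abortion": 1, "guns": 1, "taxes": 4}   # mixed-issue voter
post = {"guns": 2}                              # single-issue content

print(single_axis_match(2, 4))      # crude bucket: no match
print(per_issue_match(user, post))  # issue-level: match
```

The point of the contrast: the single axis forces the mixed-issue voter into one bucket and misses them entirely, while issue-level scoring serves the gun post without also dragging them toward a whole partisan bundle.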