• 1 Post
  • 33 Comments
Joined 1 year ago
Cake day: June 20th, 2023


  • I was curious about this because I felt like this had been a problem for longer than that, and after a bit of searching I found this random forum post that compiled some stats around week 1 rookie starters. It’s from 2021, so a little out of date, and the formatting is confusing, but it’s still interesting to look at.

    Filling out the list after 2020 from memory and some quick Googling, 2021 would add Trevor Lawrence, Mac Jones, and Zach Wilson; 2022 had no week 1 rookie starters for the first time since 2007 (Kenny Pickett wouldn’t start until week 4); and 2023 had CJ Stroud, Bryce Young, and Anthony Richardson.

    As for trends, there was a meaningful bump after 2000, as 10 rookies started week 1 between 2000 and 2010 compared to just 3 across the 90s. The insane 2012 class really kicked it into high gear though.

    I think the league’s increased passing focus made it happen more often, though usually out of desperation or with players considered generational prospects. Then 2012 seemed to give everyone the impression that it could happen regularly.



  • I think the NFL’s desperation for quality QBs is likely making the problem worse. So many top QB prospects get drafted to dysfunctional franchises with incompetent coaches and massive holes across the entire roster, then get dropped into week 1 with the expectation that they’ll be the savior of the franchise.

    Circumstances matter a lot, and I can’t help but wonder how many “bust” QBs would’ve been better off with a year or two in a low-pressure backup spot to adjust to the league and learn the scheme instead of getting thrown into the fire right away. It seems to work for the Packers. Hell, even Mahomes sat behind Alex Smith his rookie year. I wonder how different his career would have been if he’d been forced to start immediately…




  • Not as drastic as the headline makes it out to be, or at least so they claim.

    “We acquired Tumblr to benefit from its differences and strengths, not to water it down. We love Tumblr’s streamlined posting experience and its current product direction,” the post explained. “We’re not changing that. We’re talking about running Tumblr’s backend on WordPress. You won’t even notice a difference from the outside,” it noted.

    We’ll see how that actually works out. Tumblr’s backend has always seemed rather… makeshift, so I’m curious to see how they manage to do that. Given Tumblr’s technical eccentricities, a backend migration could probably do a lot of good for the functionality of the site, if done properly. I have my doubts that WordPress’ engineers will be given the time and resources to do a full overhaul/refactor though, so I’m fully expecting even more janky, barely functional code stapling the two systems together.








  • “Product Degradation” has been the modus operandi for nearly every online service for like 10-15 years now, but the Gamepass price increase is what got the FTC’s attention? Where was the FTC when the movie/TV streaming service market balkanized itself in an arms race to reinvent cable?

    Granted, I doubt the FTC could really do anything meaningful to stop enshittification given that corporations are effectively above the law these days, but it’s been blatantly obvious that this was going to be Gamepass’ strategy from day one. If this actually surprised anyone at the FTC, they really haven’t been paying attention.


  • Reminds me of when they started doing that thing where they pretended to be helpful by having the GPS voice call out the name of a business on the corner where your turn is - “Turn left after ‘business’ on the left” - but in reality those businesses were paying to inject their name into your driving directions.

    When it started, I immediately suspected they were paid sponsorships, which was all but confirmed when it told me to turn “after Bank of America, with drive-thru ATM, on the right.” Stealth advertising mid-navigation… insane.







  • I hate that AI/ML development has become so fixated on generative AI - images, video, sound, text, and whatnot. It’s kind of crazy to me that AI can generate output as accurately as it does, but honestly, I think generative AI is barking up the wrong tree in terms of where AI’s true strengths lie.

    AI can actually be really good at certain kinds of problem-solving, particularly optimization problems. AI essentially “learns” through extremely rapid, large-scale trial and error, so when it’s presented with a problem that has many complex, interdependent variables and needs an optimal solution, a properly trained model can reach remarkably effective solutions far quicker than any human could, and can explore approaches that humans would otherwise miss. This is particularly applicable to a lot of engineering problems.

    Honestly, I’d be very intrigued to see an AI model trained on average traffic data for a section of a city’s street grid - cameras observing traffic patterns over the course of a few months, measuring the average number of cars passing through at various times of day, their average speed, and other such patterns - and then set on the task of optimizing stoplight timings to maximize traffic flow and minimize the time cars spend waiting at red lights. If the model is set up carefully enough (including a data-collection plan meticulous enough to properly capture average traffic patterns, outlier disincentives to keep cars at little-used cross streets from waiting 10 minutes for a green light, etc.), I feel this is the perfect kind of problem for an AI model to solve. There’s a toy sketch of what I mean at the end of this comment.

    AI should be used on complex, data-intensive problems that humans can’t solve on their own, or at least not without a huge amount of time and effort. Generative AI doesn’t actually solve any new problems. Why should we care whether an AI can generate an image of an interracial couple or not? There are countless human artists who would happily take a commission to draw an interracial couple (or whatever else your heart desires) for you, with no need to invest billions of dollars into increasingly complex models built on dubiously sourced (at best) datasets that still don’t produce results as good as the real thing. Humans are already good at unscripted creativity, and computers are already good at massive volumes of complex calculations, so why force a square peg into a round hole?
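
    Here’s the kind of toy sketch I mean, in Python. Everything in it is hypothetical and made up for illustration - the arrival rates, the crude cost model, and the number of intersections are placeholders, and simple hill climbing stands in for whatever an actual learned model or traffic-engineering tool would use - but it shows the basic shape of the loop: propose a timing plan, score it against observed traffic, keep it if waits improve.

```python
import random

# Hypothetical illustration: tune green-light durations at a handful of
# intersections to reduce average waiting, using simple hill climbing as a
# stand-in for a fancier learned optimizer. All numbers below are invented
# placeholders, not real traffic data.

NUM_INTERSECTIONS = 4
MIN_GREEN, MAX_GREEN = 15, 90          # allowed green time (seconds) per cycle
CYCLE = 120                            # total signal cycle length in seconds
ARRIVAL_RATES = [0.6, 0.3, 0.8, 0.2]   # cars/second on the main road (made up)
CROSS_RATES = [0.2, 0.1, 0.3, 0.05]    # cars/second on the cross street (made up)

def average_wait(green_times):
    """Crude cost model: main-road cars wait out the red phase, cross-street
    cars wait out the main road's green phase. Expected wait per car is
    roughly half the phase it arrives during."""
    total_wait = 0.0
    for g, main, cross in zip(green_times, ARRIVAL_RATES, CROSS_RATES):
        red = CYCLE - g
        total_wait += main * red * (red / 2)   # main-road cars arriving on red
        total_wait += cross * g * (g / 2)      # cross-street cars arriving on green
    return total_wait / sum(ARRIVAL_RATES + CROSS_RATES)

def optimize(iterations=5000):
    """Hill climbing: nudge one intersection's green time at a time and keep
    the change only if the estimated average wait goes down."""
    timings = [random.randint(MIN_GREEN, MAX_GREEN) for _ in range(NUM_INTERSECTIONS)]
    best_cost = average_wait(timings)
    for _ in range(iterations):
        candidate = timings[:]
        i = random.randrange(NUM_INTERSECTIONS)
        candidate[i] = max(MIN_GREEN, min(MAX_GREEN, candidate[i] + random.choice([-5, 5])))
        cost = average_wait(candidate)
        if cost < best_cost:
            timings, best_cost = candidate, cost
    return timings, best_cost

if __name__ == "__main__":
    timings, cost = optimize()
    print("Green times (s):", timings)
    print("Estimated average-wait score:", round(cost, 1))
```

    A real version would swap the toy cost function for the camera-derived data and a proper traffic simulation (and probably something smarter than hill climbing), but the structure - search over timing plans, score each against observed traffic, keep improvements - is exactly the kind of thing these models are good at.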