boredgamer2's comments | Hacker News

Very cool. This seems like such a "Duh. Users want this" feature. I wish it had been integrated into Firefox years ago. I bookmark some sites and then come back years later, but by then the URL may have been hijacked or its parameters changed.

And then when I return, I get a 404. Instead of bookmarking, I'd love to "capture" the current info, divs, and graphics.


You could do that with the Firefox extension "SingleFile" [1]. Its main purpose is to save a full webpage as one single HTML file, with images etc. encoded in the HTML file (no more messy HTML file + folder clutter), very neat. There is also an HTML/ZIP hybrid variant called SingleFileZ [2] that produces smaller saved files.

In the settings you can configure it so that pages you bookmark are saved automatically, and also link the bookmark to the locally saved file if you want to.

[1] https://addons.mozilla.org/en-US/firefox/addon/single-file/ [2] https://addons.mozilla.org/en-US/firefox/addon/singlefilez/


That looks great! Firefox should have something like SingleFile built in. Instead of just copying features from Chrome, they should lead on innovation and add useful features like this.


I used this back in IE5 (I think) with MHTML. Though after looking a little, it seems SingleFile is more featureful (it captures lazy-loaded content, etc.).


I've been using SingleFile for a couple of months now, to auto-save every page I visit, and it works well. It's exactly as simple as it needs to be. My only real complaints are that it can be slow on large pages and that it pollutes your download history (every auto-save shows up as a download), both of which I suspect are due to limitations in the extension API.


While it's far from an ideal solution, you can periodically submit your bookmarks to archive.org's "Save Page Now" service. It's easy to semi-automate; here's how I use it with pinboard.in bookmark exports: https://pastebin.com/uUVE22RD
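For anyone who wants a self-contained version of that idea, here is a minimal sketch in Python. It uses the Wayback Machine's public save endpoint (https://web.archive.org/save/<url>); the one-URL-per-line stdin input format is my assumption for illustration, not what the pastebin script above does:

```python
import sys
import urllib.request

SAVE_ENDPOINT = "https://web.archive.org/save/"

def save_url(url: str) -> str:
    """Build the Save Page Now URL for a bookmark."""
    return SAVE_ENDPOINT + url

def archive(url: str) -> int:
    """Ask the Wayback Machine to capture `url`; returns the HTTP status."""
    with urllib.request.urlopen(save_url(url)) as resp:
        return resp.status

if __name__ == "__main__":
    # Feed it a bookmark export, one URL per line:
    #   python archive_bookmarks.py < bookmarks.txt
    for line in sys.stdin:
        url = line.strip()
        if url:
            print(url, archive(url))
```

Note that archive.org rate-limits this endpoint, so for a large export you'd want to sleep between requests.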


This solution is on the user side, which is great because each person can get and manage saved pages for themselves.

But if we're looking for a developer side solution, then making pages that last an order of magnitude longer may be better for everyone in the long run, e.g. https://jeffhuang.com/designed_to_last/


archive.fo is also very good to use.


Joplin is a local thing you can run for this. Basically, saving a snapshot of a page. Has browser extensions.

https://joplinapp.org/

I like the ability to use tags too -- I've got various product/tech ideas and it's nice collecting information with it and not having to worry about the pages going away or changing.


+1 for the Joplin web clipper. I really like using Joplin for notes, especially the command line version.


Not a firefox extension, but I moved to using Polar Bookshelf[0] for exactly this reason.

[0] https://getpolarized.io/


I have been searching for a tool exactly like this. Thank you for sharing!


There's an option for this ("Misc. > save the page of a newly created bookmark") in SingleFile [1].

[1] https://github.com/gildas-lormeau/SingleFile/


Pinboard [0] has this feature for an additional price. Just bookmark and you're done. Later you can full-text search the saved content.

[0] https://www.pinboard.in/


Does it capture your view of the page, though (i.e. use your own browser, with its cookie jar, to do the scrape)? I'd like to, for example, snapshot my Facebook feed.


Then check out HistorySearch [0]. If I remember correctly, they index the pages in your browsing history, so you don't even need to bookmark.

Of course I cannot vouch for their respect for privacy.

[0] https://historysearch.com/


I made an application[0] that does capture your view. It's screenshot-based. It works outside of a browser too - anything on your screen. All local too, so no privacy concerns :)

[0] https://apse.io


Interesting. How do you/does Apse make $


The website shows a 'buy' button, and the pricing model.


I see that now, thank you!


No, it does not. That means it also doesn't work for any news sites that you have subscriptions to. I'm using the Joplin web clipper pretty heavily for this purpose.


Afraid of losing the contents of a page, I used to save it as a PDF in the cloud. Far from ideal, but it happened way too often that I'd go back to the page and get a 404 error or similar.


If enough people did this, it would provide more pressure for websites not to just nuke their old URL schemes every time they switch from WordPress to Drupal...


I would love an automated solution for this that is also smart enough to use my logins to access subscription based news sites for archival purposes.


A smart crawler.


You can do this, sort of:

Web Developer menu

Toggle Tools

select "..."

select Settings

turn on "Take a screenshot of the entire page"

Now you get a camera icon. Clicking it snapshots the current webpage (as a veeery tall image).
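If you'd rather script this than click through the devtools UI, Firefox can also take full-page screenshots from the command line via its `--screenshot` flag. A small sketch in Python wrapping that (the output path and URL are placeholders, and it assumes `firefox` is on your PATH):

```python
import subprocess

def screenshot_cmd(url: str, out: str = "page.png") -> list[str]:
    """Build the Firefox headless full-page screenshot command."""
    return ["firefox", "--headless", "--screenshot", out, url]

def take_screenshot(url: str, out: str = "page.png") -> None:
    # Launches Firefox headlessly and writes a full-page PNG to `out`.
    subprocess.run(screenshot_cmd(url, out), check=True)

if __name__ == "__main__":
    take_screenshot("https://example.com")
```

Handy for archiving a batch of bookmarks as images in a loop, though you lose the selectable text you'd get from an HTML snapshot.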


> Terraform is better in almost every way.

I disagree. I spent time with Terraform a few years ago working with a client, and Terraform had the ability to create but not tear down resources for some services. I was shocked; check out the GitHub issues history. I ended up writing a "bunch of bash script wrappers or similar" around it.


Yeah, this is exactly what I was talking about. It's not to say that CloudFormation can't run into the same thing (deleting S3 buckets is difficult in any situation), but there are a lot of things Terraform doesn't/won't do, and often you're left with building your own second layer of automation to work around it.


> DEFCON: The waffle house of security conferences

I'm afraid I haven't been to enough Waffle Houses to know whether this is a good thing or a bad thing.


They're referring to the Waffle House Index.

https://en.wikipedia.org/wiki/Waffle_House_Index


Too bad the Wikipedia entry is completely wrong about when he coined the term. If it was in response to the 2011 tornadoes in Joplin, then how come I have a video of him giving the Waffle House Index anecdote in November 2009 at RHoK #0 (Random Hacks of Kindness hack day)?



It is never bad to be the Waffle House of a thing.


Though actual Waffle House locations will vary...


I'm not particularly convinced by the content of the article. But there is a glaring typo I hope the author fixes:

> 2. They do not know how WHY their copywriting formula works


Fixed. And thanks, that typo was really bad.


Get it on foreign embassy employees, and then if it shows up on one of your staff, you'll know they talked to that foreign embassy.


People don't necessarily touch, especially anymore, and that's unlikely to change any time soon.

MEMS camera "dust" with a/v recording, storage, and GNSS would be nice.


I've been wanting to re-create my blog & about/personal page. Are there similar write-ups for beautiful blogs/resume/about pages?


I like the Refactoring UI YouTube series.[1] Schoger takes a real website and makes incremental changes to make it gorgeous. He justifies his decisions and just generally gives you a good intuition for why he makes a given tweak, which has helped me far more than any other tutorial or blog post I've followed.

[1] https://www.youtube.com/playlist?list=PLDVpvW8ghDr9tasku_Yvu...


I have a few links that might help.

  - Web Design in 4 Minutes https://jgthms.com/web-design-in-4-minutes/
  - Beginner’s Guide to the Programming Portfolio https://leerob.io/blog/beginners-guide-to-the-programming-portfolio
  - Improving My Next.js MDX Blog https://leerob.io/blog/mdx

There's also a great directory of personal sites here: https://personalsit.es/


great links, thank you!




It would be amazing to see the Game Boy/N64 emulator hacker community expand, and NEW games begin to be developed based on these docs. Maybe Nintendo could double down on this and turn it into good publicity.


The article mentions that this would be unwise due to these documents being released illegally. Reverse engineering some existing system for the purpose of interoperability (as is common among homebrew enthusiasts) is legal; hacking into a Nintendo subcontractor's IT and leaking internal documentation obviously isn't.


A developer can use the leak as a reference without being the one responsible for the leak. It sounds like it would be hugely beneficial.


If Nintendo can prove they used it as a reference, that's evidence for a nasty lawsuit.

There's a reason why people go to these lengths:

https://en.m.wikipedia.org/wiki/Clean_room_design


Many people already do make new games for Gameboy (and NES/Famicom) without these documents.


> counter cyclical industry

What are some examples of a "counter-cyclical industry"? I have not heard this term before.


How does it eject this energy to propel itself through space, if it's just spinning in circles inside the engine?


There is a comment at the bottom of the article from an expert in the Air Force:

> William Hargus, lead of the Air Force Research Laboratory's Rotating Detonation Rocket Engine Program, is a co-author of the study and began working with Ahmed on the project last summer.

> "As an advanced propulsion spectroscopist, I recognized some of the unique challenges in the observation of hydrogen-detonation structures," Hargus said. "After consulting with Professor Ahmed, we were able to formulate a slightly modified experimental apparatus that significantly increased the relevant signal strength."

> "These research results already are having repercussions across the international research community," Hargus said. "Several projects are now re-examining hydrogen detonation combustion within rotating detonation rocket engines because of these results. I am very proud to be associated with this high-quality research."

