(We detached this comment from https://news.ycombinator.com/item?id=46224867. It's fine and interesting, but the offtopicness of you-know-who is a bit too agitating at the top of the thread.)
If you read the article, Calibri usage was instituted during the Biden administration. So, there's probably a diversity of government styles that get involved with typefaces.
Forgive my ignorance but this seems to be one of the most neutral things Hitler did. He just didn't like the font so he ordered it to be changed. Equivalent to your boss ordering tabs be used instead of spaces. After the war was lost the arguments just continued. https://en.wikipedia.org/wiki/Antiqua%E2%80%93Fraktur_disput...
I guess if Russia invaded Western Europe and Putin decided to switch from Cyrillic to Latin script so the subjugated peoples would more easily read and learn Russian, that would be neutral too?
About the "bad argument", I can't argue with you, because I'm not the one arguing. You'll have to take it up with the author of these lines:
"In a hundred years, our language will be the European language. The nations of the east, the north and the west will, to communicate with us, learn our language. The prerequisite for this: The script called Gothic is replaced by the script we have called Latin so far"
(Besides, what's so strange about transposing Cyrillic to Latin? It happens all the time even today when people don't want to or can't switch keyboard layouts.)
I tend to agree with you: many people are passionate about typefaces, and dictators are no exception. [Passion about typefaces] seems to be a low-signal detector for dictators. I'm passionate about lasagna, and I'll bet Mussolini was too -- but that probably doesn't mean I'm a fascist.
But if you go around and tell everyone you meet that they're doing it wrong and that lasagna MUST be prepared exactly the way you do it, because it's the one and only right way, then you're a lasagna-nazi :)
It didn't happen in isolation though. There were a few changes that used aesthetics as cultural influence and to define what being properly German should mean. Another, more explicit one was music: https://en.wikipedia.org/wiki/Music_in_Nazi_Germany It was literally opposed to the idea of diversity and inclusion. Much like this change.
And just like with the font, that shaped preferences for years.
That's still using their other cultural choices to manufacture a problem with producing consistency in typeface. It's a stretch. Any good (don't take this out of context, please) leader will settle these kinds of trivial internal disputes and move on to important problems.
I'm not sure why you mention consistency. The cable explicitly says it's (a) for decorum and (b) anti-DEI. That's literally the same reasoning as the music restrictions - that's why I'm bringing it up.
> He just didn't like the font so he ordered it to be changed.
There is your answer. He imposed his will - that's what dictators do. You have to be careful when the reason for any costly change is one individual's personal preferences. It's a bad omen.
> Equivalent to your boss ordering tabs be used instead of spaces.
That's not always equivalent, especially if it is to set a standard. Obviously, some people using spaces and others using tabs is not ideal in the situations you're referring to. It's also fine to change the standard if a significant problem is found with the current convention. But if your boss wants it changed, and their only explanation is their dislike of the status quo, then that's a red flag. The problem isn't very serious right now, but it could grow into one, and you have to keep watch.
While you'll get no argument from me about the Biden government being fascist adjacent, no. The font was chosen by that government for accessibility reasons. The font has now been changed for purely aesthetic reasons, attaching the politics of anti-DEIA to a particular aesthetic (serifed fonts).
As for the politics of that government, a history lesson: in 1930s Germany, Liberals did nothing to halt the rise of the NSDAP, seeing them as economic allies if not political ones. They sold out their country and turned a blind eye to genuine evil, for profit and for the reduction of the political influence of their workforce.
No. The problem is not even about which font is actually more accessible. It's the self-proclaimed reasoning. Rubio, by his own words, states that the change is about aesthetics and anti-DEIA politics.
However, if you want to argue about actual accessibility, which is not what is happening in the Dept. of State, the US government's own accessibility guidelines contradict the idea that serif fonts are more accessible: https://www.section508.gov/develop/fonts-typography/
Do you happen to know anyone with a reading disability at all? A dear friend of mine has dyslexia, and I've seen first hand how important this stuff is for his comprehension.
Should text be made less accessible to read for everybody else, in order to accommodate people with dyslexia? Because everybody who reads a lot prefers serif fonts, since they are easier to read. That's why books are printed in serifs.
Since it's all digital, none of this should be a problem in 2025.
That is so good to hear. I feel Rust support has come a long way in the past two years, and you can write a functional Rust kernel module now with almost no boilerplate.
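For anyone curious what "almost no boilerplate" means concretely, a minimal module is roughly the following. This is a sketch modeled on the kernel's in-tree samples/rust examples; the exact module! fields (e.g. author vs. authors) have shifted between kernel versions, so treat the spelling as illustrative:

    // Sketch of a minimal Rust kernel module, in the spirit of
    // samples/rust/rust_minimal.rs (field names vary across versions).
    use kernel::prelude::*;

    module! {
        type: Hello,
        name: "hello",
        authors: ["Your Name"],
        description: "Minimal Rust kernel module sketch",
        license: "GPL",
    }

    struct Hello;

    impl kernel::Module for Hello {
        // Runs when the module is loaded.
        fn init(_module: &'static ThisModule) -> Result<Self> {
            pr_info!("Hello from Rust!\n");
            Ok(Hello)
        }
    }

    impl Drop for Hello {
        // Runs when the module is unloaded.
        fn drop(&mut self) {
            pr_info!("Goodbye from Rust!\n");
        }
    }

That really is the whole thing: the module! macro generates the C-side registration glue, init runs on load, and Drop runs on unload.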
Removing the "experimental" tag is certainly a milestone to celebrate.
I'm looking forward to distros shipping a default kernel with Rust support enabled. That, to me, will be the real point of no return, where Rust is so prevalent that there will be no going back to a C only Linux.
A few distros already do that. Off the top of my head, both NixOS and Arch enable the QR-code kernel panic screen, which is written in Rust. Granted, those are rather bleeding edge, but I know a few more traditional distros have it enabled (I _think_ Fedora does? But not sure).
For now there aren't many popular drivers that use Rust, but there are currently three in-development GPU drivers that use it (Nova for NVIDIA, Tyr for Arm Mali, and the Asahi driver for Apple GPUs), and I suspect that when those get merged, that'll be the real point of no return.
I suspect the first one of those to be actually used in production will be the Tyr driver, especially since Google's part of it and they'll probably want to deploy it on Android, but for the desktop (and server!) Linux use-case, the Nova driver is likely to be the major one.
I believe Google has developed a Rust implementation of the binder driver which was recently merged, and they are apparently planning to remove the original C implementation. Considering binder is essential for Android that would be a major step too.
This is so cool! I love things like that, it feels like fresh air after years and years of anachronistic retro-vibes that seem to be a part of C-programming culture.
Arch never failed me. The only time I remember it panicking was when I naively stopped an upgrade in the middle, which left the initramfs ungenerated, but I quickly fixed it by chroot'ing in and running mkinitcpio. Back up in no time.
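For anyone who hits the same thing, the recovery is roughly this (from the Arch live ISO; device names are made up, substitute your own partitions):

    mount /dev/sda2 /mnt          # your root partition
    mount /dev/sda1 /mnt/boot     # your boot/ESP partition
    arch-chroot /mnt
    mkinitcpio -P                 # regenerate the initramfs for all presets
    exit
    reboot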
> I'm looking forward to distros shipping a default kernel with Rust support enabled
What does this mean? The kernel has no specific support for Rust userspace -- the syscall API does not suddenly change because some parts of the kernel are compiled with rustc.
As I understand it, the kernel has a build flag, off by default, that enables building Rust code as part of the kernel. They're saying they'd like to see distros default that flag to on for the kernels they build.
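Concretely, that's the CONFIG_RUST kconfig option. A quick way to check whether your distro's kernel was built with it (assuming the usual places distros put the config; /proc/config.gz only exists if the kernel was built with CONFIG_IKCONFIG_PROC):

    # Debian/Fedora style: config shipped next to the kernel image
    grep 'CONFIG_RUST=' /boot/config-$(uname -r)

    # Arch style: config embedded in the running kernel
    zgrep 'CONFIG_RUST=' /proc/config.gz

No output means the option is absent or unset; CONFIG_RUST=y means Rust support was built in.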
Because someday the programming state of the art must advance beyond 1970, and someday we must stop introducing new memory safety bugs that cause horrific vulnerabilities, and for kernel code we don't have the luxury of recompiling everything with garbage collection turned on as a band-aid for C's incurable defects.
People doing open-source work often feel very tribal about their code and block ideas that are good but threaten their position in the community. Essentially same thing as office politics except it's not about money, it's about personal pride.
Why? It depends on what you want to replace, but the Rust project has a long history of using tools written in other languages. Python is used a lot in build tooling, bors-ng was written in Elixir, LLVM is written in C++, ... gcc-rs doesn't contain a lot of Rust code either; it's purely other languages, according to https://github.com/Rust-GCC/gccrs
Fundamentally, if a tool is good and provides benefits, then why not use it? You'll be met with concerns about maintainability if you just airdrop something written in Ada/SPARK, but that's fair - just as it was fair that the introduction of Rust in the Linux kernel was indeed met with concerns about maintainability. It seems that those were resolved to the satisfaction of the group that decides things, and the cost/benefit balance was considered net positive.
If anyone submitted such a PR to one of my projects and could explain compelling benefits for why doing so would be a general improvement, even considering trade-offs of increased complexity/maintainability, etc., then I'd be delighted they'd cared enough to improve my software.
How is the drive-by statement of a random GH account with 9 followers representative of any community? What's the point you're trying to make? That there are people with shitty behavior in the Rust community? No surprise here, there are. That there are trolls out there who just do this for fun? It's the internet! I hope that doesn't surprise anyone by now.
How is the one comment from a GH account with 18 followers, no contributions to the Rust project (or any Rust-based project at all), and mostly contributions to JavaScript projects in any way representative of the Rust community? Especially taken out of context - in context it seems like a failed(?) attempt at humor or sarcasm.
Do I treat your post from an obvious throwaway account created 13 minutes ago as somehow representative of the C community or the Linux kernel community or for that matter as representative of any community at all?
Come on, please. There are a ton of things that I consider worthy of criticism in the Rust community, but make a better case.
> Do I treat your post from an obvious throwaway account created 13 minutes ago as somehow representative of the C community or the Linux kernel community or for that matter as representative of any community at all?
For all we know it could be the same person behind both the GitHub post and the Hacker News throwaway you're speaking to.
I’m not really sure what point you’re making. All I see there is - or was - some unclear management within Linux itself.
- There was an experiment to have Rust in Linux. It got provisional approval from Linus.
- Some people in the Linux kernel blocked essentially any changes that would let the Rust experiment succeed.
- Some people who were trying to get Rust into Linux got frustrated. Some quit as a result.
- Eventually Linus stepped in and told everyone to play nice.
This whole drama involved like 5 people. It's not "the majority of the Rust community". And it kinda had nothing to do with Rust the language. It was really a management / technical leadership challenge. And it seems like it's been addressed by leadership stepping in and giving clearer guidelines around how Rust and C should interoperate, both at a technical and an interpersonal level.
So what's your point? Is Rust bad because some people had an argument about it on the Linux kernel mailing list? 'Cos uh, that's not exactly a new thing.
Add in Theseus, Tock, Hubris, and Hermit OS. And those are just the non-academic ones. As for why none of them are widely used? Drivers. It wasn't that long ago that Redox didn't even support USB devices. The Linux kernel is a giant mashup of oodles of drivers.
The "provide value" argument can be used for anything. Modern software politics and foundation sponsorship pressure are so complex that this argument may not even be true.
It may be true in this case, but certainly you have seen corporate bloat being added to "open" source projects before.
> Modern software politics and foundation sponsorship pressure are so complex that this argument may not even be true.
May not, or may well be. From my own interactions with the Linux kernel community, it's very different from typical "modern software politics", at least different enough that I'd claim they're special among communities.
If there is any community I'd assume makes most of its choices disregarding politics or pressure, purely on technical merit, it would be the Linux kernel community. With that said, no person or community is perfect, but unless there is some hard evidence that the Linux people were forced to accept Rust into the kernel for whatever reason, I'd refrain from believing such theories.
You're right that I'm appealing to authority, but that doesn't make my argument invalid.
The people who decided Rust has value in the kernel are Linux kernel developers, i.e., traditional C developers who see value in using Rust in Linux. Rust in Linux has caused plenty of headaches. If Rust didn't add commensurate value to make the trouble worth it, it wouldn't be getting adopted.
They wrote their own language (C) too. They invented a new language because the current crop of languages didn't suit their needs. Your argument ignores the parts of history that are inconvenient and highlights the ones that you think support it.
"Although we entertained occasional thoughts about implementing one of the major languages of the time like Fortran, PL/I, or Algol 68, such a project seemed hopelessly large for our resources: much simpler and smaller tools were called for. All these languages influenced our work, but it was more fun to do things on our own."
People arguing for UNIX/C also tend to forget the decade of systems languages that predates them, starting with JOVIAL in 1958.
They also forget that after coming up with UNIX/C, they went on to create Plan 9, which was supposed to use Alef, abandoned its design, later acknowledged the lack of GC as a key issue, created Inferno and Limbo, and finished with contributions to Go's original design.
"Alef appeared in the first and second editions of Plan 9, but was abandoned during development of the third edition.[1][2] Rob Pike later explained Alef's demise by pointing to its lack of automatic memory management, despite Pike's and other people's urging Winterbottom to add garbage collection to the language;[3] also, in a February 2000 slideshow, Pike noted: "…although Alef was a fruitful language, it proved too difficult to maintain a variant language across multiple architectures, so we took what we learned from it and built the thread library for C."[4]"
I think when Fish shell announced their Rust rewrite, they specifically highlighted that, in the form of "being more attractive to contributors", as one of the reasons.
In-general it seems that Rust proponents want ease and modernity to replace the ability to do the same in C better, with more control and intent, and with better results.
Rust is less tight. That’s fine for an application, but the kernel is used by everyone who uses Linux.
Rustians disparage C because it doesn’t have Rust behavior.
I don't think C should be disparaged at all. Just because I prefer Torx screws doesn't mean Phillips screws were a horrible idea. They were brilliantly simple and enormously effective. I can't think of a single situation in which I wouldn't prefer Torx, but Torx wasn't an option historically, and Phillips was not the wrong decision at the time. Times change.
Me too, definitely. Should I get bored I could always go about and insult every being that ever lived and will live in the entire universe - in alphabetical order.
In git you can have only one worktree per branch. For example, if you have a worktree on main you cannot have another one on main.
I personally find this annoying. I usually like to keep one pristine and always current working copy of main (and develop if applicable) around for search and other analysis tasks[1]. Worktrees would be ideal and efficient but due to the mentioned restriction I have to either waste space for a separate clone or do some ugly workarounds to keep the worktree on the branch while not keeping it on the branch.
Jujutsu workspaces are much nicer in that regard.
[1] I know there are tons of ways to search and analyze in git, but over the years I've found a pristine working copy to be the most versatile solution.
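For illustration, here's what the restriction described above looks like in practice (paths made up; the exact error wording varies by Git version):

    $ git worktree add ../repo-main main
    fatal: 'main' is already checked out at '/home/me/repo'

You can override it with --force, but then the two worktrees fight over the same branch pointer.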
You probably know this, but for others that don't: local git clones will share storage space with hardlinks for the objects in .git. The wasted space wouldn't be a doubling, it would be the work tree twice plus the (small) non-hardlinked bits under .git. No idea how LFS interacts with this, but it can be worth knowing about this mechanism.
Also, if you end up relying on it for space reasons, it's worth knowing that cloning from a file:// URL switches the hardlink mechanism off, so you end up with a full duplicate again.
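A quick sketch of the difference (same machine, same filesystem; paths are illustrative):

    git clone /src/repo copy1           # local path: objects are hardlinked, cheap
    git clone file:///src/repo copy2    # file:// URL: full copy, no hardlinks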
> In git you can have only one worktree per branch. For example, if you have a worktree on main you cannot have another one on main.
You can detach the worktree (check it out with a detached HEAD), and check out multiple branches at the same time in different locations. Not sure if this also allows checking out the same branch in multiple locations at the same time. You can also use a shallow clone, so you don't have to spend space on the full repo history. So in the end you still spend space for each worktree, but that isn't something jujutsu can avoid either, or can it?
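For reference, a shallow clone of a local repo looks something like this (Git ignores --depth for plain local paths, hence the file:// URL; names are illustrative):

    git clone --depth 1 --single-branch --branch main \
        file:///src/repo repo-shallow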
This restriction of git worktrees is annoying but I just learned one simple rule to follow:
Never check out the main development branch (main/master/develop/etc.) in other worktrees (non-"main worktree", in git-worktree nomenclature). Use another name with a "wt-" prefix for it. Like in:
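(A sketch of the pattern; paths and branch names are illustrative.)

    # from the main worktree: park a throwaway "wt-" branch on main's tip
    # instead of checking out main itself
    git worktree add -b wt-main ../myrepo-wt-main main

    # later, to bring that worktree up to date with main again:
    git -C ../myrepo-wt-main reset --hard main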
And to be honest, after being disciplined about always doing that, I very rarely get the error message saying that the branch I want to check out is already checked out in another worktree. Before that, I regularly had the situation where I checked out the main branch in a second worktree to see the current state of the repo (because my main worktree had work-in-progress stuff), and some time later, when I finished the work and tried to check out the main branch in my main worktree, I got the error, because I had totally forgotten that I'd checked it out in the other worktree.
That sounds like a nice improvement, just like many other aspects of jj!
Tools should adapt to us and not the other way around, but if you are stuck with git, there's a slightly different workflow that supports your use case: detached head. Whenever I check out branches that I don't intend to commit to directly, I check out e.g. origin/main. This can be checked out in many worktrees. I actually find it more ergonomic and did this before using worktrees: there are no extra steps to keep a local main pointer up to date.
The detached head is what I meant with keeping it on the branch while not keeping it on the branch.
The complication comes from trying to stay current. With a regular worktree I could just pull, but now I have to remember the branch, fetch everything, and reset hard to the remembered branch.
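Roughly this, in my case (assuming the remote is called origin and the worktree is parked on origin/main):

    git fetch origin
    git checkout --detach origin/main   # hop HEAD onto the freshly fetched tip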
> In git you can have only one worktree per branch.
Well, that is true, but a branch is nothing more than an auto-moving label, so I don't see how that is limiting at all. You can have as many branches as you like, and you can also just check out the commit.
Images could be doctored convincingly at least since the 1920s, and yet people used "pics or it didn't happen" unironically nearly 100 years later. Video will be no different. People believe what they want to believe and will never let truth get in their way.
These processors were very very different from what we have today.
They usually only had a single general purpose register (plus some helpers). Registers were 8-bit but addresses (pointers) were 16-bit.
Memory was highly non-uniform, with (fast) SRAM, DRAM and (slow) ROM all in one single address space.
Instructions often operated on RAM directly, and there was a plethora of complicated addressing modes.
Partly this was because there was no big gap between processing speed and memory access; that gap today makes it very unlikely that similar architectures will ever come back.
As interesting as experiments like LLVM-MOS are, they would not be a good fit for upstream LLVM.
> ... there was no big gap between processing speed and memory access, but this makes it very unlikely that similar architectures will ever come back. ...
Don't think "memory access" (i.e. RAM), think "accessing generic (addressable) scratchpad storage" as a viable alternative to both low-level cache and a conventional register file. This is not too different from how GPU low-level architectures might be said to work these days.
Great point. And you can even extend that to think like a 6502 or GPU programmer on an AMD, ARM or Intel CPU as well if you want the very best performance. Caches are big enough on modern CPUs that you can almost run portions of your code in the same manner. I bet TPUs at Google also qualify.
When I grew up in Germany, it always made me proud that 100% of taxis were Mercedes-Benz. If a car can withstand the rough demands of taxi service, it has to be good. And even in South America back then, German cars were ubiquitous, especially Volkswagen.
When I was in Brazil this spring[*] I rode a lot of Ubers, and they were 100% BYD - 100%, no exception. It's not that I hadn't known German auto was dead, but seeing it play out like this hit hard.
BYD recently went live with a highly automated, large scale manufacturing facility in Brazil. The BYD Dolphin Mini sells for ~$22,500, and the manufacturer already has 200 showrooms open across the country.
You have to consider that South America is the most dangerous continent.
You can't just leave your car charging unattended in a public space. It has to be done at home or somewhere closed (which would make it expensive) or you would have to watch over your car (which would take a lot of your time).
Mercedes and BMW serve a wider band of the market in Germany than they do in the USA, where they have purposefully cultivated themselves as pure luxury brands. For example, the BMW 1 and 2 Series or the Mercedes A-Class will never come to the USA, even though I'm sure there is a market for them.
They used to make them with quality construction. Now it's all engineered with plastic bits that will only last for the first 5 years the rich owner will be using it before tossing it out.
I remember being in a Mercedes in France in the 80's and noticing it had manual crank windows. My dad in the US (even then!) hated added electronics in cars so he went to a Mercedes dealer and they explained that in the US we could only get fully loaded models.
1. Batteries - BYD has them beat
2. Self Driving tech - other players are better
3. Luxury brands already provide the luxury aspect & even better built cars
4. In the US they're being saved by US protectionism. In Europe etc. we already see the Chinese brands making inroads in EV sales.
I mostly agree on all points, but what self driving tech is better? I've periodically looked at the options, and nothing really seems to compare in North America. Maybe BYD and others have great tech, but stuff like Blue Cruise works hardly anywhere in Canada, and to me, that makes it virtually useless.
He’s probably thinking of robo taxi self driving. So that would be e.g. Waymo.
I don’t think anyone has better self driving for consumers out atm, but you could argue that’s because other companies are not using their customers as beta testers. I’ve seen demos that may indicate Mobileye has tech that’s just as good if not better. But they don’t release it to end users until it’s fully ready.
I don't think Tesla has any special sauce, and that when the tech is actually ready for unattended full self-driving in a consumer car, other car makers will come out with solutions around the same time as Tesla. One difference is maaaybe Tesla will be able to update old cars (probably with a hardware update), while I think others will only support it on new cars.
> Tesla will solve self-driving and everyone will be left unable to compete. Also, AI is advancing rapidly and will solve all kinds of problems for society.
But apparently it will not solve self-driving for anyone else but Tesla.
I gave up trying to argue with Tesla fans years ago. They are immune to logic which invalidates their priors.
http://comma.ai isn't self-driving, just really good cruise control (better than Blue Cruise, imo), but most importantly, you can get it today. (And for less than $8k.)
Protectionism on inputs kills manufacturing. Imagine having to pay 15% more for all inputs and trying to compete with someone who doesn't have to pay that.
Well, at least domestically you don’t have to compete with someone who doesn’t have to pay that because their product is probably tariffed directly.
Internationally, yes if you manufacture the international product in the home country, but AFAIK in auto at least there are usually satellite factories and have been for some time, and those wouldn’t be subject to home country tariffs would they?
Precisely, you need to set up plants overseas to dodge the input tariffs instead of onshoring manufacturing for export. That causes reductions in manufacturing investment compared to the alternative.
In that hypothetical they can't compete because of labour protectionism and immigration restrictions, not because there's some intrinsic reason it's too expensive to manufacture in one country versus another.
Not only that, Teslas nowadays look like they belong in a museum - so fucking old and outdated. I own a 2014 Tesla S; my neighbour has a 2025 Tesla S. Same fucking car - literally. Mine was THE shit back in the day, lots of broken necks looking at it ... now, 12 years later, someone (fewer and fewer) is paying $90k for exactly the same car. The Tesla X was great looking - circa 2016... The Tesla 3 is like a Kia, and the Model Y is just a 3 that's been blown up a bit.
> My car was THE shit back in the day, lots of broken necks looking at it … now, 12 years later, someone (fewer and fewer) is paying $90k for exactly the same car
That's what Porsche also discovered, the hard way.
Tesla also has a big Elon problem in that the blue cities where self-driving Taxis will be most profitable may opt for Waymo or boycott Tesla over politics.
I think the issue is that creating an ICE is a very complicated process requiring lots of specialist knowledge, skills, and technologies. An EV is just much simpler; it comes down to who has the cheapest batteries. Europe and Japan are great at the former; at the latter, no chance.
I'm sure some of it is personal bias from experience with them, but I don't think ICEs are as complicated as some people think. 90% of the extra shit on them is unnecessary for the engine to work, but what those things are, what they do, why they broke or failed, and how important they are is essentially obfuscated from the general public, so they seem like overly complicated magic.

The vast majority of cars I see do not fail or get trashed due to engine failure from design flaws or anything. Most get trashed because people stop caring about them, treat them like trash, and don't replace that $15 sensor. Others think they can't afford the maintenance, because car manufacturers don't give a fuck that replacing a 30-cent sensor they know will eventually need replacing takes 3 hours of disassembling other unrelated shit, and the only number the customer seems to look at is the total on the mechanic's quote. They think that because something is a $1,000+ repair, something must be seriously worn or old and the car is on its last legs, when the reality is that the one part is a huge pain in the ass to replace and it's otherwise a good, reliable motor for another 100,000+ miles.

And of the cars that do get trashed because they have actual major mechanical problems, the vast majority of the problems have to do with the body rusting out and/or suspension components needing replacement after being used for 3x their expected lifetime, which an EV is not going to improve in any way.
I've seen people junk perfectly good, working cars because they passed 150,000 miles and some service guy looking for work/money told them they'd need scheduled maintenance soon, so they decided the car was too old and was junk. And yet very few of the cars I've seen wouldn't make it past 300,000 miles if their owners spent even 1/10th of the money a new car costs on maintaining the old one.
I thought the comment was somewhat helpful. It sparked some thinking about the various anti-patterns in automobile design, and searches came back with several others that I'd vaguely thought about but never really identified very clearly:
- Inaccessible Components (Poor Physical Layout): One of the main ones you're talking about. Take out the engine to repair a light on the dashboard.
- Integrated, Non-Modular Systems: Minor damage or failure ruins an entire assembly. You dinged the bumper, replace the entire front.
- Lack of Standardization: Even from year-to-year, designs change and mechanics have to learn yet another system.
- Forced Replacement over Repair: Object is "black box", thou shalt replace, not repair.
- Dead/Onion/Boat Anchor Components: No longer used, maybe need it, build stuff on top of it, layer after layer, "can we even remove it"?
- Spaghetti Wiring/Code and System Coupling: Single modules that route all over the car, another "can we even remove it"?
- Proprietary Diagnostics and Restricted Data Access: Don't have the special tools, you can't repair, or even find out what's wrong.
EVs are very complicated cars anyway. They need service and maintenance just like ICE cars. Yes, you don't need to change fluids as often, but service is still required. Also, good luck with water/rodent damage to the 400V parts.
Optimizing costs while producing a safe, reliable, durable vehicle isn't exactly simple, and it requires an entire supply chain to be in place, not just a single company. Look at how many auto manufacturers there are in the world that turn out terrible cars. EVs dramatically lower the parts count, which helps, but you still need a lot of expertise to make a safe, reliable car.
My grandfather was a mechanic and told me how replacing a dashboard light in some models required removing large portions of the engine to access the socket.
To be fair, taxis have unique requirements. Taxis in the UK were like 80% Prius for a long time because they drive very long distances, and hybrids are very cost-effective in city driving, where you're doing a lot of low-speed driving and don't have convenient recharging opportunities. But most people aren't in that situation.
Unless you have really cheap electricity at home (not like in the EU), the best price per km is an old Prius converted to LPG. Also reliable; there are tons of them with 500k+ km on the clock.
I've been riding a German electric motorbike for a couple of years, and before that, German electric mopeds.
I think there is a lot of innovation in the German electric vehicle industry. I am quite excited for BTM, my bike's manufacturer, to design and release new versions of their platform. This model is distinctly German.
Note that the Mercedes used as taxis in Germany are not the high-end luxury imports we are accustomed to seeing in the US. Mercedes makes a lot of more affordable cars for their domestic market that we never see here!
Sadly the German car industry has lost its way in the EV transition and is now vainly trying to get the EU to roll back the sunsetting of ICE car sales in 2035.
In the UK, for a very long time they were Skoda Octavias.
I know of two ex-taxis, each with over half a million miles on the clock, that were scrapped at about five or six years old - one was taken off the road because of a deep paint scratch down to bare metal, running from about halfway along the front wing to the rear door, which rendered it beyond economic repair.
Neither had been outside the Greater Glasgow area since they were dropped off on the transporter.
In 1941, Adolf Hitler personally gave the order to make the use of Antiqua mandatory and forbade the use of the Fraktur and Schwabacher typefaces.
https://ligaturix.de/bormann.htm