I wonder who originally authored the font, and when it was created. The site cites Leroy Lettering as the likely origin, so presumably it was someone there?
Why? Assume that the value of these altcoins does not change, and see how many you can mine given a certain amount of time. The value of altcoins could very well go down, but for the sake of demonstration you could still calculate a very rough estimate, no?
> Assume that the value of these altcoins does not change, and see how many you can mine given a certain amount of time. The value of altcoins could very well go down, but for the sake of demonstration you could still calculate a very rough estimate, no?
And likewise, if you assume values for the cost of apple seeds and land, see how many apples you can grow given a certain amount of time - and the value of apples could very well go down - but for the sake of demonstration you can also calculate a very rough estimate of apple orchard profitability, no?
One problem is that the cost of electricity varies enormously. In some times and places it can cost many times the average. In others it can be literally free.
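To make the "very rough estimate" idea concrete, a back-of-the-envelope sketch might look like this - every number in it is a made-up placeholder, including the electricity rate, which as noted varies enormously:

```python
# Back-of-the-envelope mining profitability estimate.
# All inputs below are hypothetical placeholders -- plug in your own numbers.

coins_per_day = 0.05             # coins mined per day at your hash rate (assumed)
coin_price_usd = 2.00            # assumed constant coin price, per the argument above
power_draw_kw = 0.35             # rig power draw in kilowatts (assumed)
electricity_usd_per_kwh = 0.12   # varies enormously by time and place (could even be 0)
days = 30

revenue = coins_per_day * days * coin_price_usd
electricity_cost = power_draw_kw * 24 * days * electricity_usd_per_kwh

print(f"Revenue:      ${revenue:.2f}")
print(f"Electricity:  ${electricity_cost:.2f}")
print(f"Rough profit: ${revenue - electricity_cost:.2f}")
```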
I guess this is great for contemporary squatters: steal electricity from the grid, get paid for the electricity used to heat your squat, by mining and selling bitcoin...
I suppose they mean taking Apple's "We know what is good for you better than you do" approach. While Apple currently does not force OS upgrades, they do make a lot of decisions for their users without giving them an opt-out or a way to roll back. This top-down, walled-garden idea is pretty central to contemporary Apple.
Apple makes design choices that are very opinionated about how you should use software, and its internals are hard to see, but they don't make decisions for you.
They decided for me that my updated iPhone 4S should never be able to be restored to factory condition, and that I would be forced to suffer terrible performance and battery life.
They decided for me that all my apps should auto-update. This incentivizes application developers to put their users on a constant upgrade treadmill (similar to Apple). I guess there is so much amazing "innovation" happening in every update that they decided users absolutely must want every single update. This has caused developers to never consider a software feature-set as released, finished, and free of bugs. They have taken this horrible mindset that only existed on the web and brought it to native apps.
Then there is the heavy-handed approach of forcing you to update your entire OS just to run a developer IDE. Something previously _unheard_ of and frankly embarrassing from an engineering point of view.
Well folks, that's all the time we have for this segment today, tune in tomorrow for more.. :)
One obvious example is the fact that Apple does not allow you to install anything but the latest OS (and possibly one minor revision back) on an iOS device. OS upgrade kills your battery life or makes a critical third-party app stop working? Too bad, so sad, sucks to be you, there's no way to go back.
They don't force you to upgrade, but they artificially block the ability to go back.
Example:
Apple held my iPad 2 hostage until I stopped using iOS 8. I forgot my unlock PIN (because I lent my iPad to someone else), and there is no way to factory reset it without also updating the OS.
So now Flux no longer works, and there is no way to adjust the gamma on this device, which means I no longer like reading on it before bed. And there are plenty of other regressions in iOS 9 as well. And they forced me to update my desktop just to interact with it.
So, Apple very much forced me to update if I wanted to keep using my device. There is no reason that updating iOS should be required to factory reset it.
Isn't that making a decision for you? "users will only ever need to install software from our central app store" is deciding for the user that they don't need to be able to run whatever software they want on their device. It's fine if, as a user, you decide you don't need that capability, but you are still having that decided for you.
Disclaimer: I don't know the current status of 3rd-party apps on iOS, but my point still stands.
They decided that iOS 9 on the 5S should stutter while scrolling throughout the OS. A smoothly working device experience on iOS 8 was broken by an update, and ten minutes later there was no option to revert anymore.
Interesting that it coincided with the release of the SE a couple of months later...
All vendors suck; they just all have their own incentives.
About a year ago I was living in a place with a broken doorbell, so I wrote a web page (hosted on my Pi) that would send me a text when it was loaded. I then printed a QR code for the page that stated clearly "To ring bell, scan code". Users would scan the code, which would immediately text me "Ding Dong!", and the page would then let them enter their name. If they did, I would receive a follow-up text: "It's [name]". Sadly, in my year of living there, only a handful of people ever used it.
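A rough sketch of how such a page might look - Flask and Twilio are just assumptions for illustration, the original setup could have used any stack:

```python
# Minimal sketch of a "scan the QR code to ring the bell" page.
# Flask and Twilio are assumptions -- the original setup may have used anything.
from flask import Flask, request
from twilio.rest import Client

app = Flask(__name__)
sms = Client("TWILIO_ACCOUNT_SID", "TWILIO_AUTH_TOKEN")  # placeholder credentials
MY_PHONE = "+15550000000"    # where the "doorbell" texts go (placeholder)
FROM_PHONE = "+15550000001"  # sending number (placeholder)

@app.route("/")
def ring():
    # The QR code points here: the text goes out as soon as the page loads.
    sms.messages.create(to=MY_PHONE, from_=FROM_PHONE, body="Ding Dong!")
    # Simple form so the visitor can optionally say who they are.
    return ('<form method="post" action="/name">'
            'Who is it? <input name="name"><button>Send</button></form>')

@app.route("/name", methods=["POST"])
def name():
    sms.messages.create(to=MY_PHONE, from_=FROM_PHONE,
                        body=f"It's {request.form.get('name', 'someone')}")
    return "Thanks! Come on in."

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8000)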
Don't forget the people like myself with learning disabilities that require tech to organize and communicate. If schools were more willing to accept students who needed to use technology rather than writing them off as lazy or 'slow' the world would be a better place.
Yes, in principle. If they are not obfuscating and/or encrypting their data, which is a big if. And you'd be bound to keeping that packet sniffer online forever, piping matching data to where you need it.
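For illustration, a long-running sniffer that pipes matching data onward might look roughly like this - scapy, the port filter, and the destination URL are all assumptions, not anything from the original comments:

```python
# Sketch of a long-running sniffer that forwards matching packet payloads elsewhere.
# scapy, the BPF filter, and the destination URL are assumptions for illustration.
import json
import urllib.request

from scapy.all import sniff, Raw

FORWARD_URL = "http://example.invalid/ingest"  # hypothetical collector endpoint

def forward(pkt):
    # Only forward packets that actually carry a payload.
    if pkt.haslayer(Raw):
        payload = bytes(pkt[Raw].load)
        data = json.dumps({"len": len(payload), "hex": payload.hex()}).encode()
        req = urllib.request.Request(FORWARD_URL, data=data,
                                     headers={"Content-Type": "application/json"})
        try:
            urllib.request.urlopen(req, timeout=2)
        except OSError:
            pass  # collector unreachable; drop rather than crash the sniffer

# Runs until interrupted -- "online forever", as noted above.
sniff(filter="tcp port 8080", prn=forward, store=False)
```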
GPUs are really good at parallel tasks such as calculating the color of every pixel on the screen, or doing the same operation on a large dataset. According to Newegg, the GTX980 has 2048 CUDA cores (parallel processing cores) that run at ~1266 MHz, as opposed to a nice CPU which might have 4 cores that run at 4 GHz. In other words, if you want to manipulate a whole bunch of things in one way in parallel, you can program the GPU to do it effectively; if you want to manipulate one thing a whole bunch of ways in series, the CPU is your best bet.
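As a concrete illustration of "doing the same operation on a large dataset", a minimal element-wise GPU kernel could look like this - Numba's CUDA backend is an assumption here (the comment doesn't name any toolkit), and it needs an NVIDIA GPU plus the CUDA toolkit to actually run:

```python
# Same operation applied to every element in parallel -- the kind of work GPUs excel at.
# Numba's CUDA backend is an assumption; requires an NVIDIA GPU and CUDA installed.
import numpy as np
from numba import cuda

@cuda.jit
def brighten(pixels, amount):
    i = cuda.grid(1)              # each of thousands of threads handles one index
    if i < pixels.size:
        pixels[i] = min(pixels[i] + amount, 1.0)

pixels = np.random.rand(1920 * 1080).astype(np.float32)  # one value per screen pixel
d_pixels = cuda.to_device(pixels)

threads_per_block = 256
blocks = (pixels.size + threads_per_block - 1) // threads_per_block
brighten[blocks, threads_per_block](d_pixels, 0.1)  # every pixel updated in parallel

print(d_pixels.copy_to_host()[:5])
```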
Coarse rule of thumb: running on GeForce-class GPUs you can get up to 5x, maaaybe 10x, the performance per dollar compared to a top-line CPU - assuming your problem scales well on GPUs, and many problems don't. The GTX980 is actually a great performer. For Tesla-class systems like the K40 it's a lot closer to parity with the CPU on performance/$ (they're not much faster than the GTX980 but a lot more expensive). But you can get an edge with the Teslas when you start comparing multi-GPU clusters to multi-CPU clusters, since with GPUs you need less of the super-expensive interconnect hardware. (You're not going to put GTX cards in a cluster; you'd have massive reliability problems.)
IMHO, the guys showing 100x speedups on GPUs are Doing It Wrong; they use a poor implementation on the CPU, use just one CPU core, consider a very synthetic benchmark, or pull a bunch of other tricks.
Can someone explain why this is not just something that Microsoft should patch in Windows? I.e., how is this not just a Windows vulnerability that you can use TrueCrypt to take advantage of? Why are drivers able to escalate privilege at all?
A driver is kernel-mode code that's written in C. It can do just about anything, and when there's a bug, you're in trouble.
I'd like to see Microsoft allow more drivers to run in user-mode, but this is just the risk you take when installing drivers. Microsoft has been tightening driver signing requirements, so you can at least be sure they're from a known source.