Interesting announcement from Build: Developers can run Bash Shell and user-mode Ubuntu Linux binaries on Windows 10. So, what’s that all about? Things seem very unclear to me at the moment, and I haven’t had the time to read the many posts springing up here and there, or to determine whether any of them are more informative than Scott’s post I linked to above. The promises are altogether a bit strange and conflicting. For instance, Scott says
This isn’t Bash or Ubuntu running in a VM. This is a real native Bash Linux binary running on Windows itself.
I was just looking into the whole topic of drawing something on screen in a .NET WinForms application so that it appears at the “correct” size. I found a number of issues around the handling of screen resolution and DPI, and in order to remember what’s what, and perhaps help somebody with the same problems, I thought I’d write it all down.
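To give the gist up front: WinForms drawing code is usually written against a design-time assumption of 96 DPI, and anything specified in raw pixels has to be scaled by the ratio of the actual screen DPI to that baseline. A minimal sketch of the arithmetic (Python used only to illustrate the math; the function name is mine, and in WinForms the actual value would come from something like Graphics.DpiX):

```python
def scale_for_dpi(design_pixels, actual_dpi, design_dpi=96):
    """Scale a length designed at 96 DPI to the actual screen DPI."""
    return int(round(design_pixels * actual_dpi / design_dpi))

# A 100 px element designed at 96 DPI needs 125 px at 120 DPI
# (the common "125%" display setting):
scale_for_dpi(100, 120)
```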
It’s weird: in Vista, a lot of people were always complaining about the way UAC interrupted their lives. I didn’t think it was a big deal at all; it just didn’t bother me. Now in Windows 7… well… I’m not sure what it is, but I find UAC a bit odd. The reason is simple: it continuously pops up when I don’t expect it, causing me to waste time. For instance, I run an installer, go through 8 wizard pages, and click the final “will you please get on with that installation, FFS” button. I see the status bar for the installation process pop up on wizard page no. 9, so I walk away, or at least alt-tab out of that VM to do something useful in the meantime. A little later, thinking it’s done, I switch back to the VM, and… I see the UAC dialog hanging there, waiting for me. It hasn’t installed a single byte. Brilliant.
That’s weird; it never seemed to happen like that before. I haven’t gone back and actually compared in detail, but I just never had that impression with Vista. Did the dialog come up earlier in the process, perhaps? I believe it might have. On an install I just did, I noticed that there’s a link in the UAC dialog saying something like “change when this dialog appears”. Sounded great to me, just like what I’d want to do, so I clicked it. Result no. 1: the installer that brought up the dialog decided that I had canceled the installation. No, that’s not right. That sucks. Result no. 2: I can’t actually decide when the dialog comes up, like the link promises; I can only decide (as was always possible, just through a nicer dialog now) under which circumstances UAC requires special confirmation and whether or not to use the secure desktop for it. So – is Win 7 UAC better than before? I don’t know. Seems more irritating to me.
It appears I’ve managed to do a pretty stupid thing - personally I rather feel like Windows let me do a stupid thing, but that’s probably a matter of perspective 😉
Anyway, I was trying out some things with UAC, looking at the way different elevation prompts are displayed depending on whether or not the current user is a member of Administrators, and so on. Previously I was working in what appears to be the default configuration, with my user account being a member of the local Administrators group and the Administrator account disabled and without a password. I activated the Administrator account and gave it a password. Then I removed my own account from the Administrators group and added it to Power Users instead. I played around with this setup and then decided to go back where I came from – only I made a mistake: I disabled the Administrator account first, before adding my own account back to the Administrators group.
Using Local Security Policy settings (specifically the one called “User Account Control: Switch to the secure desktop when prompting for elevation”), I can modify Vista’s behaviour when showing elevation prompts, so that the elevation dialog is simply shown as a normal window on the current desktop. I like this behaviour much better, but I’m wondering if there’s any security related reason why the so-called secure desktop is actually more secure. If you know, please tell me!
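For reference, the policy maps to a registry value; this is, as far as I can tell, what the setting writes (double-check on your own machine before relying on it):

```
Key:   HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows\CurrentVersion\Policies\System
Value: PromptOnSecureDesktop (REG_DWORD)
Data:  0 = show the prompt on the normal desktop, 1 = use the secure desktop
```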
There’s a pretty big problem with shell extensions when you try to run 64 bit Windows – they either come in 64 or 32 bit varieties, and each “group” of extensions will only be loaded into a corresponding host application. In other words, Windows Explorer runs as a 64 bit executable by default and it only loads shell extensions for 64 bit. Many alternative file manager tools only come in 32 bit versions so far, and they will only load 32 bit extensions.
Some extensions come in both 32 bit and 64 bit versions, and it should be possible to install them both. I’ve seen some extensions do this automatically, and then it works fine – but some other vendors distribute both versions of their products with exactly the same file naming, so you can only install one or the other. The whole thing seems a bit stupid to me… I wonder if it wouldn’t be possible to create a 64 bit shell extension that makes 32 bit extensions available in the 64 bit shell. Yes, I know I can’t load the extensions directly, but I could run a separate 32 bit executable that loads them and passes all relevant information over to the 64 bit extension – and the other way round, of course. This could work both ways, actually, also making 64 bit extensions available in 32 bit hosts. Of course that’s quite a bit of information that has to be transferred very carefully, mapped correctly and so on… just the sort of thing that Microsoft could have done in Windows, and probably not the sort of thing I’m going to take on myself over the next weekend.
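The basic shape of the idea can be sketched quickly, though. Here the 64 bit host asks a helper process about a selected file and gets menu data back over a pipe (a hypothetical Python sketch; the helper is simulated inline, where in reality it would be a 32 bit executable hosting the actual COM extensions):

```python
import json
import subprocess
import sys

# Source of the simulated helper process. A real helper would load the
# 32 bit shell extensions and query them about the selected file; here
# it just returns a canned context menu entry.
HELPER_SOURCE = r"""
import json, sys
request = json.load(sys.stdin)
json.dump({"path": request["path"],
           "items": ["Open with 32 bit tool"]}, sys.stdout)
"""

def query_helper(path):
    """Send the selected path to the helper, receive menu items back."""
    proc = subprocess.run([sys.executable, "-c", HELPER_SOURCE],
                          input=json.dumps({"path": path}),
                          capture_output=True, text=True, check=True)
    return json.loads(proc.stdout)
```

The real difficulty is everything this glosses over: marshaling icons, owner-drawn menu items, COM object lifetimes and so on across the process boundary.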
Just stumbled upon this due to some confusion about the version of PowerShell I was running on my 64 bit Vista machine. I ran PowerShell from the start menu, using the context menu to bring it up as Administrator. Then I set the execution policy to RemoteSigned:
> Set-ExecutionPolicy RemoteSigned
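One thing worth knowing here, and I believe the source of my confusion: on 64 bit Windows, the 32 bit and 64 bit PowerShell hosts read the machine-wide execution policy from separate registry views, so setting it in one host does not necessarily change it for the other. A sketch of the two locations as I understand them (verify on your own machine):

```python
# Machine-wide ExecutionPolicy locations on 64 bit Windows (sketch).
# 32 bit processes see the Wow6432Node branch via registry redirection,
# so the two PowerShell bitnesses can hold different policies.
POLICY_KEYS = {
    "64 bit": r"HKLM\SOFTWARE\Microsoft\PowerShell\1"
              r"\ShellIds\Microsoft.PowerShell",
    "32 bit": r"HKLM\SOFTWARE\Wow6432Node\Microsoft\PowerShell\1"
              r"\ShellIds\Microsoft.PowerShell",
}

for bitness, key in POLICY_KEYS.items():
    print(bitness, "->", key)
```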
I was fighting some problems with SlickRun for a while after installing 64 bit Vista on my machine recently. I noticed issues with PowerShell (SlickRun was always running the 32 bit version of it, even though the path explicitly specified the 64 bit version), and when I wanted to create a command to run SnippingTool, it turned out to be impossible… there are some very weird things going on in 64 bit Windows.
For example, Explorer shows me SnippingTool.exe in C:\Windows\system32 (system32? Go figure), while my 32 bit Altap Salamander doesn’t show the same file in that location. In SlickRun it was equally impossible to configure my command to use that path, and when I tried to use the path to the start menu .lnk file instead, SlickRun’s path completion feature actually showed the .lnk file, but running it was still impossible. After a little searching, I found this forum post here, which describes an option for 64 bit systems. Quoting (I don’t trust forum content to still be there when I go back later):
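What’s going on, as far as I understand it, is WOW64 file system redirection: a 32 bit process that opens system32 on 64 bit Windows is silently handed SysWOW64 instead, and the Sysnative alias (available to 32 bit processes on 64 bit Windows) bypasses the redirect. The decision logic, sketched in Python (function name and parameters are illustrative, not any real API):

```python
def native_system32(windir, process_is_32bit, windows_is_64bit):
    """Path a process should use to reach the real system32 directory.

    Under WOW64, a 32 bit process that opens <windir>\\system32 is
    silently redirected to <windir>\\SysWOW64; the Sysnative alias
    bypasses that redirection.
    """
    if process_is_32bit and windows_is_64bit:
        return windir + r"\Sysnative"
    return windir + r"\system32"
```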
I just had a weird problem – for some reason IE wouldn’t run my XAF (eXpressApp Framework) web applications (ASP.NET) anymore. At first I suspected some sort of installation problem, since I’d just installed the new 8.1.1 version, and not long ago this used to work just fine. But for some unrelated reasons I uninstalled and reinstalled, and the problem was still the same.
In the URL for the application, I tried replacing localhost with 127.0.0.1, and I found that this solved the problem. I pinged localhost and noticed that it was using ::1 – the IPv6 equivalent – as the address instead of 127.0.0.1. I looked in my hosts file and there was an entry in there mapping ::1 to localhost. Removing that entry solved the problem. Now, this is a bit weird… I actually thought I didn’t even have IPv6 enabled at all, which would have explained why connections to ::1 wouldn’t work. But I found that it was in fact active – not sure why, I believe I switched it off at some point. Pinging ::1 is not a problem. So why can’t IE connect to it? And who added the ::1 entry to my hosts file recently? Or was it previously working in spite of that entry? Bizarre Windows network stuff… see, that’s why I still love Linux for my servers. Text based config files that never suddenly change automatically. Great.
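For reference, the kind of hosts file entry in question looks like this (a reconstructed example, not a copy of my actual file; on Windows the file lives at C:\Windows\System32\drivers\etc\hosts):

```
127.0.0.1    localhost
::1          localhost
```

With the second line present, IPv6-aware clients resolve localhost to ::1 first; removing it was what made IE connect again.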
I recently posted about this issue I was having with a weird delay when trying to print, and I thought I’d found a solution to it. But now I’ve discovered that the solution was responsible for new problems… particularly when trying to print large files (not large by number of pages, but by content – big scanned images or things like that), I’m suddenly getting error messages every time about timeouts.
I haven’t spent too much time trying to fix this issue, but switching the printer port back to the RAW protocol provided an immediate workaround – the delay is back, but I can print whatever I want. Pretty interesting, as I’ve printed probably a hundred documents since I first switched to LPR. No idea where this comes from… I guess I’ll just switch the protocols selectively until I find time to hunt down a real solution. Update: I just noticed that even with the protocol set to RAW, it doesn’t work every time - it used to work just fine, no idea what happened…