
Member Since 20 Nov 2005
Offline Last Active May 15 2015 12:30 PM

Posts I've Made

In Topic: "defer": is this silly / broken?

09 December 2014 - 03:20 PM

The first problem that comes to mind is that you're executing arbitrary code inside a destructor. If that code throws an exception during stack unwind then your program could turn into a big puddle of goo.

True that.

Did not even think about that, as exceptions are not relevant for me. Exceptions are usually disabled in my projects (well, depends) - currently i am writing a system driver for some personal stuff. Exceptions are completely and utterly useless there:
* memory allocation cannot throw exceptions.
* all inputs must be validated, all error conditions must be checked, etc.
* there are no meaningful C++ exceptions.
* ... leaving only inherently unrecoverable exceptions via SEH (catching the exception via a first-chance handler and trying to tear down the driver is the only sane thing i can do).

... but the point is valid. Speaking of which:

Without commenting on the potential technical pitfalls of your implementation...

that would actually be what i am most interested in. I am wary of using questionable new stuff i have come up with ... debugging bluescreens is not fun (especially as i do not have kernel debugging capabilities right now - it is just not worth it).

The only "advantage" your approach has is that it lets you execute an arbitrary block of code defined at the scope's point in the source code. I'm not actually sure that's a good thing, though:

Neither am i, to be honest - "There be dragons".

"It suggests to me that the abstraction you are working with is poor; that the objects in question don't have properly written destructors or related cleanup semantics"
If it were correct/convenient to use destructors of related objects - i would do that. My interest in "defer" came from run-once initialization functions where there were no meaningful objects for encapsulation, nor convenient or meaningful ways to compartmentalize into functions ... which reminded me of Go's alternative.

PS. Do not confuse "interest" with "burning need" here - in case i am not being clear. The vast majority of the time destructors/constructors/std-helpers are correct and/or convenient.

"It implies you have to violate the general recommendation to declare variables as close to their first use as possible in C++, if you want those variables to be subject to the cleanup code that the "defer" mechanism provides, making for potentially sloppier code."
Sorry, could you rephrase that? I think i hit a language barrier (my English is actually fairly poor) - however many times i read it, i just cannot follow.

"I think it hurts readability to put code outside its execution order like that"
One of my golden rules: a system should try to prevent accidental mistakes and not intentional ones (without a bloody good and clear reason). One can easily write some unbelievably unreadable spaghetti using whatever language construct - or improve readability when using the same construct sensibly. So: yes and no. Blame the user and not the system.

"and I certainly find the macro hackery to introduce a new "syntax element" offensive."
Interesting. Is it because of macro usage in principle or because there is some legitimate problem with the macro (ie. opportunity for likely accidental mistakes)?

I can't really see a great use-case for this that isn't about cleanup code.

I agree. It is quite similar to try/finally, but not interchangeable from a readability/use standpoint (a "finally" block is not necessarily related to what is immediately before "try", whereas that is the only sane option for "defer" - ie. "defer" keeps the closely related code together).

You have seen it before. Anything that wraps a resource and destroys it when the wrapping object goes out of scope is effectively doing the same thing.
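A minimal sketch of the RAII idea that quote describes (illustrative names, not code from the thread): a wrapper whose destructor releases the resource when the object goes out of scope, with no explicit cleanup call at the use site.

```cpp
#include <cstdio>

// Illustrative RAII wrapper: the file is closed automatically when the
// wrapper leaves scope - the "destroys it when the wrapping object goes
// out of scope" behavior the quote refers to.
struct ScopedFile {
    std::FILE* f;
    explicit ScopedFile(const char* path, const char* mode)
        : f(std::fopen(path, mode)) {}
    ~ScopedFile() { if (f) std::fclose(f); }  // cleanup runs at scope exit
    ScopedFile(const ScopedFile&) = delete;
    ScopedFile& operator=(const ScopedFile&) = delete;
};
```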

Which bit are you amazed by? Destructors? Function pointers?

You are being facetious. Or at least seem to be.

Someone once said that "Whiles, procedures and case structures are mental masturbation. They're all compiled to a bunch of goto's anyway."

What i am talking about, as i said, is "defer" as in http://golangtutorials.blogspot.com/2011/06/control-structures-go-defer-statement.html in C++, wondering about its usefulness and, most importantly, whether it breaks somewhere. ie. Arbitrary finalization code without needing a special, separately described object for the occasion, keeping the one-off closely related code together.

C++ does not have any comparable language construct (closest match would be try/finally [non-standard] and, less so, RAII). My facsimile seems to fit the bill.
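For reference, here is a minimal sketch of how such a defer facsimile can be built in C++11 (illustrative names; this is not necessarily the poster's exact implementation): a macro captures a lambda in a scope-local guard object whose destructor runs it at scope exit.

```cpp
#include <cstdio>
#include <utility>

// Minimal sketch of a C++11 "defer" (names are illustrative).
// The stored callable runs when the guard object is destroyed,
// i.e. at scope exit.
template <typename F>
struct Deferred {
    F f;
    bool active;
    explicit Deferred(F fn) : f(std::move(fn)), active(true) {}
    Deferred(Deferred&& o) : f(std::move(o.f)), active(o.active) { o.active = false; }
    // caveat from the thread: f() must not throw during stack unwind
    ~Deferred() { if (active) f(); }
    Deferred(const Deferred&) = delete;
    Deferred& operator=(const Deferred&) = delete;
};

template <typename F>
Deferred<F> make_deferred(F f) { return Deferred<F>(std::move(f)); }

#define DEFER_CONCAT2(a, b) a##b
#define DEFER_CONCAT(a, b) DEFER_CONCAT2(a, b)
// __LINE__ gives each guard a unique name, allowing several defers per scope.
#define defer(code) \
    auto DEFER_CONCAT(deferred_, __LINE__) = make_deferred([&]{ code; })

void demo() {
    std::puts("acquire");
    defer(std::puts("release"));  // executes when demo() returns
    std::puts("work");
}
```

Like Go's defer, the cleanup sits next to the acquisition; unlike Go, it runs at scope exit rather than function exit, and multiple defers in one scope unwind in reverse declaration order (as local destructors do).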

In Topic: Why is math transformation taxing to most CPUs?

09 December 2014 - 10:30 AM

[...] and threads (2-8 times as many operations per clock, if being unrealistically ideal) [...]

I nearly blew a fuse reading that. Please tell me that's a typo, and not how you think threading improves performance.

"if being unrealistically ideal".

His example was describing the upper bound and is correct as such. Its realism is irrelevant in that context.

edit: or did you get the impression he is not talking about hardware threads (cpu cores and HT if available ... typically 2-8)?

In Topic: Windows 7 - icons shown with wrong size (folder thumbnail alternative).

09 December 2014 - 09:43 AM

Hehe. Yeah, have been the poor sod myself often enough to take the extra time/effort to post what i figured out :)

In Topic: Windows 7 - icons shown with wrong size (folder thumbnail alternative).

08 December 2014 - 10:30 AM

Had an epiphany while doing other stuff - and after digging around in the documentation:
convert "%~n1%~x1" -alpha remove -colors 256 -bordercolor none -resize x128 -gravity center -crop 128x128+0+0 ^
  ^( -clone 0 -filter Box -resize 256x256 ^) ^
  -delete 0 -define png:format=png8 -compress Zip folder.ico
Windows is happy and i am happy. (Not an imagemagick expert, so there is probably something unneeded in there - but whatever. IT WORKS!)

... final filesize for a 256x256-photo icon -> 12KB. No actual loss in quality compared to a 256-color, but otherwise uncompressed, 256x256 ico.

In Topic: Windows 7 - icons shown with wrong size (folder thumbnail alternative).

08 December 2014 - 07:16 AM

I'm not sure what you're trying to achieve. I'm using Windows 7, and thumbnails work fine...

I am talking about FOLDER thumbnails. I use Classic Shell, and they say that a non-shit version does not exist in windows anymore and hence they cannot do anything to help (the last post i remember encountering about it was sometime in 2014 - doubt anything has changed. Would love to be proven wrong).

Random example from web: http://i51.tinypic.com/se8zmd.jpg

I thought this was a Windows 7 question but it seems to be more of an ImageMagick question. Have you seen this, about using the -define: command? ( -define icon:auto-resize )
I've used ImageMagick before, but not with .ico formats.

Yes, i suspect some quirk with imagemagick or windows that i am unable to track down. Example of what my script produces vs random windows stock icon: http://oi62.tinypic.com/2yzc56o.jpg

I think i found a workaround for the quirk (now it looks as it should: http://oi62.tinypic.com/1fz0h.jpg ). For some reason windows demands that a 256x256 version exists (it does not care about the other [more appropriate] ones - so "icon:auto-resize" just makes the file bigger [tested the 256/128 options with it just in case -> no benefit, just a bigger file, and it fails/works the same as with only one size]).

So, a massive waste of space is required (i actually only need 128x128 or 96x96 [not sure which, tried both, but only 256x256 works]). Am i still missing something, or is there something i could do?

For completeness' sake, i post the updated script that is invoked from the context menu (also added an icon cache flush for instant effect):
@echo off

rem regenerate the folder icon from the selected image
convert "%~n1%~x1" -bordercolor none -resize x256 -gravity center -crop 256x256+0+0 -alpha remove -colors 256 "folder.ico"

rem rewrite desktop.ini to point at the new icon
attrib -s -h desktop.ini
del desktop.ini
(
  echo [ViewState]
  echo Mode=
  echo Vid=
  echo FolderType=Generic
  echo [.ShellClassInfo]
  echo IconResource=.\folder.ico,0
) > desktop.ini
attrib -a +s +h desktop.ini

rem flush the icon cache so the change shows up immediately
ie4uinit.exe -ClearIconCache

"So the result: Windows XP uses 16, 32, 48-size icons, while Windows 7 (and presumably also Vista) also uses 256-size icons. All other intermediate icon sizes are ignored"
via http://stackoverflow.com/questions/3236115/which-icon-sizes-should-my-windows-applications-icon-include
Seems that windows is just trying to be funny and there is nothing one can do. Just have to accept the silly filesizes.

Wiki said an ico can contain compressed png, so after digging in the imagemagick documentation i finally found: "When writing an ICO file, you may request that the images be encoded in PNG format, by specifying Zip compression".
Indeed, windows accepts such a file ("-compress Zip"), halving the filesize. Having seen the contents of the unpacked file, my gut feeling is that it could be packed more ... but no idea how to tell imagemagick that. Still, i am finally at the point where i am at least happy with the results.