In Donald Knuth's Literate Programming there was, if I remember correctly, a graph showing the evolution of the number of bugs in TeX over time. That curve has remained flat for the past decade or so, suggesting that TeX might now be bug-free.
I would like to use this graph to illustrate the importance of bug-tracking software. Is it downloadable from somewhere?
The graphs I think you are referring to are in chapter 10 of Literate Programming (Knuth, D. E., 1992, Center for the Study of Language and Information), which is a reprint of Knuth, D. E., 1989, "The Errors of TeX", Softw. Pract. Exper. 19, 7 (Jul. 1989), 607-685.
I have not seen the graphs anywhere other than in book form, but an updated list of errors is in a PDF at http://tug.org/texlive/Contents/live/texmf-doc/doc/english/knuth/errata/errorlog.pdf. Whereas the list in chapter 11 of my copy of Literate Programming covers 1978 to 1991, the PDF extends this to 2002. If you have installed TeX Live, another version of this file, covering up to 1995, will probably be on your system as knuth/errorlog.tex.gz.
A PDF of a note on the list is at http://www.tug.org/TUGboat/Articles/tb10-4/tb26knut.pdf, a TUG conference keynote address from 1989.
As all the errors are numbered, producing a rough graph (rough because the list is not in strict numerical order) by, for example, month could be a quick manual process. Reproducing the wider range and content of all the graphs in the chapter would be a longer undertaking, but perhaps an interesting programming exercise (the TeX source of the log may be much easier to work with than the PDF for this).
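If you do tally the entries by hand, a few lines of Python with matplotlib (my assumption; any plotting tool would do) are enough to reproduce a rough version of the cumulative-count graph. The per-year numbers below are placeholders to be filled in from the log, not Knuth's actual figures:

    # A minimal sketch: plot the cumulative error count per year so the
    # flattening of the curve is visible. The tallies are placeholders.
    import itertools
    import matplotlib.pyplot as plt

    errors_per_year = {        # fill in by hand from errorlog.pdf
        1978: 0, 1979: 0, 1980: 0,
    }

    years = sorted(errors_per_year)
    cumulative = list(itertools.accumulate(errors_per_year[y] for y in years))

    plt.plot(years, cumulative, marker="o")
    plt.xlabel("year")
    plt.ylabel("cumulative logged errors")
    plt.title("Errors of TeX, hand-tallied")
    plt.show()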
Side Note:
Code is never 'bug free'. There are only "Expected quirks" and "Bugs yet to be discovered".
As the title says, I'd like to program a 3d game (probably a BattleZone clone), but without the use of an API like OpenGL, DirectX, and the like. At the heart of the matter, I'd just like to learn how to draw basic 3d shapes to the screen and manipulate them. Don't care if it looks like crap. I've used OpenGL to achieve similar ends before, but really didn't learn about these topics.
The problem is, I have no idea where to start. I downloaded the Doom source code, but it's a bit over my head. Although I've programmed a bit, graphical matters are very much out of my depth.
I'd be very grateful if anyone could offer links or code (in any language) that would help me along in my purpose.
Sounds like an exciting project. I did something similar in the late '90s. Before OpenGL and DirectX became popular, there were a ton of great books on the subject.
Fundamentally you will have to learn how to:
Represent 3D geometry
Transform that geometry (translate and rotate)
Project that geometry onto a 2D screen.
Each of those major topics has many sub-topics. For example, complex objects are built up from a number of polygons; you may want to limit yourself to triangles or support arbitrary polygons, and you may want to load common model formats (e.g. .obj files) so that you can create models with off-the-shelf tools.
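To give a feel for how small the core of those three steps is, here is a rough sketch in Python (my choice of language; nothing here depends on it) that does no drawing at all: it hard-codes a cube, rotates it, applies a perspective divide, and just prints the resulting 2D screen coordinates.

    # Minimal sketch of the three steps: represent, transform, project.
    import math

    # 1. Represent 3D geometry: the eight corners of a cube.
    cube = [(x, y, z) for x in (-1, 1) for y in (-1, 1) for z in (-1, 1)]

    def rotate_y(p, angle):
        # 2. Transform: rotate a point around the Y axis by `angle` radians.
        x, y, z = p
        c, s = math.cos(angle), math.sin(angle)
        return (c * x + s * z, y, -s * x + c * z)

    def project(p, viewer_distance=4.0, screen_w=80, screen_h=40):
        # 3. Project: perspective divide onto 2D screen coordinates.
        x, y, z = p
        f = 1.0 / (z + viewer_distance)   # farther points shrink toward the center
        sx = int(screen_w / 2 + x * f * screen_w / 2)
        sy = int(screen_h / 2 - y * f * screen_h / 2)
        return sx, sy

    for corner in cube:
        print(project(rotate_y(corner, math.radians(30))))

Drawing lines between the projected corners with any 2D line primitive already gives you a spinning wireframe cube.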
The topics are way too broad for a detailed answer here. Whole books are written on the subject, including
Black Art of 3D Game Programming (Book, amazingly still available)
For a good introduction to the general topics, have a look at:
http://en.wikipedia.org/wiki/3D_projection
http://en.wikipedia.org/wiki/Orthographic_projection
http://en.wikipedia.org/wiki/Transformation_matrix#Perspective_projection
Doom, which you already looked at, used a special optimization called heightfield rendering and did not allow for rendering of arbitrary 3D shapes (e.g., you will not find a bridge in Doom that you can walk under).
I have the second edition of Computer Graphics: Principles and Practice in C, and it uses SRGP (Simple Raster Graphics Package) and SPHIGS, which is built on top of SRGP. If you look up articles and papers on graphics research you'll see that both of these libraries are used a lot, and they are far more direct and low-level than the APIs you mentioned. I'm having a hard time locating them, so if you do, please post a link. Note that the third edition uses WPF, so I cannot guarantee much as to its usefulness, and I don't know whether the second edition is still in print, but I have found numerous references to the book, and it has its own page on Wikipedia.
Another solution would be the Win32 API which again does not provide much in terms of rendering, but it is trivial to draw dots and lines onto a window. I have written a few tutorials on it, but I didn't cover drawing pixels and lines, so they'll only be useful if you have trouble with the basics of setting up a window. Note that it is not intended for real-time rendering, so it may get slow.
Finally, you can look at X11 programming, the foundation of the GUI on most Unix-like operating systems. I haven't found the libraries for Windows, but then again I didn't invest too much time in it. I know it is available for Cygwin and for Linux in general, though, and I believe it would be very interesting to look at the core of graphics since you're already looking under the hood of 3D graphics.
I like to keep an eye on trends in browsers, OSes, languages, etc. I find Google Trends is a very useful resource sometimes, but other times I cannot get the information I want.
Example of the very clear rise of Ubuntu (with six-monthly peaks near release dates) compared with the decline of the other major Linux distros over the years:
http://www.google.com/trends?q=ubuntu%2C+debian%2C+redhat%2C+mandrake&ctab=0&geo=all&date=all&sort=0
Example of results that are skewed because of non-programming-related events. See "flash floods" and "earthquake in Java" in the news results:
http://www.google.com/trends?q=flash%2C+java%2C+javascript&ctab=0&geo=all&date=all&sort=0
Is there a way to filter the results better so that they only include Java the programming language, and to make sure all variations of a name are caught (for example, js as well as javascript)? Or is there an alternative tool that can produce similar graphical trend data?
It is possible to exclude terms with a minus sign and use | for variants:
flash -flood, java -crash -quake, javascript +js
But if you want accuracy it would be better to use the Language Popularity Index (or TIOBE, as Bas suggests).
Such "metrics" have questionable value (but are fun to discuss). You could add a word like "program" to each language: http://www.google.com/trends?q=flash+program%2C+java+program%2C+javascript+program&ctab=0&geo=all&date=all&sort=1
Will you please provide me a reference to help me understand how scanline-based rendering engines work?
I want to implement a 2D rendering engine which can support region-based clipping, basic shape drawing and filling with anti-aliasing, and basic transformations (perspective, rotation, scaling). I need algorithms which give priority to performance rather than quality, because I want to implement it for embedded systems with no FPU.
I'm probably showing my age, but I still love my copy of Foley, van Dam, Feiner, and Hughes (the White Book).
Jim Blinn had a great column that's available as a book called Jim Blinn's Corner: A Trip Down the Graphics Pipeline.
Both of these are quite dated now, and aside from the principles of 3D geometry, they're not very useful for programming today's powerful pixel pushers.
OTOH, they're probably just perfect for an embedded environment with no GPU or FPU!
Here is a good series of articles by Chris Hecker that covers software rasterization:
http://chrishecker.com/Miscellaneous_Technical_Articles
And here is a site that talks about and includes code for a software rasterizer. It was written for a system that does not have an FPU (the GP2X) and includes source for a fixed point math library.
http://www.trenki.net
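Since you have no FPU: the same scanline idea can be done entirely in integer (16.16 fixed-point) arithmetic. The following is a rough sketch of my own, not code from either of the sites above, written in Python for readability but with arithmetic that ports directly to C:

    # Sketch: fill a triangle one scanline at a time using 16.16 fixed-point
    # edge interpolation (no floating point anywhere).
    FP_SHIFT = 16

    def to_fp(x):          # integer -> 16.16 fixed point
        return x << FP_SHIFT

    def fp_div(a, b):      # fixed / fixed -> fixed
        return (a << FP_SHIFT) // b

    def edge_x(xa, ya, xb, yb, y):
        # x (fixed point) where the edge (xa,ya)-(xb,yb) crosses scanline y.
        if yb == ya:
            return to_fp(xa)
        slope = fp_div(to_fp(xb - xa), to_fp(yb - ya))   # dx/dy in fixed point
        return to_fp(xa) + slope * (y - ya)

    def fill_triangle(set_pixel, p0, p1, p2):
        # Sort the integer (x, y) vertices top to bottom: (x0, y0) is topmost.
        (x0, y0), (x1, y1), (x2, y2) = sorted((p0, p1, p2), key=lambda p: p[1])
        if y0 == y2:
            return                                    # zero-height triangle
        for y in range(y0, y2 + 1):
            xl = edge_x(x0, y0, x2, y2, y)            # long edge, top to bottom
            if y < y1 or y1 == y2:
                xr = edge_x(x0, y0, x1, y1, y)        # upper short edge
            else:
                xr = edge_x(x1, y1, x2, y2, y)        # lower short edge
            if xl > xr:
                xl, xr = xr, xl
            for x in range(xl >> FP_SHIFT, (xr >> FP_SHIFT) + 1):
                set_pixel(x, y)

A real engine would step the edge x values incrementally per scanline instead of recomputing them, and would carry interpolated colour or texture coordinates along each span, but clipping, filling, and transformation all reduce to this kind of fixed-point stepping.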
I'm not sure about the rest, but I can help you with fast scaling and 2D rotation for ARM (written in assembly language). Check out a demo:
http://www.modaco.com/content/smartphone-software-games/291993/bbgfx-2d-graphics-library-beta/
L.B.
The resistors.org site and the foxthompson.net download links are stale/broken.
http://www.resistors.org/index.php/The_SAM76_programming_language
Every other link I've been able to track down on the 'net (mostly in old newsgroup posts) is broken. E-mails to the respective webmasters all bounced.
I have a morbid curiosity for arcane programming languages, and SAM76 sounded really interesting to look into and mess around with.
There are quite a few Lisp folks lurking on this site, so I figured somebody might have a lead... as I heard SAM76 had some early rudimentary functional programming ideas.
Extra credit: link to track down a copy of the SAM76 manual!
Wayback has a copy of S76.exe for DOS and Windows
http://web.archive.org/web/20070505122813/http://www.resistors.org/index.php/The_SAM76_programming_language
http://wikivisually.com/wiki/SAM76
http://encycl.opentopia.com/term/Sam76
http://encycl.opentopia.com/term/Algorithms_in_Sam76
======================= F R E E W A R E =======================
User-Supported Software
If you are using this program and find it to be of value
your $20 contribution will be appreciated.
A contribution of $30 will bring you the SAM76 language
manual and other useful and interesting documentation.
SAM76 Inc., Box 257 RR1
Pennington, N.J., 08534
U.S.A.
Regardless of whether you make a contribution,
you are encouraged to copy and share this program.
> ---------------------------------------------------
http://web.archive.org/web/20110726163455/http://www.hypernews.org/HyperNews/get/computing/lang-list/2/2/1.html
I believe the R.E.S.I.S.T.O.R.s (have no idea what the letters mean) was a group of kids who played with computers and electronics in Claude Kagan's barn in Pennington, N.J. near Princeton. Because the developer of TRAC, Calvin Mooers, spent the rest of his life inventing the software patent and sued everyone in sight, Claude (whose employer, Western Electric Laboratories, was sued by Mooers) created a very similar language called "SAM76" supposedly based on S7 and M6 "languages from Bell Labs". I have the original tutorial manual written and illustrated by the R.E.S.... and versions on paper tape for the Altair and TRS-80 floppy disk. I think it looked more like #os#is;; but you could change all the special characters and command names so it could be made to look EXACTLY like TRAC. Claude wrote some neat graphic games for the TRS-80 in SAM76/TRAC.
http://web.archive.org/web/20110726163335/http://www.hypernews.org/HyperNews/get/computing/lang-list/2/2/1/3.html
Yes, we RESISTORS did indeed meet in Claude's barn, which was filled with old telephone and computer equipment. Claude's version of TRAC started on the PDP-8, migrated to the PDP-10, and for the legal reasons mentioned ended up as SAM-76. (FYI, SAM stands either for "Strachey and McIlroy" or "Same As Mooers". RESISTORS always stood for "Radically Emphatic Students Interested in Science, Technology, and Other Research Studies" as much as it stood for anything.)
Starting when we were members of the RESISTORS, Peter Eichenberger and I wrote a PDP-10 TRAC processor and later reimplemented it for the PDP-11, eventually adding a little multi-terminal time-sharing monitor. We kept a lower profile than Western Electric (either that, or as 19-year-olds we had no noticeable assets), so we and Mooers stayed on cordial terms.
I don't know if this is useful, but on this page there is an email address, dsf#hci.ucsd.edu, which seems to be Dave Fox's, the guy who maintained the page hosting the SAM76 file.
There's a pile of information in the old SIMTEL archives, specifically CPMUG Volume 34, which is included (example code and all) in the nearly 13 GB download here. You have your choice of "DSK" and "ARK" (ARC) format images. The standard "file" utility knows what format it's in ("CPMUG034.ARC: ARC archive data, dynamic LZW"). SIG/M v. 53 also has SAM76 information, and you can find it here.
My custom homebrew photography processing software, running on 64 bit Linux/GNU, writes out PNG and TIFF files. These are to be sent to a quality printing shop to be made into fine art. Working with interior designers - it's important to get the colors just right!
The print shops usually have no trouble with TIFFs and PNGs made by commercial software such as Photoshop. Even though I have the TIFF 6.0 spec, the PNG spec, and other info in hand, it is not clear how to include color calibration data or implement a color management system on Linux. My files are often rejected as faulty, without error reports detailed enough to let me make fixes.
This has been a nasty problem for many people for a while. Even my contacts at the Hollywood post-production studios are struggling with this issue. One studio even wanted to hire me to take care of their color calibration, thinking I was the expert - but no, I am just as blind and lost as everyone else!
Does anyone know of good code examples, detailed technical information, or have any other enlightenment? Or time to switch to pure Apple?
Take a look at LittleCMS
http://www.littlecms.com/
This page has the code for applying it to TIFF
http://www.littlecms.com/newutils.htm
The basic thing you need to know is that the color profile data needs to be stored in the metadata of the file itself.
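As a concrete (and hedged) illustration of "the profile lives in the file's metadata": if any stage of your pipeline can pass through Python, Pillow's ImageCms module wraps LittleCMS, and its PNG and TIFF writers accept an icc_profile argument. The file names and the AdobeRGB1998.icc profile below are placeholders for whatever profile your print shop asks for:

    # Sketch (placeholder file names): convert pixel data into the target
    # space with LittleCMS (via Pillow's ImageCms) and embed the profile
    # bytes so the print shop's software knows how to interpret the numbers.
    from PIL import Image, ImageCms

    img = Image.open("render.tif")
    src = ImageCms.createProfile("sRGB")       # whatever space you render in
    dst_path = "AdobeRGB1998.icc"              # profile supplied by the printer

    converted = ImageCms.profileToProfile(img, src, dst_path)

    with open(dst_path, "rb") as f:
        icc_bytes = f.read()
    converted.save("render_tagged.tif", icc_profile=icc_bytes)   # TIFF tag 34675
    converted.save("render_tagged.png", icc_profile=icc_bytes)   # PNG iCCP chunk

LittleCMS's tifficc sample utility (presumably the TIFF code linked above) does the same conversion-plus-embedding from the command line if you would rather stay out of Python.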
There is a consultant called Charles Poynton who specialises in this area. I work for one of the post-production studios you mention (albeit in London, not Hollywood), and have seen him speak on the subject a couple of times. His website contains a lot of the material he presents, and you might find something of use there. He also has a book called Digital Video and HDTV: Algorithms and Interfaces, which is not as heavy as the title might suggest! While these resources might not answer your question directly, they might provide a springboard to other solutions.
More specifically, which libraries are you using to write the PNG and TIFF files? You mention they are homebrew, but how custom are they exactly? Post-processing the images in an image manipulation program (such as ImageMagick or dcraw) might allow you to inject this information into the header more successfully.
Sorry, I don't have any specific answers, but maybe something that will point you a bit further in the right direction...
As a GNU/Linux user, you’ll want to consider DispcalGUI – http://dispcalgui.hoech.net/ – a GNOME-based GUI that centralizes color management, ICC profile management, and (crucially for your case) device calibration. It can talk to well-known pro- and mid-level hardware, e.g., i1, X-Rite, Spyder, etc.
But before you get into that – you say you are generating your files to spec; are you validating your output using a test suite specific to the format in question? If not, here are three to get you started:
imagetestsuite supports the well-known formats: https://code.google.com/p/imagetestsuite/w/list?can=1&q=
The Luminous* test suite is a JIRA plugin, if that’s your thing: https://marketplace.atlassian.com/plugins/com.luminouslead.plugin.jira.testsuite.LuminousTestSuite
FLOSS decoder implementations often have one you can use, e.g., OpenJPEG – https://code.google.com/p/openjpeg/wiki/TestSuiteDocumentation
But even barring all of those, it seems like your problem is with embedded ICC data – which is two specs in one. First, there’s the host image-file format, and they all handle embedding differently (meaning the ICC data will likely look totally different when embedded in a TIFF than, say, a JPEG or WebP file). Second, there is the ICC spec itself. It is documented here: http://color.org/v4spec.xalter – and you may also want to look at the source for the aforementioned dispcalGUI, which includes a very legible and hackable ICC profile class in Python: http://sourceforge.net/p/dispcalgui/code/HEAD/tree/trunk/dispcalGUI/ICCProfile.py
Full disclosure: I have contributed to that very ICC profile class, to which I just linked in that last ¶
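On the point that every host format embeds the ICC data differently: a quick, hedged sanity check (not something from the specs above) is that Pillow's readers normalize the embedded bytes, whether from a PNG iCCP chunk, a TIFF tag, or a JPEG APP2 marker, into info["icc_profile"], so you can at least confirm the profile survived the round trip before sending files out. The file names below are placeholders:

    # Sketch (assumes Pillow): verify that an ICC profile is actually embedded
    # and that it matches the source .icc file byte for byte.
    from PIL import Image

    embedded = Image.open("render_tagged.tif").info.get("icc_profile")
    with open("AdobeRGB1998.icc", "rb") as f:
        source = f.read()

    print("profile present:", embedded is not None)
    print("matches source profile:", embedded == source)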
That’s the basics (many of which you have no doubt covered)... beyond that, if you post more information about what exactly is going wrong, I’d be interested to look it over. Good luck with it either way.
* NB. This project is unrelated to the long-standing photography website, “the Luminous Landscape”