# Language Debate



## Amaroq (Nov 1, 2007)

What language(s) do you prefer to program in? What makes those good languages?

I'm going to be making two topics for this. This one is more for actual PC programming, and the other one will be for web development.

I know topics like this can get really heated, so try to remember this folks: we're all fellow furries, not enemies!


----------



## Rhainor (Nov 1, 2007)

I'm by no means a programmer, but I've heard good things about Python, and bad things about C++ and Perl.

Feel free to disregard my statements, though, as I am highly uninformed in these matters.


----------



## Amaroq (Nov 1, 2007)

Python is a good language, especially for beginners. It enforces indentation for things such as function definitions, loops, etc.

I still only have a basic understanding of C++ and don't know any Python at all (and have never even seen Perl code before).

C++ isn't a bad language though. It's a lower-level language, so it's obviously more complex. But on the other hand, you have greater control over what exactly is going on and, I believe, you can write more memory-efficient applications in it.


----------



## yak (Nov 1, 2007)

C++:
While I'm nobody's fanboy, I do agree with Linus a little: http://emonk.debianuruguay.org/?p=42.
An object-oriented language taken way too far, with poor design choices at certain points/implementations.

Perl:
From what I understand, the language was written to work with large amounts of text efficiently, and PCRE is proof that's hard to deny. A good addition to your arsenal if you have to admin/rationalize stuff on servers/workstations.

Python:
I don't know the primary reason this language was written. But I do like the way it enforces standards of code indentation.

Java:
Hmm, I have mixed feelings about it.
It's a very old language with tried and true concepts, but just like any other language these days, taken way too far. There has to be a reason why a lot of corporate entities prefer it for their stability-critical software. It has very good error/exception handling, and inspired many later languages to adopt a similar approach.
It aims to be a cross-platform language, and indeed it is. It's used widely on mobile devices these days, and if it weren't for it, who knows in what way mobile stuff would have developed.
Though it also is quite resource-heavy, which... well, doesn't suit the requirements of a lot of fields of use.

.NET:
Well, what can I say, to me it's a Java replacement language. More efficient with its cross-platform bytecode and easier to develop in. It adopted semantics and approaches from all the languages in one way or another. With the advancement of the Mono project on *nix, it gains more and more popularity these days.
Used for desktop applications and web development alike. Heck, some of the apps you use are written in it and you don't even know it.

Ruby/Rails:
There's so much fanboyism surrounding it that it's not funny.
I don't have any personal experience with it, and am tired of hearing how cool it is here and there.


----------



## net-cat (Nov 1, 2007)

It really depends on what you're doing.

For low level system programming, C and Assembler are your best bets. For performance, Assembler. For portability, C. (It's also fairly easy to mix the two. Most C compilers can inline assembler and it's not hard to call C functions from assembler.)

For higher level application programming, C or C++. It largely depends on your needs. If you're working with lots of abstract data types, C++. If not, C will suffice.

Then there's the bytecode languages. Java and the .NET collection. The only Java application I've ever seen that wasn't a steaming heap of shit is Azureus, and even that has issues. .NET applications tend to stand up a little better. I've never really done much with either, so I can't speak to the quality of the languages themselves. (.NET 3.0 was meant to replace the WIN32API in Vista, but they didn't get around to it, I guess.)


----------



## Amaroq (Nov 1, 2007)

I haven't really written much in C++ and don't really know a lot about C, so don't hold me to any arguments I make here. But I do like C++ better just because of certain things I've read and certain features I have seen.

Say you're working for a gigantic company and have to make something really big. You're working with an entire team of developers, and individuals and/or different groups of people are making modules separately, once their interactions have been planned out. Before long, there are tons of functions, variables, etc., and you never know what the other people are naming the stuff in their pieces of the program. It quickly becomes tedious to go back and forth and make sure everybody is using unique names for everything. Enter C++'s namespaces. You don't have to worry about that anymore. Every important section of the program can have its own namespace, and the developers are free to declare their vars and functions without having to consult every other member of the development team.

C++ also has better typechecking. One big example comes to mind of a rather strange thing you're allowed to do in C that C++ would never let you do:

```
int f()
{
    return a + b;
}
```

Now in C++, declaring a function with an empty parameter list means that the function takes no arguments. In C, however, an empty parameter list means that the function can take an unspecified number of arguments, of any type, which makes typechecking calls to such a function impossible.


----------



## net-cat (Nov 1, 2007)

That's how printf et al work...

It occurs to me that if someone goes through college, learning whatever language they teach in and come out unable to learn another language, they've missed the point. They should go get their money back.


----------



## Amaroq (Nov 1, 2007)

net-cat said:

> That's how printf et al work...


I must know less about C++ than I thought. I never thought about that, since I never use printf. (I don't even like printf.) I could've sworn an e-book I have said that parameter-less C++ function declarations mean the function takes no arguments. If that's true, a printf would either have to have a limited number of allowed arguments in C++, or it would have to somehow dynamically increase its number of arguments depending on how many are fed into it...

I'll have to look at the source code of printf sometime to see how it works.


----------



## net-cat (Nov 1, 2007)

Actually, you're right.


```
void whatever();
```
No arguments in C++, arbitrary arguments in C.

printf actually uses the "..." construct in both languages.

```
int printf ( const char * restrict format, ... );
```


----------



## Amaroq (Nov 1, 2007)

Ah, that's neat! I've only heard of the "..." being used in case ranges.


----------



## Rostam The Grey (Nov 1, 2007)

Depends on what I'm trying to do. Most stuff can be done quickly and easily in VB 6.0. But I prefer to work with C#. I give the client the choice by giving them the pros and cons of each.


----------



## Eevee (Nov 1, 2007)

Hm.  How many can I name...

*C*
Good, solid, blazing fast language.  Wouldn't touch it with a ten-foot pole if I didn't have to; I haven't worked with it nearly enough to not shoot myself in the foot repeatedly and I enjoy higher-level concepts like, you know, _dynamic arrays_.  I wouldn't really recommend it for a project unless the speed is necessary throughout or the author (and everyone else who may see the source) is extremely comfortable with it, but it's very good to know either way.

*C++*
OO bolted onto C is cool.  Templates are slow garbage that I've rarely found I need.  The STL is likewise slower garbage that I've never found I need.  iostream is a pathetic attempt at type safety that unfortunately bloats the executable, wrecks the bit-shift operators, and turns formatting into a hellish nightmare.  (I find it immensely entertaining that even Python uses printf syntax extensively.)  Not a bad language if you pretend it's C with more formal OO, but a lot of what they added is slow, confusing, unnecessary, and downright tacky.

*D*
I only spent a little time reading about D, but it seems to be at least a marginal improvement over C, and I guess if I were going to write something from scratch in C I'd take a look at this first.  I recall built-in hashes and something vaguely smarter about arrays, but not much more; it's been a while.

*Assembly*
Intriguing, although I don't really know enough to actually write anything useful.  I'm kinda itching to have a reason to use assembly, just because it's almost more of a puzzle game than just program-writing.  Not something I'd use on a daily basis, though, although I've known people who would.

*Perl*
Solid language, my best-known and possibly favorite, but to be fair I happen to agree with Larry and Damian on almost everything.  Has its quirks and is certainly not for everyone -- especially given that all but the most well-regarded resources teach the language abysmally -- but is fairly consistent with itself once you know what on earth it's doing.  Documentation is excellent, runs on anything, CPAN is a dream, community is excellent and brilliant, fairly speedy as interpreted languages go, Unicode support is great, good for everything from a sed/awk/bash replacement to administration to automation to Web application building.  It can do GUI and interface with SDL too, but given that there is no longer a "compiler", writing distributable apps is questionable.

*Python*
Solid language, although a lot of the design decisions made by Guido come down to an enforcement of his style upon everyone else, and that grates my nerves a teeny bit.  Library is excellent, as is the plethora of additional modules available.  This and py2exe makes it suitable for pretty much anything.  I can hardly name any languages features that aren't in Python, save for a couple that are in Perl 6 and thus not actually in any language yet.

*Ruby*
Haven't used it, but have done enough reading, and the only complaint I can fathom is that it's slow -- which will supposedly be fixed this winter.  Looking for an excuse to actually use it for something so I have more to say, but by all accounts it appears to be a great language  :V

*Perl 6*
Will cure cancer.  Will also make anyone who hates Perl 5 cry.  Hyperoperators, junctions with automatic parallel processing, ten times the power of Perl's current regex engine, and a general cleanup of all sorts of Perl 5 rough edges.  I'm looking to learn Haskell just in the hope that I can help with the current interpreter.

*Haskell*
Only know bits so far, but as my first _real_ foray into functional programming it's certainly interesting.  Given my lack of experience even within the paradigm it's a bit hard to say much.  

*LISP*
Classic functional programming.  I cheated; I've only ever used Scheme, and only once to write a GIMP 1.3 plugin, and I cheated to get an iterative loop.  Dig the syntax and flexibility; don't so much dig the parentheses everywhere.

*JavaScript*
Prototyped semi-functional language with first-class functions.  There are some irritating limitations to the syntax (no negative indexes on arrays, for obvious reasons), but the language itself is deeply interesting, not least of all because such a function-oriented design became the de facto client-side browser scripting language.  The lack of real arrays would be gross, but given that everything in JavaScript is consistently either a function or a hash (and that until recently there were few array operations anyway), having arrays act as both is reasonable.

*PHP*
lol

*BASIC*
Eh.  Unnecessarily verbose syntax, screwy type system, not many powerful language features, no sugar.  Later editions of VB.NET attempt to crowbar more powerful syntax in, but at that point you might as well just use another language.

*Java*
Slow startup time, slow execution time (especially with anything using Swing or AWT), a type system inspired by C complete with the required hack-arounds like working with void pointers, and very few benefits one would expect from using a not-really-quite-compiled language.  Lame.

*.NET*
Not actually a language, but I will mention it anyway.  Screw .NET.  I don't trust Microsoft with my languages, I don't want to write in something that's cross-platform the same sort of way WINE makes everything cross-platform, the framework is astoundingly slow to start up the simplest apps, and everything is so incredibly verbose that I can hardly believe anyone ever gets anything done.  Nice try, but I'll hold out for Parrot.

*bash*
Impressive how an actual language has evolved from a single interface for entering commands and shuffling the output around, but the syntax for basic operations amongst all the *shes is so ridiculously unintuitive and inconsistent that I'd really rather just use Perl or Python and not have to worry about it.

*mIRCscript*
What a fucking abomination.  Similar to what happened with bash, except someone decided that writing some language hooks would be way too damn hard, so the very simple notion of IRC slash commands has become this twisted wreck where everything is a string except when it's not and everything is done by $ interpolation.  Not that it's ever used for anything decent, anyway.  I've seen someone write a bot entirely in mIRC that fetches Pokedex pages from veekun on demand and screenscrapes them for information.  Seriously, at that point, why not just write a standalone bot in something else?  IRC modules are not hard to find.

I AM RUNNING LOW ON RELEVANT PROGRAMMING LANGUAGES, IF IT WERE NOT IMMEDIATELY OBVIOUS

*sed*
s/// is cool syntax, and I thank you for that, but anything beyond a single search/replace that occurs all in one line requires mental gymnastics of astounding proportions.  I usually stare at the manual for half an hour, then get annoyed, wonder why the hell I'm using sed, and spend thirty seconds doing what I wanted in Perl in one line.

*awk*
awk is cool.  Not super-powerful, and not going to run any 3D apps, but awk is..  cool.  Yeah.  Try awk.  It's like proto-Perl.

*SQL*  (I claim it to be a programming language because it can loop, dammit.)
I've heard SQL compared to database assembly, but I think I disagree.  SQL is more like mystic incantations that have equal chances of granting you immortality and raping your parents -- but either way, you're out all the ingredients.  Still, it's fun trying to get what you want out of it...  at least, for the first hour or so.

*Lua*
Another very function-oriented language; it seems deceptively simple and useful, but I haven't had any real chance to work with it.  (I considered modding for WoW for a while, but the absolute lack of documentation was infuriating, and then I got bored with the game anyway.)  Would like to use it sometime.

OK GOING ESOTERIC

*brainfuck*
Awesome.  I'm considering writing some Web apps in bf, just because it's so awesome that almost any calculation can be done with just eight operators.  Everyone should have to write at least one bf program.  And then read it a week later.

*Malbolge*
I've read the documentation at least half a dozen times and I still don't know how it works or how to do anything in it.

*Befunge*
I actually wrote some stuff in this when I first heard of it.  Stack-based language where everything is one character and laid out on a grid; probably the sanest language on this sublist.  Code execution starts in the first character of the first line proceeding right.  Various operators can (sometimes conditionally) change the direction of code execution.  There's also a technically-possible 3D (or n-D) Befunge, although that would take a lot more operators and be rather difficult to actually write.

*INTERCAL*
Hilarious and awful at the same time, although for obvious reasons I didn't spend too much time reading about it.

*Java2K*
Clever, but doesn't seem to really be fleshed out enough to even attempt to write interesting programs.  Still, I love the concept.

*lolcode*
Teehee it's like those silly cat pictures.  Stupid.


Yeah okay I'm done.

Summary: Perl is awesome, Python is fairly awesome, most languages suck.

I'd like to really learn Ruby, Haskell, Lisp, and Smalltalk, but I don't have anything to actually _write_ in them, and that's the best way I learn anything.


----------



## js58 (Nov 1, 2007)

I like assembly mostly. If you have an assembler with decent macro support, writing maintainable programs is about as doable as in C, at least in the short term. But I like C as well. I thought Pascal was a good basic language too, but it lacks a library and its syntax is bulky; it really has nothing on C other than being slightly more readable.

I've done a lot of Java. In my opinion, it's a good standard and is reliable, but performance is still a problem. They also tend to bloat the standard libraries a little too much for my tastes, and sometimes the language itself (suddenly, generics!). And as far as I know, it doesn't communicate with other languages much. Still, its OOP is much more desirable than that of C++, and welcome. I dislike C++ because its OOP implementation is obscure, yet it constantly forces you to know the exact underlying mechanisms to prevent bugs and bad class usage. OOP is detrimental if it can't abstract properly. The only reason I would use C++ would be for performance. For low-level access in OOP, I would much rather have a dedicated OOP language capable of inlining or linking with a lower-level language (though I can't think of one right now) than a half-assed OOP interface thrown over a low-level language.

Delphi (OOP Pascal) was fun for making 2D Windows jumping games, but I can't imagine it being used seriously.


----------



## Eevee (Nov 1, 2007)

I can't stand how much effort (or typing) Java requires to do everything.  It seems like a gigantic waste of my time.

We really need a new compiled language with a robust OO system.  Glancing over D again, I wonder if it might be a little better-suited for the role than C++ ever will be.

My experience with GUI Delphi apps has been..  well, that they are about as responsive as Java ones.  I only know about the actual syntax what little I have gleaned from brief descriptions of the language; it doesn't seem anything too terrible, just a bit antiquated and nearly forgotten at this point.


----------



## Pi (Nov 1, 2007)

I have love for 6 languages and hate almost everything else.
Ruby: Pure OO
Common Lisp: Pure macro expansion and malleable syntax makes for one hell of a language
Haskell: Pure functional programming
Erlang: Trivial concurrency and almost-pure logic.
Perl: Pure dirty evolution
Forth: Pure stack manipulation

Languages I'd like to like:
Smalltalk: Pure OO, nasty VM
C#: Pure OO, nasty VM
Prolog: Pure logic, but almost goes slightly over my head.
F#: Suffers from the same VM problem as all .NET languages, and the syntax is kind of weird.

Languages that get a fuck you:
Scheme: You took Common Lisp and ripped the unhygienic macro feature out of it. Fuck you.
PHP: Eevee's already covered this. Fuck you.
Python: Brain-damaged developer community thinks that all the arbitrary restrictions (lambda can only take one expression, what the fuck?) are ACTUALLY strong points in disguise. Fuck you.
Java: We pretend to be OO and statically typed but are wishy-washy about it! Until generics came out you had to cast things from Objects. No operator overloading lets people write stupid mistakes like bigInteger1 == bigInteger2 and requires that we write bint1.addTo(bint2.divideBy(bint3.multiplyBy(bint4).subtractFrom(bint5))), but strings inexplicably concatenate with +. Fuck you.

Also, to everyone who says "assembly" is a good language, fuck you. Assembly is not a language until you tell us which CPU you're writing it for. And since you're all using x86 hardware, no, your assembly language is not good. Go write some MIPS assembly and come back when your mind has been blown.


----------



## Pi (Nov 1, 2007)

Amaroq said:

> C++ also has better typechecking.



Just as a note, you don't get to talk about static/strong type-checking until you've used Haskell or ML. Come back when you've stopped drinking apple juice from the strong-typing equivalent of a sippy cup and have had some wine from the Haskell strong-typing elegantly-stemmed wine glass.



			
Amaroq said:

> One big example comes to mind of a rather strange thing you're allowed to do in C that C++ would never let you do:
> 
> int f()
> {
> ...



This is invalid C anyway, because you never declared a or b. Also, I think that behavior is in the spec; it probably has something to do with pre-ANSI C.


----------



## js58 (Nov 2, 2007)

> I can't stand how much effort (or typing) Java requires to do everything. It seems like a gigantic waste of my time.



It depends on what exactly you're referring to... Conversely, I find the language clear enough that I save a lot of time on comments. And it has a very large library, so a lot is already done for you. Not to mention automatic garbage collection vs. C/C++. Heck, I usually save time when I find I can code something in Java, and I think a lot of the employers around here who've conceded to it would agree.



> We really need a new compiled language with a robust OO system. Glancing over D again, I wonder if it might be a little better-suited for the role than C++ ever will be.



Personally, I think Java has a decent OO system at this point, but I'd agree we need a new language if only to replace C++ in the performance department.



> Also, to everyone who says "assembly" is a good language, fuck you



Bitter much?

I find assembly is a "good language" in the sense that it lets you do what you need to do when you know what you're doing and other languages would only get in the way. Getting around walls of type checking gets to be a pain when most of what you're doing is low-level. Even in C. Sometimes you just have to MOV it, quick and dirty.

But I find it depends on the assembler as well. Macros are a huge aid. I like FASM myself.


----------



## Pi (Nov 2, 2007)

js58 said:

> > Also, to everyone who says "assembly" is a good language, fuck you
> 
> 
> 
> I'm reading the first sentence of your reply and making a well-thought-out response to entirely the wrong problem.



Fixed it for you. Try reading the entire post next time.


----------



## Eevee (Nov 2, 2007)

js58 said:

> It depends on what exactly you're referring to... Because I find the language is clear enough that I save a lot of time on comments, conversely.


Casts to/from Object for something as mundane as db access and using methods for arithmetic are clear?



			
js58 said:

> And it has a very large library, so a lot is already done for you.


This also applies to C.  And C++.  And Perl.  And Python.  And, hell, even PHP.  Probably half my list.



			
js58 said:

> Not to mention automatic garbage collection vs C/C++.


I've seen people work miracles with garbage collection in C without too much trouble, and never have to worry about it again.


----------



## js58 (Nov 2, 2007)

Pi said:

> Fixed it for you. Try reading the entire post next time.



"And since you're all using x86 hardware"

I assumed you would stick to your own assumptions. Obviously I wouldn't have responded if I wasn't talking about x86. And FASM is x86, FYI.

In any case, "no, your assembly language is not good" does little to convince me, sorry. Neither does "you ain't seen nothin' 'til you've seen xxxxxx". Language Debate thread, please?



> Casts to/from Object for something as mundane as db access and using methods for arithmetic are clear?



Db access is lame for sure, but although it's lengthy, I don't see what's so unclear about methods for arithmetic where they're absolutely necessary. It's lame, but unclear?



> This also applies to C. And C++. And Perl. And Python. And, hell, even PHP. Probably half my list.



Except C/C++ libraries are horrible to use and prone to misuse, as has already been said. Java's are consistent and actively developed, even though sometimes bloated. Can't comment on the others.



> I've seen people work miracles with garbage collection in C without too much trouble, and never have to worry about it again.



I would like to see that. But I have trouble believing it is a widespread phenomenon that C programmers in general produce only as many memory leaks as Java programmers. In the end, it is simply one less bit of trivia to bother small application developers with.


----------



## Pi (Nov 2, 2007)

js58 said:

> Pi said:
> 
> 
> 
> ...



okay, so an architecture with no general-purpose registers (it lies about having 4 of them, because certain registers are affected in obscure ways by equally-obscure instructions), a segmented architecture (which fell out-of-vogue shortly after people realized that it was stupid), a stack-based floating point unit (which fell out-of-vogue shortly after people realized that it was stupid), an inconsistent instruction format, seemingly arbitrary restrictions placed on the operations you can do (certain instructions only source data from one register, or only store data in one register, or can't have immediate values, or can't access memory) is somehow GOOD?



			
js58 said:

> Neither does "you ain't seen nothin' 'til you've seen xxxxxx". Language Debate thread, please?


Fine. Remain unconvinced and steeped in what amounts to blind ignorance. I don't really care. 

Just don't be surprised when you find that you can't debate about a feature you've never really used in its purest form.


----------



## js58 (Nov 2, 2007)

Pi said:

> okay, so an architecture with no general-purpose registers (it lies about having 4 of them, because certain registers are affected in obscure ways by equally-obscure instructions), a segmented architecture (which fell out-of-vogue shortly after people realized that it was stupid), a stack-based floating point unit (which fell out-of-vogue shortly after people realized that it was stupid), an inconsistent instruction format, seemingly arbitrary restrictions placed on the operations you can do (certain instructions only source data from one register, or only store data in one register, or can't have immediate values, or can't access memory) is somehow GOOD?



No, no, I'm not trying to defend every decision Intel made regarding their processor limitations in the past decade or two; I hate the FPU stack as much as anyone, and I'd rather do those calculations in C if I can, for example. But the x86 ASM syntax is straightforward and efficient, and it fully exploits what the processor _can_ do, i.e. indexing, combined operations, etc., procedurally and without ambiguity, making plenty of low-level ops very easy to code. No, not all, but most. I don't know where you get the idea that the instructions are obscure; they're very rigidly and clearly documented, as all 32-bit code has been since the 386. You have to learn the language before you code in it!!! In any case, processor limitations are caught upon assembly, these are known constraints, they're straightforward, and they're like any other rules a language might have. And there are more than enough variants to easily get around any of them without that much more code.

In any case, many of the restrictions you list have no effect when programming Win32 PE files, for example. At this point in time, you can skip the FPU stack with SSE2 floating-point instructions, memory is treated as flat (no segments), and you can use most registers for most basic operations and combine them. All exceptions are documented strictly. It takes 2 seconds to look up. And if you're not happy with it, you just put it in a macro. An inconsistent binary instruction format hardly matters with an assembler!

If x86 ASM were as horrible as you describe, there's no way I would have chosen it over C/Pascal for over half my programs (no stupid jokes, please).


----------



## Eevee (Nov 2, 2007)

js58 said:

> Db access is lame for sure, but although being lengthy, I don't see what's so unclear about methods for arithmetic where it's absolutely necessary. It's lame, but unclear?


I don't know exactly how this works, but I envision:

(b.neg().add(b.pow(2).sub(a.mulBy(c).mulBy(4)).sqrt())).divBy(a.mulBy(2))

I would certainly call that unclear.



			
js58 said:

> Except C/C++ libraries are horrible to use and prone to misuse, as has already been said.


I didn't say anything about C libraries; I have never had a problem with them.  They tend to be simple and straight to the point.

The only problem I mentioned with C++ libraries is that the _built-ins_ suck.  Of course, given that "C++ library" implies reliance on C++ features and most C++ features are terrible, you may have a point there.



			
js58 said:

> But I have trouble believing it is a widespread phenomena that C programmers in general produce only as many memory leaks as Java programmers.


Oh, of course not; no more than C programmers produce as many null pointer dereferences as Java programmers.



			
js58 said:

> In the end, it is simply one less bit of trivia to bother small application developers.


Small application developers probably don't need to be using C _or_ Java.


----------



## Paul Revere (Nov 2, 2007)

I don't have much experience in HTML or PHP or any of that internet stuff.  I don't even know much about what it means.  But for application programming, I'd say the best platform is Visual C++ 6.0 Enterprise Edition.  I don't like .NET because you can decompile the executable into its source code, and programs made with Visual Studio 6 run on more computers.

And from what I understand, VC++ 6 is the standard in the video game industry.  Don't quote me on that, but it makes sense.  Writing a video game with .NET would be a little awkward, imo ...

EDIT: In other words, ALL BOW TO TEH MIGHTY VC++ 6.0!


----------



## Xenofur (Nov 2, 2007)

Something being "the standard" doesn't necessarily mean it's also good. A few years ago half the world was buying the shit produced by Britney Spears as well. 

Have to agree though that VC6 was pretty decent as an IDE, unless you tried to use anything MFC.


----------



## Eevee (Nov 2, 2007)

Paul Revere said:

> But for application programming, I'd say the best platform is Visual C++ 6.0 Enterprise Edition.


Oh, man.  Oldschool.  Hardcore.

I can't really stand Microsoft's IDEs, though, so I'd just as much avoid paying them out the ass and use Makefiles.  :V  C++ compilers are not difficult to find.



			
Paul Revere said:

> I don't like .NET because you can decompile the executable into it's source code


Well.  I'm no fan of .NET, but this isn't really true.  You can get something that certainly _looks_ like source code, but if it's anything like the Java decompilers, it will only be moderately more useful than decompiled C.

Also I'm one of those hippies who begrudgingly admits Stallman is right about everything despite being a raving lunatic, so your source's not being airtight doesn't earn all that much sympathy from me.


----------



## js58 (Nov 2, 2007)

> I don't know exactly how this works, but I envision:
> 
> (b.neg().add(b.pow(2).sub(a.mulBy(c).mulBy(4)).sqrt())).divBy(a.mulBy(2))
> 
> I would certainly call that unclear.



Okay, but that's only for big integers. You can use all the normal arithmetic operators on integers (32-bit) and longs (64-bit). Do you really need to use numbers greater than 64-bit that often? Wouldn't you then need wrapper functions/classes in C/C++ anyway?



> I didn't say anything about C libraries; I have never had a problem with them. They tend to be simple and straight to the point.



Sorry, I typed C/C++ automatically when I meant to type C++. However, I find C's libraries rather lacking myself, particularly regarding string handling: it's way too easy to cause buffer overflows. On the other hand, strings are handled almost automatically in Java. I also find the library rather small.



> Oh, of course not; no more than C programmers produce as many null pointer dereferences as Java programmers.



I wasn't talking about null pointers (although, speaking of which, they are much better handled in Java with its nice exception handling, whether you're in debug mode or not). I was talking about memory leaks: memory that hasn't been properly deallocated.
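To make the parenthetical concrete (a sketch of my own, not from the thread): a null dereference in Java raises a catchable `NullPointerException`, rather than the undefined behaviour of dereferencing a null pointer in C.

```java
public class NullDemo {
    public static void main(String[] args) {
        String s = null;
        try {
            s.length();  // dereferencing null
        } catch (NullPointerException e) {
            // Java turns this into a catchable exception with a stack trace,
            // debug build or not; the equivalent C dereference is undefined
            // behaviour and typically a crash.
            System.out.println("caught a NullPointerException");
        }
    }
}
```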



> Small application developers probably don't need to be using C or Java.



C, probably not, but Java? It's an internet-based language, meant for mobility, applets, client apps, etc.; I think it's pretty clear it was meant for small- to medium-size applications. And it's general-purpose! Sure, you can probably find me a language specifically suited for this and that, but Java accommodates generic programs very well without your having to spend time and resources learning every new language that comes along. And the language itself is very easy to learn.

Frankly, I'm _not_ a huge Java fan myself, and I don't think it's perfect. Heck, I absolutely hated the VM idea in the first place and resisted it for some time. But I believe that, above all, at the moment, it is one of the better standards quite simply because it permits you to write generic applications with minimal debugging time and maximum readability. It gets rid of a lot of the annoying low-level details in C/C++ and lets you focus on your OOP model. Reusability. It promotes it. That's why I'm inclined to defend it somewhat. I'll keep using it until I learn something better that promotes all of these concepts just as well and just as generally. I am interested in C#/.NET, but I haven't gotten around to it yet.



> Something being "the standard" doesn't necessarily mean it's also good. A few years ago half the world was buying the shit produced by Britney Spears as well.



Oh, I agree in the general sense, but I'd assume that if you were building an application that's to be maintained over a few years (i.e., as an employer), you'd want a language that's standard enough that every time you hire somebody, they don't have to re-learn the language or some obscure coding style from scratch, unless the task is really that specific.

In this case, I wouldn't have accepted Java just because "it's the standard". I can accept Java because it's a standard and it works well, generally, and it's productive and maintainable, after you get past the nitpicking. Heck, I might get a job because of it.


----------



## Pi (Nov 2, 2007)

js58 said:

> > I don't know exactly how this works, but I envision:
> >
> > (b.neg().add(b.pow(2).sub(a.mulBy(c).mulBy(4)).sqrt())).divBy(a.multBy(2))
> >
> ...



Why treat big integers any differently? This is OO. The entire point of an object-oriented language is to abstract things into a representation where we don't care about the underlying types.

In C, yeah, you'd need to use something gross like the GMP library and write negate(add(pow(b,2),sub(mul(mul(a,b),c) yaddita yaddita. But if you're using C++ (even with its otherwise shit OO model), hey, overloaded operators happen and you can write things that look like they make sense. Which is why I say that Java's not an OO language, because it doesn't provide these tools. Ruby automatically promotes Fixnums into Bignums when they get too big, and you can add Fixnums to Bignums without caring how it's stored.


----------



## Eevee (Nov 2, 2007)

js58 said:

> Okay, but that's only for big integers. You can use all the normal arithmetic operators on integers (32-bit) and longs (64-bit).


Oh good, so I have to use an entirely different paradigm depending on the _number of digits I need_.  How very OO.



			
js58 said:

> Wouldn't you then need wrapper functions/classes in C/C++ anyway?


No, I'd need operator overloading.



			
js58 said:

> However, I find C's libraries rather lacking myself, particularly regarding string handling.


http://www.google.com/search?q=c+string+library

Take your pick.



			
js58 said:

> On the other hand, strings are handled almost automatically in Java.


Almost.

On a side note, it really grates my nerves that almost every fucking language under the sun has decided that "addition" and "concatenation" are somehow the same operation.
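The gripe is easy to demonstrate in two lines of Java (my own sketch): `+` means addition for numbers but concatenation once a `String` appears, so left-to-right evaluation order changes the result.

```java
public class Concat {
    public static void main(String[] args) {
        // The ints are added first, then the result is concatenated:
        System.out.println(1 + 2 + "!");  // prints "3!"
        // Here the String comes first, so everything is concatenated:
        System.out.println("!" + 1 + 2);  // prints "!12"
    }
}
```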



			
js58 said:

> I find the library also rather small.


Er.  When I say "library" alone, I mean _the collection of every single library that exists for the language_.



			
js58 said:

> I wasn't talking about null pointers


I..  ah, nevermind.



			
js58 said:

> It's an internet-based language, meant for mobility, applets, client appls, etc.; I think it's pretty clear it was meant for small- to medium-size applications. And it's general-purpose! Sure you can probably find me a language specifically suited for this and that, but Java accommodates generic programs very well without having to spend time and resources learning every new language that comes along. And it's very easy to learn itself.


It's heavy, needlessly verbose, slow, and a memory hog.  These are not the sorts of things that make for good small applications.

I don't know what an "internet-based language" is; even bash has facilities for communicating with the Web.


----------



## js58 (Nov 2, 2007)

> Why treat big integers any differently? This is OO. The entire point of an object-oriented language is to abstract things into a representation that we don't care about the underlying types.



Hey, I won't claim to know why it's currently built that way. They probably just didn't think of it. But I've never had to use big integers so far, nor have I seen anyone with a need to. My guess is they'll incorporate it when there is actually a demand for it. It happened with generics, etc. Thankfully, the language is always under development; it has time to improve if it needs to. That's an advantage: it's well centralized, and standards catch on.



> In C, yeah, you'd need to use something gross like the GMP library and write negate(add(pow(b,2),sub(mul(mul(a,b),c) yaddita yaddita. But if you're using C++ (even with its otherwise shit OO model), hey, overloaded operators happen and you can write things that look like they make sense. Which is why I say that Java's not an OO language, because it doesn't provide these tools. Ruby automatically promotes Fixnums into Bignums when they get too big, and you can add Fixnums to Bignums without caring how it's stored.



I'm personally not too keen on operator overloading. In C++, they seem to encourage misuse, destroy abstraction, and confuse the heck out of operator precedence. The syntax might look like it makes sense, but half the time it goes horribly wrong in practice. For the moment, at least, I'd rather not have any at all than have it so horribly done and be forced to spend hours figuring out how other people assumed I was going to use their code with it. If a generic standard-enough language manages to pull them off right, or if Java evolves to incorporate it decently, then I might consider making it an essential part of my OO practices. But it's a syntax convenience more than anything, with only a slight improvement in readability; I'd be impressed if it actually affected your OO designs (on a higher level, I mean).

I don't know about Ruby, unfortunately, but that sounds like a good idea.



> When I say "library" alone, I mean the collection of every single library that exists for the language.



Oh, I was talking about the standard libraries.



> No, I'd need operator overloading.



You'd still need to write a class of some sort. And debug it (ughughugh...). The point was that varying data sizes are not built into C/C++.



> It's heavy, needlessly verbose, slow, and a memory hog. These are not the sorts of things that make for good small applications.



A good part of the industry would disagree with you there. There are compromises to be made. Raw speed vs portability. Readability/maintainability vs raw code length. Performance is hardly always the dominating factor when writing a small application. In fact, if your basic algorithms are well chosen, it can often be neglected, and Java still fares well in most cases (hey, even games are written in it, for cell phones, 3D web, etc.). And if I plan to modify my program a year from now, no matter how small, I'd like to be able to read it without having to write/maintain hundreds of lines of comments. And your employer would like to be able to read what you've written, too.



> I don't know what an "internet-based language" is



That's what it was designed for. I don't care that it uses internet facilities now; the point is simply that its original intended uses, such as small web applet development, are reflected in its design choices (see above).

I understand where you people are coming from, but now that I've actually used Java enough, I simply no longer consider its occasional flaws anywhere near as important as its advantages when it comes to putting a small- or medium-sized OO design for a generic program into practice. It's good enough that I'm not in a hurry to find an alternative. And I find its biggest flaws, even if they exist, are exaggerated and often justified in inappropriate context.

It's not my favorite language, but sometimes I'll read over an app I just wrote in it and go, "THANK GOD I didn't have to write this in C/C++". That's enough to make me see its uses.


----------



## Eevee (Nov 2, 2007)

js58 said:

> I'm personally not too keen on operator overloading. In C++, they seem to encourage misuse, destroy abstraction, and confuse the heck out of operator precedence.


Only because anyone who uses _all_ of C++ is first introduced to operator overloading in the boneheaded form of iostream.  Perl manages to get along with overloading in exactly the way it was intended: making the code do what I mean without my even realizing overloading is happening.



			
js58 said:

> For the moment, at least, I'd rather not have any at all than have it so horribly done and be forced to spend hours figuring out how other people assumed I was going to use their code with it.


...except the native operations and BigInteger come from the same place, so the operator use is equally likely to be screwy with both natives and bigs.



			
js58 said:

> If a generic standard-enough language manages to pull them off right


Perl, Python, Ruby, Lisp, Haskell, Smalltalk, Fortran, Perl 6, Prolog, D, Delphi, VB2005, C#, Eiffel, Algol, Ada...

You seem to be under the impression that any feature in Java but not C is rare.



			
js58 said:

> But it's a syntax convenience more than anything, with only a slight improvement in readability


And this is why I hate Java.  Seems like the entire language is built around the idea that representing the most fundamental ideas with 20-character strings and parentheses is somehow readable.  One of the things that really grates my nerves about PHP, too; both languages try to cram too much power into C's limited syntax, and I'm left feeling like I have to read or write a novel to get anything done.

(b.neg().add(b.pow(2).sub(a.mulBy(c).mulBy(4)).sqrt())).divBy(a.multBy(2))

(-b + sqrt(b**2 - 4 * a * c)) / (2 * a)



			
js58 said:

> Still need to write a class or other.


I am pretty sure such a thing would already exist in C++.



			
js58 said:

> A good part of the industry would disagree with you there.


Following the popularity of PHP, I came to the conclusion that a good part of the industry is dain-bramaged.



			
js58 said:

> Performance is hardly always the dominating factor when writing a small application.


I hack in the Perl family.  Sub-optimal performance doesn't exactly keep me up at night.  But I can *tell* when I'm running a Java application; the JVM might be speedy, but the GUI libraries are absolutely abysmal.  On the other hand, I never even suspected that e.g. the BitTorrent client was written in Python until I read about it much later.



			
js58 said:

> And if I plan to modify my program a year from now, no matter how small, I'd like to be able to read it without having to write/maintain hundreds of lines of comments. And your employer would like to be able to read what you've written, too.


If you're spending on average more time writing comments (not documentation) than code, then something is very wrong with either your language or your approach.



			
js58 said:

> What it was designed for. I don't care that it uses internet facilities now, but it's simply to point out that its original intended uses, such as small web applet application development, are reflected in its design choices (see above).


I don't see how Internet connectivity affects many design choices.  If a general-purpose language fundamentally cannot support Internet functionality well, it's not really general-purpose.



			
js58 said:

> It's not my favorite language, but some times I'll read over an app I just wrote in it and go, "THANK GOD I didn't have to write this in C/C++". That's enough to make me see its uses.


"Easier than C" is not exactly what I'd put at the top of a feature list.


----------



## js58 (Nov 2, 2007)

> ...except the native operations and BigInteger come from the same place, so the operator use is equally likely to be screwy with both natives and bigs.



"likely to be"? I'll reserve judgment until I've seen it, thank you. Not to mention most of what Java has done has been in an attempt to improve on C/C++, so on the contrary, I would be surprised if they attempted it conscious of the issues and messed it up. Their mindset is sort of conservative regarding syntax (but it does cave if demand is high enough and reasonable).



> Perl, Python, Ruby, Lisp, Haskell, Smalltalk, Fortran, Perl 6, Prolog, D, Delphi, VB2005, C#, Eiffel, Algol, Ada...



Like I said, a _generic standard-enough language_. I'm already thinking of looking into C# and Perl, like I also already said. But what's the rush? Even without this list, I assumed there were some languages capable of doing it right, but operator overloading is far from a high enough priority to immediately ditch an otherwise accomplished and established language which already satisfies all of my design issues and then some, not to mention one I will most likely use professionally.

Seriously though, "Prolog"? "Fortran"? That's a bit of a stretch, don't you think?



> And this is why I hate Java. Seems like the entire language is built around the idea that representing the most fundamental ideas with 20-character strings and parentheses is somehow readable. One of the things that really grates my nerves about PHP, too; both languages try to cram too much power into C's limited syntax, and I'm left feeling like I have to read or write a novel to get anything done.



On the contrary, I find it partly helpful, for Java at least. It encourages you to factor and separate expressions onto separate lines, which is _much_ easier to debug, not to mention much easier to follow when reading other people's code. And shortcuts exist, FYI (sometimes to my dismay).

In any case, I have yet to encounter your arithmetic caricature in practice. Please don't tell me you're the type to write everything on a single line like that.
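To give the factoring argument a concrete shape (my own sketch, not from the thread; `isqrt` is a hypothetical helper, since BigInteger offered no square-root method at the time), here is the quadratic-formula caricature split into named intermediate steps:

```java
import java.math.BigInteger;

public class Quadratic {
    // The one-liner debated above, factored so each step has a readable name
    // and a place to set a breakpoint. Computes (-b + sqrt(b^2 - 4ac)) / (2a).
    static BigInteger root(BigInteger a, BigInteger b, BigInteger c) {
        BigInteger four = BigInteger.valueOf(4);
        BigInteger two = BigInteger.valueOf(2);
        BigInteger discriminant = b.multiply(b).subtract(four.multiply(a).multiply(c));
        BigInteger sqrtDisc = isqrt(discriminant);
        return b.negate().add(sqrtDisc).divide(two.multiply(a));
    }

    // Hypothetical helper: floor integer square root via Newton's method,
    // a common workaround for BigInteger's missing sqrt.
    static BigInteger isqrt(BigInteger n) {
        BigInteger x = n;
        BigInteger y = x.add(BigInteger.ONE).shiftRight(1);
        while (y.compareTo(x) < 0) {
            x = y;
            y = x.add(n.divide(x)).shiftRight(1);
        }
        return x;
    }

    public static void main(String[] args) {
        // x^2 - 5x + 6 = 0 has roots 2 and 3; this computes the larger one.
        System.out.println(root(BigInteger.ONE,
                                BigInteger.valueOf(-5),
                                BigInteger.valueOf(6)));  // prints 3
    }
}
```

Whether the named intermediates or the chained one-liner is more readable is exactly the disagreement in this thread.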



> If you're spending on average more time writing comments (not documentation) than code, then something is very wrong with either your language



I agree.

I'd like to think my "approach" is to write efficient code that's clear enough AND commented enough for the next person who has to work on it to be able to follow. I hope to god it would be part of yours, too. I'll write the amount of comments necessary to ensure that, thank you, which I find depends highly on the obscurity of the language's syntax and semantics in complex code.



> I don't see how Internet connectivity affects many design choices. If a general-purpose language fundamentally cannot support Internet functionality well, it's not really general-purpose.



Think Applets. You want your website to use Applets, you've got to make sure they work for anybody accessing the site. Any browser, any OS, any machine, any device. -> VM design -> Portability/Compatibility.



> "Easier than C" is not exactly what I'd put at the top of a feature list.



It's an example. An example. And no, not "easier than C", make it, "much more time-efficient in the long and short term".


----------



## Oni (Nov 2, 2007)

Wow, one can certainly learn quite a bit of general programming language knowledge by reading these discussions here at the Fur Affinity forum.

*continues to read and fish out the facts*

;d


----------



## Pi (Nov 3, 2007)

js58 said:

> Seriously though, "Prolog"? "Fortran"? That's a bit of a stretch, don't you think?



No. If you're doing anything with physics, a lot of your real work is going to be done in Fortran 90, because the compilers for that language know how to optimize the living cocksmack out of array/vector/matrix code. 

Prolog is a small bit of a stretch, but it is still useful for AI programming, and the syntax of Erlang, an up-and-coming language, is fairly well based on Prolog's.


----------



## Eevee (Nov 3, 2007)

I find it hilarious that he picked on Fortran and Prolog rather than, say, Perl 6.

Fortran is obviously standard-enough that all programming geeks have heard of it.  Prolog is..  well, it's just cool.

Which reminds me: I don't understand this veiled implication that people come with a preallocated number of programming language slots they can fill.  The Web is flooded by PHP junkies who don't know anything because they learned print() and thought that was the end of it.  Learning new languages is not all that difficult and never a waste of time.  Even Brainfuck will give you a new way of looking at problems.


----------



## js58 (Nov 4, 2007)

Pi said:

> No. If you're doing anything with physics, a lot of your real work is going to be done in Fortran 90, because the compilers for that language know how to optimize the living cocksmack out of array/vector/matrix code.



I didn't know the 90s version was being used for physics, which is neat I guess, but that doesn't change the fact I haven't seen it mentioned in a job listing in the last... um... ever. I guess I'll have to take your word for it, for now.



			
Eevee said:

> Prolog is..  well, it's just cool.



Oh, sure, it's "cool", but would you want to write the majority of your programs in it, interfaces and all? That was the point. Something _generic_.

I haven't seen any Perl 6 nor a few of the others, so I thought it would be prudent not to comment on those.



			
Eevee said:

> Which reminds me: I don't understand this veiled implication that people come with a preallocated number of programming language slots they can fill.  The Web is flooded by PHP junkies who don't know anything because they learned print() and thought that was the end of it.  Learning new languages is not all that difficult and never a waste of time.  Even Brainfuck will give you a new way of looking at problems.



I agree completely, for educational purposes and without considering time restrictions. But it's one thing to learn a new language to build an application from scratch, and another to have to rewrite all the reusable code you've already written, or half a program, just to exploit a few extra language features, a.k.a. starting over. Earlier, I was thinking more in terms of a product line, where it's the marginal costs that are most relevant. Switching languages can be costly, and you do it only when it's absolutely worth it. The same applies to me personally, since I write a lot of programs; nothing sucks more than having to rewrite all the functions you wrote explicitly to be reusable in the first place just because someone decided it would be best if the project was done in marginally-improved x#++- (assuming you can't make cross-language libraries out of them, of course).

Following the same motivations, I'm personally willing to learn a new language anytime, but 1) do I have the time? and 2) do the educational and professional prospects justify that time investment, compared to what I already have in both fields? For example, I'm eager to look at C#, but from what I hear, it's very much like Java and C/C++, so the only real incentive I have to try it comes from the potential job opportunities. If it weren't for its popularity, honestly, I would most likely ignore it. On the other hand, some languages I'd be willing to learn purely for their educational value even if they are hardly used (as I have with assembly, for example, which was definitely worth it), but those would take the most time, and at the moment, that time should be more urgently spent on languages which might actually land me a job.

My life story

(and yes, I'm talking about time management while writing walls of text about programming languages for random people on a furry forum, so sue me )


----------



## Eevee (Nov 4, 2007)

js58 said:

> I haven't seen any Perl 6 nor a few of the others, so I thought it would be prudent not to comment on those ones.


It's funny because there isn't actually a finished Perl 6 interpreter yet  8)



			
js58 said:

> I agree completely, for educational purposes and without considering time restrictions. But it's one thing to learn a new language to build an application from scratch, and another to have to rewrite all the reusable code you've already written or half a program just to exploit a few extra language features, a.k.a. starting over.


It's not starting over; once you know a language, _any_ other is easier to pick up, and it just gets easier the more you know.  And, of course, once you have a cursory knowledge of it, it's easy to get to the point where you can write programs without your nose constantly in a manual.



			
js58 said:

> Earlier, I was thinking more in terms of a product line, where it's the marginal costs that are most relevant. Switching languages can be costly, and you do it only when it's absolutely worth it.


Oh, of course.  Plenty of software written in COBOL is still around because it's too expensive and risky to port it to something else.


----------



## Pi (Nov 4, 2007)

js58 said:

> the x86 ASM syntax is straightforward and efficient



Excuse me? Are we talking about the same x86?



			
js58 said:

> In any case, many of the restrictions you list have no effect when programming Win32 PE files, for example. At this point in time, you can skip the FPU stack with SSE2 floating-point instructions, memory is treated as flat (no segments), and you can use most registers for *most* basic operations and combine them. *All exceptions* are documented strictly. *It takes 2 seconds to look up*. And if you're not happy with it, you just put it in a macro. An inconsistent binary instruction format hardly matters with an assembler!



(Emphasis mine)

Wow just wow! Intel added features to their instruction set that other architectures have had for 20 years! Except for, oh wait, if you want to be portable to computers made, say, 10 years ago, you have to target i586, and you don't get to rely on SSE! It's great! PLUS, you have to have the manual at hand to figure out that MOST operations work in a sane manner, except for the one you're trying to use at the moment! Such a wonderful, consistent, efficient architecture!



			
js58 said:

> If x86 ASM were as horrible as you describe, there's no way I would have chosen it over C/Pascal for over half my programs (no stupid jokes, please).



And what advantages did you get out of coding your apps in assembly?


----------



## js58 (Nov 4, 2007)

Eevee said:

> It's funny because there isn't actually a finished Perl 6 interpreter yet



Oh now that was low! -_-



			
Eevee said:

> It's not starting over; once you know a language, any other is easier to pick up, and it just gets easier the more you know. And, of course, once you have a cursory knowledge of it, it's easy to get to the point where you can write programs without your nose constantly in a manual.



That's certainly true; but what I meant was (perhaps I used the wrong words) that you can't recover the man-hours already spent writing the code either way. If you switch languages, you might have to rewrite all your reusable code from scratch, AND re-debug it from scratch. It's prone to introducing the worst kinds of bugs, much like the COBOL example you evoke. If you wrote your original design and algorithms fairly independently of the first language, and the second language is different enough from the first, then you might even be doubling your total man-hours. That's pretty close to starting over, in a way.

But in the context you used for the expression, I can't argue with you there.



			
Pi said:

> (Emphasis mine)
> 
> Wow just wow! Intel added features to their instruction set that other architectures have had for 20 years! Except for, oh wait, if you want to be portable to computers made, say, 10 years ago, you have to target i586, and you don't get to rely on SSE! It's great! PLUS, you have to have the manual at hand to figure out that MOST operations work in a sane manner, except for the one you're trying to use at the moment! Such a wonderful, consistent, efficient architecture!



What I mean is, these details are pretty negligible when you're doing the actual coding. You look it up, it takes 2 seconds, you find out it applies to most of the other exceptions in the same fashion, it's generalizable, you remember it the next time, and there aren't a million of them; it's really not hard. If it were somehow an impediment to processor performance or to my coding in it, then I might object. I don't expect it to be 100% perfect; I expect it to work and live up to its intended purpose, that is, to balance readability against performance and low-level access, as assembly is meant to do, without my having to take huge detours. And it does that just fine. Considering it's the most popular and cost-effective architecture out there, I'd say coding in it is rather easy, and so the global performance gains and cost-accessibility easily justify whatever exceptions might have been introduced into the architecture, in this sense.

SSE2 is becoming a standard in the Win32 environment. C/C++ compilers give you the option to produce all floating-point code with it, and apps are catching on. No processor ships without it now. And please, you simply _do not_ write in ASM for compatibility, and if you're expecting that, you're on the wrong track completely. It's useful for low-level manipulation of binary data, odd data formats, system calls, etc. Like I said, if I absolutely need complex equations for the FPU stack -> code it in C and call it in a DLL or something. Otherwise, right now, SSE2 is popular enough that you can assume most people will have it, and it will only get more so (there was even talk that the FPU stack was going to get ditched completely, though I don't know what happened with that).

Yes, it's lame they didn't come out with it sooner to replace the FPU stack, but as I recall it, the stack was a performance decision on a bottleneck component. So if you're asking me to justify their decision, well, I wish they had found a better way, but if it were to have sacrificed performance or severely affected costs, then I'm fine with it having been implemented this way. In any case, it's really the only major design factor for the x86 line that I actually have a problem with (and, to note, it IS still very usable for basic calculations; it's for more complex calculations and condition testing that it mostly becomes a pain).



			
Pi said:

> And what advantages did you get out of coding your apps in assembly?



Easy/actually possible bridging with other languages and APIs. Smaller file/code size and memory footprint vs. compiled applications. Faster (one-step) assembly with virtually no overhead (linking, etc.). No bloody awful type checking where it only gets in the way. No restrictions on data formats. In general, more control over formatting and parameters. And of course, the educational factor. Like I said, it's mostly comparable to C when using macros, not requiring that much more code, but it opens up a few more doors in case you need them (and most of the C library functions you can easily patch into via DLLs or even libraries shipped with the assemblers, such as MASM). Very manageable for small- and even medium-sized applications (assuming you can modularize properly, as with any other language). And debugging is fairly easy with OllyDbg, since it follows fundamentally the same syntax. The only case where I definitely prefer C is when complex mathematical expressions are involved, or if it is a very general program requiring no real low-level access (in which case I'll probably write it in Java anyway).


----------



## Pi (Nov 4, 2007)

"[the FP stack is] really the only major design factor for the x86 line that I actually have a problem with"

I'm sorry to hear that, because it suggests to me that you have not seriously considered machine architecture from an objective standpoint. Maybe you should try looking at other architectures and seeing how horrible the x86 really is compared to, say, Power5.

* Easy/actually possible bridging with other languages and APIs.

Because that's entirely unique to assembly:

```
--haskell    
foreign import ccall "buzzlib.h buzz" my_buzz :: Int -> Ptr (Int) -> IO ()
;;; common lisp
  (defcfun "curl_global_init" curl-code
    (flags :long))
# Ruby
require 'dl/import'
module LIBC
    extend DL::Importable
    dlload "libc.so"
    extern "int strlen(const char *)"
end
```

* Faster (one-step) assembly with virtually no overhead (linking, etc.).

That's not really an advantage unless you're on a really fucking slow machine.

* No bloody awful type checking where it only gets in the way.

ugh. no. shut up. You don't get to talk about strong typing or type checking until you've used it. C and Java's type mechanisms are godawful. Go write some haskell or OCaml, or go the opposite direction and say fuck types altogether and start writing Ruby, where you program according to the operations on your data.

* No restrictions on data formats. In general, more control over formatting and parameters.

Wow. I don't even know what you mean by that. Assembly is about as strict in data formats as you can get. You have, on the x86, numbers, smaller numbers, and even smaller numbers (as well as pretending the smallest class of numbers are letters). And floating-point numbers.

Do you mean output formatting? Because unless you're calling out to another library, in which case you're only tangentially using assembly, you can only output numbers, smaller numbers, even smaller numbers, or pretend the latter is a letter.

If you're wanting to interface with other code, then no, you don't have control over parameters. You HAVE to adhere to your architecture and OS calling convention. Contrast with a Ruby function:

```
def bizarre(a=0,b="wat", *rem, &block)
    puts a.to_s.upcase
    puts b.reverse
    yield if rem[2] == "contrived" and block_given?
end
bizarre # prints out "0\ntaw\n"
bizarre(1) # prints out "1\ntaw\n"
bizarre("qqq","abc") # prints out "QQQ\ncba\n"
bizarre(1,[1,3,4],:fuck, [], "contrived") { puts "wonky blocky" } # prints out "1\n4\n3\n1\nwonky blocky\n"
```

Sure, it's a contrived example. But I've had to write a slightly-less-complicated function that takes two mandatory arguments, an arbitrary number of arguments after those, and an optional block! Which I will reproduce here for the record:


```
class << Array
    def multi(x,y,*args, &block)
        if args.length > 0 and block_given?
            raise ArgumentError, "wrong number of arguments (#{args.length + 2} for 2)"
        elsif args.length > 1 and not block_given?
            raise ArgumentError, "wrong number of arguments (#{args.length + 2} for 3)"
        end

        Array.new(x) do
            if block_given?
                Array.new(y, &block)
            else
                Array.new(y, args[0])
            end
        end
    end
end

#usage:
Array.multi(5,5,0)
# => [[0, 0, 0, 0, 0], [0, 0, 0, 0, 0], [0, 0, 0, 0, 0], [0, 0, 0, 0, 0], [0, 0, 0, 0, 0]]
Array.multi(5,5) {""}
# => [["", "", "", "", ""], ["", "", "", "", ""], ["", "", "", "", ""], ["", "", "", "", ""], ["", "", "", "", ""]]
```


----------



## net-cat (Nov 4, 2007)

Damn. Ruby looks a lot like Python...

Here's the most fun language ever.

```
library STD;
use STD.textio.all;
library IEEE;
use IEEE.std_logic_1164.all;
use IEEE.std_logic_textio.all;


entity alu_32 is
  port(inA    : in  std_logic_vector (31 downto 0);
       inB    : in  std_logic_vector (31 downto 0);
       inst   : in  std_logic_vector (31 downto 0);
       result : out std_logic_vector (31 downto 0));
end entity alu_32;


architecture schematic of alu_32 is   
  signal aresult : std_logic_vector (31 downto 0);
  signal bresult : std_logic_vector (31 downto 0);
  signal andresult : std_logic_vector (31 downto 0);
  signal mresult : std_logic_vector (31 downto 0);
  
  signal subop   : std_logic;
  signal cmplop  : std_logic;
  signal sllop   : std_logic;
  signal srlop   : std_logic;
  signal andop   : std_logic;
  signal mulop   : std_logic;
  
  signal scop    : std_logic;
  signal slop    : std_logic;
  signal notInB  : std_logic_vector (31 downto 0);
  signal rrop    : std_logic;
  signal sub_subop : std_logic;
  signal sub_cmplop : std_logic;
  signal sub_srlop : std_logic;
  signal sub_sllop : std_logic;
  signal sub_andop : std_logic;
  signal sub_mulop : std_logic;
  signal muxBct1 : std_logic;
  signal muxBct2 : std_logic;
  signal adderB  : std_logic_vector (31 downto 0);
  
begin  -- schematic
  --
  --   REPLACE THIS SECTION FOR PROJECT PART 1
  --   (add the signals you need above "begin"
  --
  
  -- subop, cmplop, sllop, srlop, andop

  
  chk_rrop: entity WORK.equal6 port map (	inst  => inst(31 downto 26),
											test  => "000000",
											equal => rrop);
  
  chk_sub_subop: entity WORK.equal6 port map (	inst  => inst(5 downto 0),
												test  => "100010",
												equal => sub_subop);
  
  chk_sub_cmplop: entity WORK.equal6 port map (	inst  => inst(5 downto 0),
												test  => "000101",
												equal => sub_cmplop);
  
  chk_sub_srlop: entity WORK.equal6 port map (	inst  => inst(5 downto 0),
												test  => "000011",
												equal => sub_srlop);
  
  chk_sub_sllop: entity WORK.equal6 port map (	inst  => inst(5 downto 0),
												test  => "000010",
												equal => sub_sllop);
  
  chk_sub_andop: entity WORK.equal6 port map (	inst  => inst(5 downto 0),
												test  => "000100",
												equal => sub_andop);
												
  chk_sub_mulop: entity WORK.equal6 port map (	inst  => inst(5 downto 0),
												test  => "011000",
												equal => sub_mulop);
  
  -- subop, cmplop, sllop, srlop, andop
  
  subop <= sub_subop and rrop;
  cmplop <= sub_cmplop and rrop;
  srlop <= sub_srlop and rrop;
  sllop <= sub_sllop and rrop;
  andop <= sub_andop and rrop;
  mulop <= sub_mulop and rrop;
  
  andresult <= inA and inB;
  
  notInB <= not inB;
  scop <= subop or cmplop;
  muxA: entity WORK.mux_32 port map (	in0    => inB,
										in1    => notInB,
										ctl    => scop,
										result => adderB);
  
  muxBct1 <= andop or mulop;
  muxBct2 <= sllop or srlop or mulop;
  muxB: entity WORK.mux32_4 port map (	in0    => aresult,
										in1    => andresult,
										in2    => bresult,
										in3    => mresult,
										ct1    => muxBct1,
										ct2    => muxBct2,
										result => result);

  shift: entity WORK.bshift port map (	left    => sllop,
										logical => '1',
										shift   => inst(10 downto 6),
										input   => inB,
										output  => bresult);
  
  adder: entity WORK.add32 port map(a    => inA,
                                    b    => adderB,
                                    cin  => subop,
                                    sum  => aresult,
                                    cout => open);

  mult: entity WORK.pmul16 port map(	a => inA(15 downto 0),
										b => inB(15 downto 0),
										p => mresult);
										

end architecture schematic;  -- of alu_32
```


----------



## Pi (Nov 4, 2007)

net-cat said:

> Damn. Ruby looks a lot like Python...
> 
> Here's the most fun language ever.
> (verilog? vhdl? elided)



Only sort of. Python programmers would have a fit with the $, @, and @@ sigils, which mark a variable's scope (global, instance, and class, respectively). (But they have no problem with utterances like __slots__ = ('foo',) for some reason, and also somehow think those are more readable than attr_accessor :foo.)
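
For reference, a minimal sketch of what those sigils actually mark, plus attr_accessor (the Point class here is invented for illustration):

```ruby
$origin_label = "0,0"      # $  = global variable

class Point
  attr_accessor :x, :y     # generates x, x=, y, y= accessors for @x and @y

  @@count = 0              # @@ = class variable, shared by all instances

  def initialize(x, y)
    @x = x                 # @  = instance variable
    @y = y
    @@count += 1
  end

  def self.count
    @@count
  end
end

pt = Point.new(1, 2)
pt.x = 5                   # setter generated by attr_accessor
puts pt.x
puts Point.count
puts $origin_label
```

Python expresses the same distinctions through naming conventions and class machinery instead of sigils, which is exactly the readability argument being had here.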


----------



## net-cat (Nov 4, 2007)

VHDL. Part of one of my projects for a Computer Architecture class I took a couple of semesters back.

I don't know. I don't really give a fuck what Python programmers' opinions are.

As far as I'm concerned, a programming language is a programming language.


----------



## Pi (Nov 4, 2007)

net-cat said:

> VHDL. Part of one of my projects for a Computer Architecture class I took a couple of semesters back.
> 
> I don't know. I don't really give a fuck what the Python writer's opinions are.
> 
> As far as I'm concerned, a programming language is a programming language.



True, but some programming languages are fun and others make you fight them uphill to get anything done.


----------



## net-cat (Nov 4, 2007)

There is that.

Of course, different languages are designed for different things. I wouldn't try to code a microcontroller in PHP or Perl any more than I'd try to do a website in Assembler or C.


----------



## Eevee (Nov 5, 2007)

I would try both of those.


----------



## Pi (Nov 5, 2007)

net-cat said:

> There is that.
> 
> Of course, different languages are designed for different things. I wouldn't try to code a microcontroller in PHP or Perl any more than I'd try to do a website in Assembler or C.



Actually, there was an FPGA that was programmed with and in Haskell. "The Reduceron" - http://www-users.cs.york.ac.uk/~mfn/reduceron/

One of my pet topics in architecture is efficient execution of non-imperative (functional, logic) languages. This is probably because I have one of the only working TI Explorer Lisp machines.


----------



## js58 (Nov 5, 2007)

Pi said:

> ugh. no. shut up. You don't get to talk about strong typing or type checking until



You had me interested up until that point.

You know, you seem to be remarkably well-versed on the topic of programming languages (probably more than I am, I'm not ashamed to say), and you put forth very interesting and educated arguments, including the most recent ones. That makes it all the more of a shame that you insist on articulating them with the wit of a 12-year-old. I'm turned off. Good day.


----------



## Pi (Nov 5, 2007)

js58 said:

> Pi said:
> 
> 
> 
> ...



And you seem to be remarkably well-versed in reading the first few sentences of something and stopping for arbitrary and baseless reasons. Have fun. 8)

As well, I feel that I've earned the right to articulate my thoughts using whichever sense of wit I feel appropriate. Having BEEN in the position of mindlessly drooling over some feature or language while having no sense of the design behind it, and then LEARNING just why I was horrendously wrong tends to make one feel particularly entitled.

Think of it this way: When you've demonstrated that you're at the same intellectual level as myself in regards to the topics we are discussing, then you can complain when I treat you as anything less. If that makes me an elitist, then so fucking be it.


----------



## js58 (Nov 6, 2007)

Whatever level of intellectual superiority (or experience) you may claim to have (you could be Stephen Hawking, for all I care) has little to no bearing on the level of respect you owe in a thread where it was _specifically_ requested by the OP, let alone in the general case, presumably to avoid this exact type of situation. This is a _language debate thread_, and no one's personal pissing contest. If you honestly see unilateral arrogance as a "baseless reason" for my annoyance, then I sincerely hope you enjoy the company of your ego.

To everyone else: I do apologize for interrupting the conversation in this manner; I realize it is unfruitful and I should have expected no more. Otherwise, I had enjoyed it thus far and I hope you will continue.


----------



## net-cat (Nov 6, 2007)

Eevee said:

> I would try both of those.



Well, writing a website in Assembler or C would certainly be possible, since everything basically gets reduced to machine code at some point anyway.

I'd be interested to see a Perl program compiled down to work on a microcontroller with 2KB of RAM and 32KB of Flash. (Keeping in mind that you'd also need to cross-compile all the requisite libraries and whatnot. I'd be shocked if it were possible, although it'd be interesting to see a processor with a built-in Perl interpreter.)



			
Pi said:

> Actually, there was an FPGA that was programmed with and in Haskell. "The Reduceron" - http://www-users.cs.york.ac.uk/~mfn/reduceron/


That doesn't surprise me. We touched on that sort of thing in my FPGA class. (Although the chip we're working with just takes a bitstream compiled from Verilog or VHDL.)


----------



## Pi (Nov 6, 2007)

js58 said:

> The level of intellectual superiority (or experience) which you may claim to have - you could be Stephen Hawking, for all I care - has little to no relation to the level of respect that you would give in a thread where it was _specifically_ requested by the OP, let alone the general case, presumably to avoid this exact type of situation.


So, you're saying that I should respect your opinions even though they're demonstrably false?



			
js58 said:

> This is a _language debate thread_, and no one's personal pissing contest.



Yet you choose to quite literally ignore the debate and start this pissing contest with me over your offense regarding my wit.



			
js58 said:

> If you honestly see unilateral arrogance as a "baseless reason" for my annoyance, then I sincerely hope you enjoy the company of your ego.



Oh, the old "ur mean so u dont have frends" rebuttal. Cute. For what it's worth, the friends I do have are all people that I don't have to get arrogant with, or understand when I do.

Either way, you veered off topic in a bleeding-heart manner instead of sucking it up. Are we going to get back to the language debate, or are you going to keep trying to piss on me about how I'm an arrogant ass, which I have already freely admitted?


----------



## Pi (Nov 6, 2007)

net-cat said:

> I'd be interested to see a Perl program compiled down to work on a microprocessor with 2KB of RAM and 32KB of Flash. (Keeping in mind, that you'd also need to cross compile all the requisite libraries and whatnot. I'd be shocked if it were possible. Although it'd be interesting to see a processor with a built in Perl interpreter.)


I'm not sure how big microperl is, but hey, it might do.



			
net-cat said:

> Pi said:
> 
> 
> 
> ...



I'm not sure how the HDL that The Reduceron used worked, but I'd imagine that it would probably compile down to Verilog/VHDL at some point anyway.

(edited to fix quoting)


----------



## js58 (Nov 6, 2007)

Pi said:

> So, you're saying that I should respect your opinions even though they're demonstrably false?



There is _no reason_ to be provocatively disrespectful in a thread like this, whether anyone is right or wrong. I cannot accept scientific debate and (dis)proof being used as a conduit for flagrant insults.



			
Pi said:

> Yet you choose to quite literally ignore the debate and start this pissing contest with me over your offense regarding my wit.



->


			
Pi said:

> that get a fuck you





			
Pi said:

> Fuck you.





			
Pi said:

> Fuck you.





			
Pi said:

> Fuck you.





			
Pi said:

> , fuck you.





			
Pi said:

> Come back when you've stopped drinking apple juice





			
Pi said:

> ugh. no. shut up.


...

The pissing contest was quite clearly there all along, the only missing detail being that it was a one-way endeavor up until now. Please, tell me there is no way you could have easily reworded any of those into a respectful and non-provocative form.



			
Pi said:

> For what it's worth, the friends I do have are all people that I don't have to get arrogant with, or understand when I do.



Precisely. We are not necessarily those people. In fact, it should be blindingly obvious that few of us are likely to be among those people. I hardly care whether you're genuinely an asshole or not, nor how you are with your friends, but I expect you, and anybody intending to post in a civil debate thread such as this one, to know when is the time and place for these vents. Do you treat _everybody_ you meet this way?

Note: I wasn't trying to comment about your friends, and I'm sorry if it came off that way. I meant it in the sense that I would surely not be part of the company in question, and it was badly worded at that.



			
Pi said:

> Either way, you veered off topic in a bleeding-heart manner instead of sucking it up.



I am sorry I had to bring it up, but then I see no reason why I should have to "suck up" any of this taunting when it was clearly emphasized that it would have no place in this thread to begin with. Damned if I do, damned if I don't.


----------



## Pi (Nov 6, 2007)

js58 said:

> Pi said:
> 
> 
> 
> ...



Well, sorry that the world isn't a great big happy place where we can all get along with puppies and flowers. What's your point?



			
js58 said:

> Pi said:
> 
> 
> 
> ...



But I don't respect the things I said "fuck you" to. So instead of taking offense at how I don't respect them, prove (by debating my arguments instead of how I phrase them) that they're worth respecting.



			
js58 said:

> (personal bullshit elided)
> I am sorry I had to bring it up, but then I see no reason why I should have to "suck up" any of this taunting when it was clearly emphasized that it would have no place in this thread to begin with. Damned if I do, damned if I don't.



The OP hasn't come back to this thread. If you have an actual complaint about my conduct, there are places other than this thread where you can lodge it. Can we get back to the programming language debate instead of harping on how much of a (self-admitted, twice) asshole and elitist I am?


----------



## js58 (Nov 6, 2007)

Pi said:

> What's your point?



The world's not perfect; that's hardly a reason to invite its flaws to every occasion. This is a localized argument. The point is to _exclude_ normal elements which may oppress or discourage the free exchange of ideas, right or wrong, _not promote_ them.



			
Pi said:

> So instead of taking offense to how I don't respect them, prove (via debating my arguments, instead of how I phrase them) they're worth respect.



Unfortunately, you've missed the point. Civil arguments are defined by the way they are phrased, regardless of content. Blatant offense does not help (dis)prove the validity or, as you might put it, the "worthiness of respect" of an argument; it only constrains the future flow of ideas from the affected parties and others, however absurd you believe those ideas might be, whether in fact or from your point of view. There is an overall net positive difference between saying something such as "I whole-heartedly disagree" and something such as what has been enumerated here multiple times.



			
Pi said:

> Can we get back to the programming language debate



Yes, but without teen-years swear statements every three lines perhaps, if that's not too much to ask? Seriously, is it? You tell me to obtain moderator approval, but I ask you, by god, is that really necessary?

Now that it crosses my mind (perhaps a little late), if you have A.D.D. or some other form of disability which makes this request impossible for you, please P.M. me and I will understand - but I don't buy "I'm an elitist asshole" as an excuse. I know a few elitist assholes (trust me), and while they manage to get on my nerves, they can usually recognize the absurdity and futility of personal disrespect in a debate.


----------



## Pi (Nov 6, 2007)

js58 said:

> (stuff that I just don't want to respond to)
> Now that it crosses my mind (perhaps a little late), if you have A.D.D. or some other form of disability which makes this request impossible for you


No. Get over it, and no. Welcome to the Real World.


----------



## Rilvor (Nov 6, 2007)

I'm going to ask that no further personal arguments be brought up in this thread, as that is not its intended purpose. Please refrain from thread derailment and from personal arguments and attacks. Thanks.

We now return to our regularly scheduled Language Debate.


----------



## Rostam The Grey (Nov 6, 2007)

Eevee said:

> Paul Revere said:
> 
> 
> 
> ...



I haven't done a lot with MSIL, but it's my understanding that it can be decompiled very easily into useful .NET code, unlike previous decompilers I've used, where the resultant code was very crappy and hard to understand, primarily because the compilers were generating machine code and applying their own "optimized" versions of things.


----------



## Rostam The Grey (Nov 6, 2007)

Let's do physics in Scheme!!!


----------



## Pi (Nov 6, 2007)

Rostam The Grey said:

> Eevee said:
> 
> 
> 
> ...



I don't think that's too great a reason to like or dislike something -- "the language doesn't promote security by obscurity" is what that argument boils down to.


----------



## Rostam The Grey (Nov 6, 2007)

Pi said:

> I don't think that's too great a reason to like or dislike something -- "the language doesn't promote security by obscurity" is what that argument boils down to.



Agreed. For me, it doesn't matter. I don't care if someone decompiles my work and I like C# enough that I would still use it even if I cared.


----------



## DragonTrew (Nov 7, 2007)

COBOL, people! You're missing COBOL!

OK, OK, it's not a desktop/server programming language. But our good old MAINFRAMES are still using it! Almost EVERY money-based industry (banks, stock exchanges, and many others) will not forget about their mainframes that soon...

But I personally love to program in Java and C... SQL is always on the go, and assembler is "too much for my patience"...


----------



## Eevee (Nov 7, 2007)

Java, C, and assembly are not the only languages on the planet  D:

I didn't mention COBOL because I'm not familiar enough with its syntax beyond "lol".


----------



## DragonTrew (Nov 7, 2007)

Eevee said:

> Java, C, and assembly are not the only languages on the planet D:
> 
> I didn't mention COBOL because I'm not familiar enough with its syntax beyond "lol".



Yeah, I'm not that familiar with it either, but I know that people who do know it are earning good money working exclusively on the mainframe platform...

And I mentioned only Java, C, SQL and assembly because they are the languages I have the most contact with in my day-to-day routine... But of course there are lots more; I'll never ignore them :wink:


Also, sorry for my bad English...


----------



## Rostam The Grey (Nov 7, 2007)

COBOL is only used because it's so ingrained. I think I read an article that said it would take 10,000 people 20 years to rewrite all the COBOL that exists. Seriously though, it sux. It is very limited. I can do everything I need to on the mainframe in Easytrieve, Rexx, and SyncSort. And most of the COBOL code out there is CRAP or Spaghetti code. Everything I ever worked with in COBOL I rewrote. But that's not a good example because I tend to rewrite everything I do whether it be ColdFusion, .Net, or anything else.


----------



## Pi (Nov 7, 2007)

Eevee said:

> Java, C, and assembly are not the only languages on the planet  D:
> 
> I didn't mention COBOL because I'm not familiar enough with its syntax beyond "lol".



COBOL's syntax is pretty ... unique. It's got lots of built-in features for making selection menus and formatting numbers and using fixed-point arithmetic. But it's so very verbose, or can be depending on who writes it. You can tell it was designed by/for bureaucrats.


----------



## Eevee (Nov 7, 2007)

Oh yes.  ColdFusion.  There's a grand example of beautiful language design.


----------



## Pi (Nov 7, 2007)

Eevee said:

> Oh yes.  ColdFusion.  There's a grand example of beautiful language design.



I dunno. It seems at least somewhat consistent (cf. PHP). It's just pretty gross to my eyes.


----------



## Rostam The Grey (Nov 7, 2007)

Eevee said:

> Oh yes.  ColdFusion.  There's a grand example of beautiful language design.



I find this comment full of lulz. ColdFusion is tag-based and as simple a language to use as is possible. Want to run a query? Open a CFQUERY tag, write your query, close the tag. Voilà, you have data. Want to loop through the data? Open a CFLOOP tag, load a few attributes, do some stuff in between, and close the tag. It doesn't get any simpler. I don't see any design issues here, unless you mean the fact that it's sooo easy to write that many "non-programmers" pound out crap code. But I haven't seen any of that; I've only heard the horror stories about CF. The COBOL, I've actually seen and worked with.


----------



## Eevee (Nov 8, 2007)

Shoehorning a programming language into a _document markup syntax_.
Making your program look, by design, as much like its own output as possible.
Extra arbitrary syntax for variable interpolation.
The occasional requirement to put tags within attributes of other tags.

I can't really see a _good_ design decision anywhere in CF.  Sure, it's easy; so is Logo.  Sure, it's simple; so is Iota.


----------



## Rostam The Grey (Nov 8, 2007)

Eevee said:

> Shoehorning a programming language into a _document markup syntax_.
> Making your program look, by design, as much like its own output as possible.
> Extra arbitrary syntax for variable interpolation.
> The occasional requirement to put tags within attributes of other tags.
> ...



I think the syntax was a brilliant idea. My web pages look nothing like the output. They are a LOT smaller. Wha? And I don't even think it allows you to put tags in the attributes of other tags? Where did you get that? If you mean putting CF tags in HTML attributes, then yes, you can do that. But you don't have to.


----------



## DragonTrew (Nov 9, 2007)

Rostam The Grey said:

> COBOL is only used because it's so ingrained. I think I read an article that said it would take 10,000 people 20 years to rewrite all the COBOL that exists. Seriously though, it sux. It is very limited. I can do everything I need to on the mainframe in Easytrieve, Rexx, and SyncSort. And most of the COBOL code out there is CRAP or Spaghetti code. Everything I ever worked with in COBOL I rewrote. But that's not a good example because I tend to rewrite everything I do whether it be ColdFusion, .Net, or anything else.



Yeah, I'm not saying it's by any means good... the thing is that it's ingrained, as you said. That's why some places still want people who work with it... And in the computer world there's a philosophy: "if the thing is working pretty well, don't even touch it." But in some cases maintenance is needed, and I don't think those people at banks are interested in rewriting all their code; a failure in one of those programs would mean losing millions of money (dollars for you, reais for me).

Never worked with CF though... :wink:


----------



## hawse (Nov 9, 2007)

DragonTrew said:

> Cobol people! You're missing Cobol!
> 
> Ok, Ok, it's not a desk/server programming language. But our goods MAINFRAMES still using it! Almost EVERY good money-based "industries" (like banks, stock exchange, and many others), will not forget about their mainframes that soon...
> 
> But I personally love to program in Java and C... Sql in on the go and assembler is "too much for my patience"...



OH! And don't forget object-oriented COBOL. It's called "ADD 1 TO COBOL GIVING COBOL".

Hawse


----------



## Eevee (Nov 9, 2007)

Rostam The Grey said:

> My web pages look nothing like the output.


The code is _deliberately designed_ to have exactly the same basic syntax as the data surrounding it.


----------



## Pi (Nov 9, 2007)

Eevee said:

> Rostam The Grey said:
> 
> 
> 
> ...



I dunno, that might be considered a DSL.


----------



## Rostam The Grey (Nov 9, 2007)

DragonTrew said:

> Yeah, I'm not saying it's by any means good... the thing is that it's ingrained, as you said. That's why some places still want people who work with it... And in the computer world there's a philosophy: "if the thing is working pretty well, don't even touch it." But in some cases maintenance is needed, and I don't think those people at banks are interested in rewriting all their code; a failure in one of those programs would mean losing millions of money (dollars for you, reais for me).
> 
> Never worked with CF though... :wink:



LOL, I basically work for a bank. At first they were very jumpy about me rewriting stuff. But now I've rewritten so much that they trust me.


----------



## DragonTrew (Nov 9, 2007)

Rostam The Grey said:

> LOL, I basically work for a bank. At first they were very jumpy about me rewriting stuff. But now I've rewritten so much that they trust me.



Hehehe, I see they put some pretty big confidence in your work!! And actually I have to agree that an upgrade is needed. I have friends who work directly with some IBM z9 mainframes, and they can tell you it's a lot of responsibility in their hands... It's good to know that we have people working to make things more reliable!


----------

