Forums  > Software  > go and rust  


Total Posts: 182
Joined: Mar 2011
Posted: 2018-09-13 00:57
At the risk of inciting a flame war, I would like to hear what NP'ers think about Rust and Go. Never have I heard so many people worth taking seriously say there is a realistic chance of moving away from C or C++.


Total Posts: 1134
Joined: Feb 2007
Posted: 2018-09-13 04:51
Mostly stale, but
I found Rust to be almost as awful as C++, and the benchmarks were terrible and contrived, as they were with Julia (i.e., for real-world code, multiply by 4 or worse).
Golang is fine for what most people use Java for.

Didn't see any reason to persist in either one of them, and generally wish I had invested more time in getting better at C (or fooling around in some C++17 dialect).

"Learning, n. The kind of ignorance distinguishing the studious."


Total Posts: 182
Joined: Mar 2011
Posted: 2018-09-14 08:09
jslade, what would you want to improve in your C writing?


Total Posts: 330
Joined: Jan 2015
Posted: 2018-09-14 11:25
Go was originally meant to be a C++ replacement, but everyone's pretty much concluded that it doesn't quite cut it as a systems language. It's something to use when you need something a little faster, closer to the metal, better at concurrency, or more prod-ready than Python, the JVM, or the CLR, but don't want to deal with the bullshit, mental complexity, verbosity, slow compile times, platform dependency, and shitty package management of C++.

Everything in the systems world comes down to memory management, at least at a conceptual level. Using a GC is a non-starter. Even if you make it optional, like D does, nearly every single library winds up having some sort of GC dependency. Rust's borrow checker is fundamentally the best approach, and is basically a direct application of modern PL theory.
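For anyone who hasn't touched Rust, a minimal sketch of the rules the borrow checker enforces (names here are just for illustration): any number of shared borrows may coexist, but a mutable borrow demands exclusive access, and the compiler verifies this statically.

```rust
// Takes a shared borrow: reads the data without taking ownership.
fn total(scores: &[i32]) -> i32 {
    scores.iter().sum()
}

fn main() {
    let mut scores = vec![10, 20, 30];

    // Shared borrows: any number of simultaneous readers is fine.
    assert_eq!(total(&scores), 60);

    // Mutable borrow: requires exclusive access. This compiles because no
    // shared borrow of `scores` is still live here (non-lexical lifetimes).
    scores.push(40);

    // What the checker rejects: holding `let first = &scores[0];` across
    // the push() above and using it afterwards is error E0502 --
    // "cannot borrow `scores` as mutable because it is also borrowed
    // as immutable". That one rule kills use-after-free and iterator
    // invalidation at compile time.
    assert_eq!(total(&scores), 100);
    println!("ok");
}
```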

That being said, Rust may die from the Haskell curse. If a system is beautifully architected but cognitively inaccessible to the majority of programmers, it'll never get widespread adoption. In particular, validating complex data structures with the borrow checker can feel like abstract algebra.
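For a taste of why aliased data structures fight the checker: a node with two owners can't be expressed with plain references, so the standard escape hatch is Rc<RefCell<...>>, which trades compile-time borrow checking for runtime checks (a minimal sketch; the `Shared` alias is just for this example):

```rust
use std::cell::RefCell;
use std::rc::Rc;

// A value that two owners both point at. Plain `&mut` can't express
// shared ownership, so we combine reference counting (Rc) with
// interior mutability (RefCell) -- borrow rules now checked at runtime.
type Shared<T> = Rc<RefCell<T>>;

fn shared<T>(value: T) -> Shared<T> {
    Rc::new(RefCell::new(value))
}

fn main() {
    let child = shared(vec![1, 2]);
    let parent_a = Rc::clone(&child);
    let parent_b = Rc::clone(&child);

    // Either alias can mutate, but overlapping borrow_mut() calls
    // would panic at runtime instead of failing to compile.
    parent_a.borrow_mut().push(3);
    parent_b.borrow_mut().push(4);

    assert_eq!(*child.borrow(), vec![1, 2, 3, 4]);
    println!("ok");
}
```

This is the "abstract algebra" feeling: a doubly linked list or parent-pointer tree that's five lines in C becomes an exercise in choosing among Rc, RefCell, Weak, and lifetimes.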

C++17 is nearly on par, with smart pointers and move semantics, but safety isn't enforced by the compiler. (And the preprocessor makes third-party static analysis toothless.) Without that, a large codebase is just going to naturally accrue undeclared unsafe sections, which kind of defeats the purpose of having the guarantees in the first place. And even if you do religiously follow best practices, Rust is still safer in certain regards, in particular with concurrency and uninitialized data.

The biggest juncture will be how well modules are designed and received in C++20. The fact that we're still using makefiles in 2018 is fucking insane. If modules actually work and get used, then we can finally get a decent package manager. It also means that compile times (which have been one of the biggest negatives of all the C++0x-era features) will finally start coming down.

Long-term, my hope is that if WG21 gets modules right, that'll build enough buy-in to deprecate the preprocessor entirely. With just an AST and no text expansion, enforcing and determining safety over large segments of code becomes much easier. Ultimately the preprocessor has been the most intractable design flaw from the original C++ spec. It's really at the root of all the evil we associate with the language.

Or maybe we'll all just end up using JavaScript for everything, right on down to the OS.

Good questions outrank easy answers. -Paul Samuelson


Total Posts: 5
Joined: Jul 2018
Posted: 2018-09-14 20:40
My team uses Rust in production, though we're still dominantly a C++ shop. There are maybe four things missing that prevent us from making the full migration:

1. Limited const value generics, or more generally, fully dependent pi types.
2. Limited CTFE or constexpr equivalent.
3. No variadic functions, or more generally, variadic generics.
4. Lack of SIMD facilities.
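For context on point 1: in 2018 Rust couldn't parameterize a type or function by a value, so anything generic over array lengths had to be macro-generated per length (typically up to 32). A minimal const generics were later stabilized in Rust 1.51; a sketch of the basic case that was missing at the time:

```rust
// Generic over an array *length*, not just an element type -- the
// "const value generics" the post asks for. Before const generics
// landed, a function like this needed a separate impl (usually
// macro-generated) for every supported length.
fn dot<const N: usize>(a: [f64; N], b: [f64; N]) -> f64 {
    let mut acc = 0.0;
    for i in 0..N {
        acc += a[i] * b[i];
    }
    acc
}

fn main() {
    // One definition serves every length; mismatched lengths
    // are a compile error, not a runtime check.
    assert_eq!(dot([1.0, 2.0], [3.0, 4.0]), 11.0);
    assert_eq!(dot([1.0; 5], [2.0; 5]), 10.0);
    println!("ok");
}
```

Fully dependent pi types (point 1's "more generally") remain out of scope even with const generics.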

I'm not familiar with Go, but the last I checked (2013?) it didn't cleanly support compile-time-safe parametric polymorphism, which was a no-go for us.


Total Posts: 17
Joined: Apr 2011
Posted: 2018-09-15 01:56
@EspressoLover Agree with you on the craziness of still using Makefiles; however, I'm not sure I follow your complaint about the preprocessor making static analysis toothless. Surely now with LLVM you can just compile your code to the IR and do all your static analysis on that?


Total Posts: 330
Joined: Jan 2015
Posted: 2018-09-16 00:13

That's a fair point, and the Clang team has done some really impressive stuff with clang-analyzer and clang-tidy. But the nature of C++ is that it's taken some really smart people and some Herculean effort to get something that still falls well short of what you get out of the box with the Rust compiler.

Clang's static analysis runs much slower than compilation itself, produces hard-to-decipher output, generates a lot of false positives, and still misses a lot of what should be obvious issues. This is largely because of the preprocessor and the way C++ implements templates (which are, philosophically, an extension of the preprocessor).

From a pure functional perspective, you can always abstract away the preprocessor. You don't even need to compile down to IR or an AST: you can just expand the text and SFINAE the templates. But practically, you now have something disconnected from the original source. Naively, you have to duplicate the analysis for every single #include directive, which makes things run unusably slowly, and you have to figure out which imported text represents "user code" and which represents "external code".

Now all of this shit can be mitigated by various clever tricks, and Clang pulls out all the stops. But at that point you're well beyond just treating the preprocessor as a black box. You're really in the weeds, routing around all the design mistakes of the preprocessor and templates; essentially you're hacking a module system under the hood. And even then things aren't so simple: e.g., you can #include the same file multiple times, with a completely different effect depending on what's #define'd at directive time.

Good questions outrank easy answers. -Paul Samuelson