7ML7W Julia Day 2: Getting Assimilated
Where is everyone?
Here they are!
We started talking about the fact that it is summer and that the book might not be living up to expectations. Is this book as good as the original 7 langs in 7 weeks? We wonder...
The metaphor for this language is "The Borg" - so this is why we're getting assimilated etc.
We're onto control flow in Day 2. Is there a reason we couldn't do this in day 1?
The Julia designers went for elseif rather than elsif. They also didn't go for 'else if', making it less C-like?
There is no truthy/falsy like we have in other languages, such as Ruby. Test conditions must be a Bool.
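For example (a quick sketch; the try/catch is just there to show the error without aborting):

```julia
if true
  println("the condition is a Bool, so this is fine")
end

# `if 1` raises a TypeError rather than treating 1 as truthy:
caught = try
  if 1
    "never reached"
  end
catch e
  e   # TypeError: non-boolean (Int64) used in boolean context
end
```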
We think that apparent coercion when adding a float to an array of ints is maybe being done by the implementation of the + function rather than actual type coercion.
[Static typing detour]
for a = 1:10 ...
is this equals here uncommon? It's a bit like Go:
for a := range array {...
Equals and in appear to be interchangeable
for a = 1:10 ...
for a in 1:10 ...
It seems this is only for for loops? e.g. the following doesn't work:
1 = 1
=> error
1 in 1
=> true
Computer Scientist Chris Patuzzo: "it's interesting but it's just syntax". Moving on...
Equals
Julia-lang: "make types great again!"
The default type is Any. So here name is an Any type.
type MovieCharacter
  heart :: Bool
  name
end
Detour into Any vs DataType: which is the top type? ... Any seems to be the actual top.
It's not possible to subtype a concrete type; it appears you can only subtype abstract types.
Tom: (paraphrasing) "In the tree of types, only the leaves are concrete types, everything else is an abstract type."
(We can only instantiate concrete types.)
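A minimal sketch of that tree rule, in modern Julia syntax (the book's `type`/`abstract` keywords later became `struct`/`abstract type`; `Shape` and `Circle` are made-up names):

```julia
abstract type Shape end      # abstract: can be subtyped, can't be instantiated

struct Circle <: Shape       # concrete leaf: can be instantiated...
  radius :: Float64
end

c = Circle(1.0)              # fine: Circle is concrete
# Shape()                    # error: abstract types can't be instantiated
# struct Dot <: Circle end   # error: ...and concrete types can't be subtyped
```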
function foo(bar)
  println("$bar")
end
Arguments are Any by default. You can specify types if you'd like:
function foo(bar :: Int64)
  println("$bar")
end
Julia has multiple dispatch. It chooses from implementations of different functions based on their type signatures, not just the type of the first argument (which is often the object/thing you're calling it on).
So rather than just looking at the receiver to find the function, it looks up functions based on all the arguments.
Which is chosen?
function with_defaults(a, b :: Int64)
  print(1)
end
function with_defaults(a :: Int64, b)
  print(2)
end
=> Error, method with_defaults is ambiguous (blah blah blah)
This is apparently useful for scientific computing, where you might want to have lots of different implementations for different types of numbers, for example.
'Method' seems used to mean the specific implementation of the function being used. So one function has many methods?
[short pause for Apple TV and coffee machine cleaning routine]
function foo(a) end
=> foo (generic function with 1 method)
function foo(a, b) end
=> foo (generic function with 2 methods)
function foo(a, b, c) end
=> foo (generic function with 3 methods)
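The same counting works with types: each typed definition below adds another method to the one generic function `describe` (a name we've made up), and Julia picks a method by argument type:

```julia
describe(n :: Int64)   = "an integer"
describe(x :: Float64) = "a float"
describe(x)            = "something else"   # argument type defaults to Any

describe(1)       # => "an integer"
describe(1.0)     # => "a float"
describe("hi")    # => "something else"
length(methods(describe))   # => 3: one function, three methods
```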
In summary, in Julia you can create many definitions with different types and Julia will pick the right one.
How similar is this to Elixir's pattern matching? After looking it up, it seems Elixir also has multiple dispatch. Interestingly, the book doesn't compare this to Elixir.
[detour into multiple dispatch support in different languages]
Murray: "It seems like multiple dispatch is a good way of doing things if we have types but don't have objects."
Tom: (paraphrasing again) "It allows you to define functions in a way comparable to how you might write mathematical notation, without the object-oriented overhead."
Give Julia extra workers using the -p argument followed by the level of concurrency desired.
julia -p 8
It seems that Julia creates the workers as forked child processes. They send messages, so presumably some shared memory exists...
To get a worker to do a thing, you first get a handle to the remote result:
r1 = remotecall(2, rand, 100)
=> RemoteRef(2,1,7)
This tells worker 2 to call rand with the arg 100.
To get the result, which is blocking:
rand_list = fetch(r1)
=> 100-element Array{Float64,1}: ...
This is a bit micromanagement-y. There is a @parallel macro to distribute work across all workers (it doesn't create new ones):
function pflip_coins(times)
  @parallel (+) for i = 1:times
    int(randbool())
  end
end
In comparison, the imperative version:
function flip_coins(times)
  count = 0
  for i = 1:times
    count += int(randbool())
  end
  count
end
[detour into Scottish maths teaching and what that might look like]
The jury is out as to whether or not the parallel version is more readable but it seems simple enough.
[detour into the quality of Ready Player One (movie and audio book)]
pmap is a function that takes another function and a list of arguments, then farms that work out to lots of different workers. Pretty easy.
hist = pmap((len -> repeat("*", int(len))), bars)
Murray: (paraphrasing) "I like the interviews. I'd read a book of interviews with programming language creators"
The takeaway is that Julia's developers can break things, or at least aren't afraid to, in comparison to commercial languages, which have a far more cautious attitude to design changes.
Chris: (paraphrasing) 'it makes things more confusing to have types as values'
[Discussion about how usually types operate on a different level/plane/universe to 'normal' values.]
[short detour into Mersenne Twister and its quality.]
@everywhere runs a command on all processes. Useful to include a library.
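A small sketch (in modern Julia, `@everywhere` and friends live in the Distributed stdlib; `double` is just a made-up helper):

```julia
using Distributed
addprocs(2)                  # same effect as starting with `julia -p 2`

@everywhere using Dates      # load a library on every process
@everywhere double(x) = 2x   # define a helper on every process

remotecall_fetch(double, 2, 21)   # => 42: worker 2 can see the definition
```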
[detour into bricks, wifi and 'fast air']
@spawn makes the remote call interface easier. It creates a process to run the code and takes an expression that it presumably deconstructs and turns into functions with arguments like we saw earlier.
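For instance (modern Julia spells the book's `@spawn` as `@spawnat :any` and returns a Future rather than a RemoteRef, but the shape is the same):

```julia
using Distributed
addprocs(1)

# The book's `r = @spawn rand(100)`: the macro picks a worker for us,
# so we don't have to name one the way remotecall did.
r = @spawnat :any rand(100)
xs = fetch(r)        # block until the 100-element array arrives
length(xs)           # => 100
```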
[Technical Difficulties]
All: Yay! (loudly)
What are Julia macros? Do they operate on the AST? If not, are they even macros?
[some time spent looking into the mechanics of @spawn that I don't really know how to write up nicely]
Murray: "I think it's unknowable" Chris: "Let's just forget about it"
Elena: "Shall we do a for" (that goes backward(s))
Turns out we can get a reverse range like this:
for i in 10:-1:1
  println(i)
end
Tom: "That is literally the best thing we have done, ever."
Negative step
Good:
- Good snax, we like red hummus. Lots of bread.
- We learned about multiple dispatch
- We've got two left - good thing?
[Idris Elba detour]
Tom: "Doesn't everyone die in Infinity War?"
Elena: "Tower bridge has got Jeff Goldblum in it"
Less Good:
- Murray, not enough Jeff Goldblum
- Not as much meat/interest in the chapter. We're needing to look pretty hard.
- Does it all happen in day three?
- Could we compare Julia to R? Would that have been more interesting?
R seems to be for data science and data viz. Julia seems to be more for mathematical work, but is also a faster alternative.
Improve on:
- Do we want to do the final Julia Day?
- Ask the question about finishing the book after the Julia chapters.
- Shame we didn't choose the order that we go through the languages.
- Multiple dispatch seemed pretty uninteresting after just covering Elixir
- Perhaps do the final two languages in the order of our choice.
The book is really just starters. After TAPL, we've had more dipping in and out, which was the intended effect (mostly it seems to be dipping out). Are we still getting the same value out of the book?
Perhaps we've swapped deeper content for dip-in-ability.
Can we perhaps do a language per meeting for the final two languages?
Thank you Unboxed