The foreword posed this question, and by and large I agree with the author's point, but I'll add one more. I accuse Haskell of ignoring physics. E.g. what is the type for "all my data is stored in one building, so if the network goes down my customers can't get their information"? Is this the leaky abstraction problem?
I would say that example is actually a great fit for Haskell - I could define
where `BackedUp` is an opaque datatype that users can't create themselves, and provide some which force users to only manipulate my value through an interface that handles backing up automatically
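A minimal sketch of that idea - the names `BackedUp`, `mkBackedUp`, and `modifyBackedUp` are illustrative (not from the discussion), and "backing up" is modeled as an in-memory history just to show the opaque-interface pattern:

```haskell
-- Sketch of the "opaque datatype" idea: in a real library, only the
-- type and the functions below would be exported, not the constructor,
-- so users can't build or unwrap a BackedUp value directly.
data BackedUp a = BackedUp { current :: a, history :: [a] }

-- Creating a value starts its backup history.
mkBackedUp :: a -> BackedUp a
mkBackedUp x = BackedUp x []

-- Every modification automatically records the previous value,
-- standing in for "handles backing up automatically".
modifyBackedUp :: (a -> a) -> BackedUp a -> BackedUp a
modifyBackedUp f (BackedUp x past) = BackedUp (f x) (x : past)

main :: IO ()
main = do
  let v = modifyBackedUp (+ 1) (mkBackedUp (41 :: Int))
  print (current v, history v)
```

Since the constructor stays hidden behind the module boundary, every value a user can ever observe has gone through the backing-up interface.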
My friend said the problem is that Haskellers write programs backwards
I tried to point out to him that he could just define and use
but he wasn't having it
I guess convention still applies though - but I wonder whether he finds
better for some reason :stuck_out_tongue:
Right, but a non-functional programmer would do, like
or something
maybe
idk
So the answer is to "avoid" the problem :big_smile:
When writing in languages with poor support for function composition, I find myself constantly coming up with terrible variable names for intermediate values that I only care about as a means to an end
Right haha
(BTW, there's
in
Data.Function
)
yeah, I'm familiar, I like Flow
I once had an algebra instructor who wished that functions were written to the right of their arguments.
So (x)f would mean plug x into f.
And, in that world Haskell would probably look like
arg3 arg2 arg1 func
What's doubly interesting is how close this looks to Forth code.
Just for consistency of operator names
(|>) (<|) (.>) (<.)
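These four spellings match the operators from the Flow package; here's a self-contained sketch of their definitions (the semantics follow Flow, but the fixities below are my assumptions, not taken from the library):

```haskell
-- Sketch of the four "consistent" pipeline operators
-- (semantics as in the Flow package; fixities are assumptions):
infixl 1 |>
(|>) :: a -> (a -> b) -> b
x |> f = f x              -- forward application

infixr 1 <|
(<|) :: (a -> b) -> a -> b
f <| x = f x              -- backward application, like ($)

infixl 9 .>
(.>) :: (a -> b) -> (b -> c) -> a -> c
f .> g = g . f            -- forward composition

infixr 9 <.
(<.) :: (b -> c) -> (a -> b) -> a -> c
g <. f = g . f            -- backward composition, like (.)

main :: IO ()
main = do
  print ([1, 2, 3 :: Int] |> map (* 2) |> sum)  -- apply left to right
  print ((map (* 2) .> sum) [1, 2, 3 :: Int])   -- compose left to right
```

The nice part is that the arrowhead always points from data toward function, whichever direction you read in.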
Right, the concatenative paradigm
Or Function-level programming
I'm sympathetic to left to right. I like
The problem is that these kinds of pipelines are nice until you need multiple arguments. Then in concatenative langs you're `dup`ing and `swap`ing and whatnot all the time
you can do stuff like
But I hate that
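In Haskell terms, the concatenative `dup`/`swap` shuffling corresponds to point-free plumbing combinators - a sketch of the analogy (the names `dup` and `swapArgs` are illustrative):

```haskell
-- dup: feed one value to both argument slots,
-- e.g. dup (*) is \x -> x * x, point-free.
dup :: (a -> a -> b) -> a -> b
dup f x = f x x

-- swap: reorder two arguments, which is just flip.
swapArgs :: (a -> b -> c) -> b -> a -> c
swapArgs = flip

main :: IO ()
main = do
  print (dup (*) (7 :: Int))           -- 7 * 7
  print (swapArgs (-) (2 :: Int) 10)   -- 10 - 2
```

Once several values are in flight, chains of these combinators get as hard to read as a stack full of `dup`s and `swap`s.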
And thus we come full circle and arrive to Arrow notation :joy:
(JK, this is actually meant for composing more restricted things than functions)
TheMatten said:
My example is still application-based (instead of concatenative), it's just that functions are to the right of their arguments and function application is right-associative (just like types are)
I've tried to learn arrows a few times, but I haven't been able to grok it yet
If your native language reads left to right then it's a bridge you have to cross. (to get to anything right to left)
If your group is heterogeneous it's unlikely the operations will be performed uniformly over all the elements. What you want at that point is a way to easily query/separate/isolate the groups.
I believe in Haskell this is a lens? but `filter` is a lightweight version of this.
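`Data.List` already covers the lightweight version: `filter` isolates one group, and `partition` splits the collection so each part can get its own uniform pipeline. A small sketch of the separate-then-transform idea:

```haskell
import Data.List (partition)

main :: IO ()
main = do
  -- Separate the group first, then run a uniform pipeline over each
  -- part, instead of one action that handles every case at once.
  let (evens, odds) = partition even [1 .. 10 :: Int]
  print (map (* 2) evens)
  print (map negate odds)
```

Each branch of the result is homogeneous in how it will be treated, so each downstream pipeline stays simple.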
(@Daniel Bramucci I meant to answer to @James Sully - sorry if it wasn't clear)
@Daniel Bramucci ah ok. I'm not familiar enough with concatenative to understand the operational difference
Oh wait no, I see
nevermind
You mean in relation to function composition?
As an interesting side note:
In today's Haskell, function types are right-associative but function application is left-associative
In the postfix-function world, we would have both be right-associative
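That difference can be sketched in today's Haskell with a right-associative postfix application operator - essentially `&` from `Data.Function` with flipped associativity (the name `~>` is purely illustrative):

```haskell
-- Hypothetical postfix application: like (&) from Data.Function,
-- but declared infixr instead of infixl, so arguments pile up
-- to the LEFT of the function, as in "arg2 arg1 func".
infixr 0 ~>
(~>) :: a -> (a -> b) -> b
x ~> f = f x

main :: IO ()
main =
  -- Parses as 10 ~> (2 ~> subtract), i.e. subtract 2 10.
  print (10 ~> 2 ~> subtract :: Int)
```

With left-associative `&` the same chain would instead parse as `(10 & 2) & subtract`, which doesn't even typecheck - the associativity is what makes the multi-argument postfix reading work.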
I mean, it's conceptually easier to isolate out a group and then perform an action. Rather than to have an action that handles every group.
Because if someone asks you what that action does, you're like "um, everything"
Sorry, I'm not sure which part of discussion this applies to :sweat_smile:
@James Sully Just in case
Haskell assumes by default that
Function-level programming
gives you function composition by default
And concatenative programming does something like function composition, but it's not exactly the same.
@TheMatten It relates back to this comment.
The problem is that these kinds of pipelines are nice until you need multiple arguments.
Pipelines are nice because they imply you don't have multiple types in the collection you're operating over. As in, everything will be treated the same way. It's easier on us humans to separate things into different groups, then perform a pipeline of transformations on them. Rather than having one big reduction over the set and an action that does all the work.
This is why people like to separate GET TRANSFORM and SET.
Right, theoretically you ought to be able to write any non-branching program as a pipeline if you do enough tupling and untupling. It's just extremely unwieldy
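As a sketch of those tupling/untupling contortions: a computation that needs two intermediate values, forced into a single pipeline by carrying a pair along (using `&&&` from `Control.Arrow` on plain functions, and `&` from `Data.Function` as the pipe):

```haskell
import Control.Arrow ((&&&))
import Data.Function ((&))

-- A non-branching program written as one pipeline: tuple up the two
-- intermediate values, thread the pair along, untuple at the end.
average :: [Double] -> Double
average xs =
  xs
    & (sum &&& (fromIntegral . length))  -- tupling: (total, count)
    & uncurry (/)                        -- untupling: total / count

main :: IO ()
main = print (average [1, 2, 3, 4])
```

It works, but even two values in flight already need fan-out and `uncurry` plumbing - exactly the unwieldiness described above.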
Yep. Which is why I'm such a fan of Datomic. As it untuples with a minimal set of instructions all the way from a persistent store.
But I might need to make another channel for that topic.
Yeah, absolutely - I'm not saying that pipelines are good as a general style of programming - just that they work well for single value transformations, which are pretty common