Questions tagged [scala]
Scala is a general-purpose programming language principally targeting the Java Virtual Machine. Designed to express common programming patterns in a concise, elegant, and type-safe way, it fuses both imperative and functional programming styles. Its key features are: an advanced static type system with type inference; function types; pattern-matching; implicit parameters and conversions; operator overloading; full interoperability with Java; concurrency
scala · 112,581 questions
0 votes · 0 answers · 16 views
Scala 3: Nonexhaustive match for Vector, but not for Seq
Using Scala 3.4.2:
Vector(1, 2, 3) match
  case Vector() => println("empty")
  case _ :+ last => println(s"last: $last")
gives me an (in my opinion incorrect) ...
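A minimal sketch of one workaround (helper name is hypothetical): the `:+` extractor is not a sealed constructor the compiler can reason about, so an explicit catch-all case satisfies the exhaustivity checker.

```scala
object VectorMatchSketch {
  // Hypothetical helper: the wildcard default makes the match
  // trivially exhaustive, which silences the warning.
  def describe(v: Vector[Int]): String = v match {
    case _ :+ last => s"last: $last"
    case _         => "empty"
  }
}
```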
0 votes · 0 answers · 16 views
Hot Reloading Scala.js
How can I hot reload a Scala.js application during development?
I am manually reloading the Scala.js application during development. However, I need the ability to reload and immediately reflect the ...
0 votes · 0 answers · 31 views
Get compiler to notice exhaustive match on Vector
The following code defines an exhaustive match over a Vector, v.
def testExhaustiveness(v: Vector[Int]) = {
  v match {
    case Vector() => println("v is empty")
    case ns :+ n => ...
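One form the compiler can actually verify, sketched under the assumption that converting to `List` is acceptable: `Nil` and `::` are sealed case classes, so the exhaustivity checker can prove the match complete.

```scala
object ExhaustiveSketch {
  // Hypothetical variant: reverse the Vector and match on List's
  // sealed constructors, which the exhaustivity checker understands.
  def lastOf(v: Vector[Int]): String = v.toList.reverse match {
    case Nil       => "v is empty"
    case last :: _ => s"last: $last"
  }
}
```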
0 votes · 1 answer · 25 views
How to create a DataFrame on RocksDB (SST files)
We hold our documents in RocksDB and will be syncing its SST files to S3. I would like to create a DataFrame on the SST files and later run SQL over it. When I googled, I was not able to find any ...
0 votes · 0 answers · 26 views
How to force pull in fs2.Stream
I created an fs2.Stream with:
topic <- Stream.eval(fs2.concurrent.Topic[F, Either[Throwable, AmqpMessage[Array[Byte]]]])
messages <- Stream.resource(topic.subscribeAwait(innerQueueSize))
...
1 vote · 0 answers · 57 views · +500 bounty
Test/compile not finding lib/jar with Scala 3 (but 2.13 works)
I'm packaging jars from generated source code with an sbt-plugin and the jars are fine and on the classpath.
Now I have two identical minimal projects targeting Scala 2.13 and 3.3. With 2.13 I can run ...
0 votes · 1 answer · 33 views
Scala 3 binary compatibility issue
As I am helping my organization adopt Scala 3, I am trying to map out the potential challenges to watch for.
Generally, I was reading about binary compatibility changes with Scala 3 and ...
1 vote · 1 answer · 39 views
Scala Future strange behavior
What's wrong with this code? Why can I see only one output? What does the futureUserByName function return? onComplete doesn't work for it either. It should be just a simple Future, but it doesn't work.
import ...
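A common cause of seeing only one output is that the JVM exits before the Future's callback runs. A minimal sketch, where `futureUserByName` is a hypothetical stand-in for the question's function:

```scala
import scala.concurrent.{Await, Future}
import scala.concurrent.ExecutionContext.Implicits.global
import scala.concurrent.duration._
import scala.util.{Failure, Success}

object FutureSketch {
  // Hypothetical stand-in for the question's futureUserByName.
  def futureUserByName(name: String): Future[String] =
    Future(s"user:$name")

  def main(args: Array[String]): Unit = {
    val f = futureUserByName("alice")
    f.onComplete {
      case Success(u)  => println(s"found $u")
      case Failure(ex) => println(s"failed: ${ex.getMessage}")
    }
    // Without some form of waiting, main can return before the
    // callback fires and the second line of output never appears.
    Await.ready(f, 5.seconds)
  }
}
```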
0 votes · 0 answers · 20 views
Flattening nested JSON with backslashes in an Apache Spark Scala DataFrame
{
"messageBody": "{\"task\":{\"taskId\":\"c6d9fb0e-42ba-4a3e-bd39-f2a32a6958c1\",\"serializedTaskData\":\"{\\\"clientId\\\":\\\&...
0 votes · 0 answers · 28 views
Jetty 12 with Scala "Unable to locate class corresponding to inner class entry for Scope"
After upgrading jetty-server from 11 to 12, the compilation (using mill as the build tool) fails with:
Unable to locate class corresponding to inner class entry for Scope in owner org.eclipse.jetty....
1 vote · 1 answer · 56 views
Scala 3 macros: "dynamically" instantiating singleton objects at compile-time
I'm trying to create a macro that makes use of some objects.
Suppose I have the following definitions:
trait Foo:
  def doStuff(): Unit

// in other files
object Bar extends Foo:
  def doStuff() = ......
0 votes · 0 answers · 74 views
Importing libraries with same names
I have two libraries that both use the same names for importing, but one has more functionality than the other. How do I make Scala choose the right library when importing? Currently, it’s ...
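One option, sketched with hypothetical stand-in libraries: a renaming import gives one of the clashing names a local alias, so both can be used unambiguously in the same scope.

```scala
// Hypothetical stand-ins for the two clashing libraries.
object LibA { object Util { def version: String = "A" } }
object LibB { object Util { def version: String = "B" } }

object ImportRenameSketch {
  import LibA.Util            // unqualified Util now means LibA's
  import LibB.{Util => UtilB} // LibB's Util renamed to avoid the clash

  def which: (String, String) = (Util.version, UtilB.version)
}
```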
0 votes · 0 answers · 25 views
Comparing impact of Scala PlayFramework's db.withConnection and db.withTransaction methods on the underlying postgres database
Given that there is just one simple SQL statement (example: select id from foo) that I need to execute on my Postgres 15 database, I am curious to know the performance impact of executing it using ...
0 votes · 0 answers · 24 views
Spark: Read special characters from the content of a .dat file without corrupting it in Scala
I have to read all the special characters in a .dat file (e.g. testdata.dat) without corrupting them, and load the contents into a DataFrame in Scala using Spark.
I have one dat file (eg - testdata.dat),...
0 votes · 1 answer · 29 views
Use Jedis `echo` in pipeline
The examples use Scala code, but the issue would be the same with Java.
Way back in version 2 of Jedis, you could use echo in a pipeline:
import redis.clients.jedis._
object Main {
  def main(args: ...