Questions tagged [scala]
Scala is a general-purpose programming language principally targeting the Java Virtual Machine. Designed to express common programming patterns in a concise, elegant, and type-safe way, it fuses both imperative and functional programming styles. Its key features are: an advanced static type system with type inference; function types; pattern-matching; implicit parameters and conversions; operator overloading; full interoperability with Java; concurrency
scala
112,614
questions
0
votes
0
answers
20
views
Input that .repeat() takes in Gatling simulation
I have a scenario where, for an array's size, I need to repeat some code piece in Gatling.
While .repeat() works with an integer provided for items.size.toInt, it does not honour the provided value and repeat ...
0
votes
1
answer
34
views
Spark-Scala vs PySpark: why is the DAG different?
I am converting a PySpark job to Scala; the jobs execute in EMR. The parameters, data, and code are the same. However, I see that the run time is different, and so is the DAG that gets created. Here I ...
0
votes
1
answer
46
views
Output is not shown as Range(1, 2, 3, 4) for val from1Until5 = 1 until 5; println(s"Range from 1 until 5 where 5 is excluded = $from1Until5")
I am executing
println("\nStep 2: Create a numeric range from 1 to 5 but excluding the last integer number 5")
val from1Until5 = 1 until 5
println(s"Range from 1 until 5 where 5 is ...
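For what it's worth, the asker's range does contain 1 through 4; since Scala 2.13, only Range's toString shows the bounds rather than the elements. A minimal sketch:

```scala
// In Scala 2.13+, a Range prints its bounds ("Range 1 until 5"), not
// its elements; the values 1, 2, 3, 4 are still there.
val from1Until5 = 1 until 5
println(s"Range from 1 until 5 where 5 is excluded = $from1Until5")

// Materialise the elements to see them:
println(from1Until5.toList)   // List(1, 2, 3, 4)
```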
0
votes
1
answer
31
views
nested case class generic filter method
I have some nested case classes that may look like the following:
case class Lowest(intVal: Int, stringVal: String)
case class Mid (lowestSeq: Seq[Lowest])
case class High (midSeq: Seq[Mid])
So if ...
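The excerpt is truncated, so the intended filter is an assumption; one plausible reading is "keep only the Lowest values that satisfy a predicate, pruning Mids that end up empty", which could be sketched like this:

```scala
case class Lowest(intVal: Int, stringVal: String)
case class Mid(lowestSeq: Seq[Lowest])
case class High(midSeq: Seq[Mid])

// Hypothetical filter: keep Lowest values matching the predicate and
// drop any Mid whose sequence becomes empty (an assumed goal, since
// the question is cut off):
def filterHigh(h: High)(p: Lowest => Boolean): High =
  High(
    h.midSeq
      .map(m => Mid(m.lowestSeq.filter(p)))
      .filter(_.lowestSeq.nonEmpty)
  )

val h = High(Seq(Mid(Seq(Lowest(1, "a"), Lowest(5, "b"))), Mid(Seq(Lowest(2, "c")))))
val kept = filterHigh(h)(_.intVal > 2)
// Only Lowest(5, "b") survives; the second Mid is pruned entirely.
```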
1
vote
0
answers
62
views
How to address the performance impact of executing multiple functions dynamically compiled with Scala 2's runtime reflection?
I have an entity that generates Scala functions compiled dynamically using Scala 2's runtime reflection. My case is slightly more convoluted, but the following showcases the issue:
import scala....
0
votes
1
answer
34
views
Create Flow[ByteString, ByteString, NotUsed] by piping through InputStream
I need to decompress data using a compression format not supported by Akka, but supported by another library that provides an InputStream interface.
To make it work with Akka Streams, I need to implement this function:
def ...
-1
votes
0
answers
19
views
How to run MySQL scripts when I run docker compose? [duplicate]
I have a Scala backend application, a JavaScript frontend application, and a MySQL database. I want to create Docker images for each of them with one docker-compose.yml file, and I want to run the scripts for ...
1
vote
0
answers
34
views
Scala 3 reflection: collect all members that are singleton objects
I have a method in Scala 2 that uses reflection. It doesn't work in Scala 3, and I would like to reimplement it so it works for Scala 3. (I will no longer need it for Scala 2, so it need not ...
1
vote
0
answers
29
views
Chisel Template Not Functioning Windows 11 - Cannot run program "which": CreateProcess error=2, The system cannot find the file specified
After having cloned the template project for Chisel (https://github.com/chipsalliance/chisel-template) I tried running sbt test, and I got the following error in the file /src/test/scala/gcd/GCDSpec:
&...
1
vote
1
answer
49
views
multiple pattern matching on iteration
val subjectpairs = IndexedSeq((8,0),(3,4),(0,9),(6,1))
val priors = subjectpairs.map { case (f, _) => f }.filter { _ > 0.0 }
val posts = subjectpairs.map { case (_, l) => l }.filter { _ > 0.0 }
...
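The map-then-filter pairs in this excerpt can be fused into a single pattern match with collect; a sketch using the question's own data:

```scala
val subjectpairs = IndexedSeq((8, 0), (3, 4), (0, 9), (6, 1))

// collect combines the extraction (case pattern) and the filter
// (pattern guard) in one pass:
val priors = subjectpairs.collect { case (f, _) if f > 0 => f }
val posts  = subjectpairs.collect { case (_, l) if l > 0 => l }
```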
1
vote
0
answers
19
views
Encrypt Spark Libsvm Dataframe
I have a libsvm file that I want to load into Spark and then encrypt it. I want to iterate over every element in the features to apply my encrypt function, but there doesn't seem to be any way to ...
0
votes
1
answer
17
views
Adding new Rows to Spark Partition while using forEachPartition
I am trying to add a new Row to each Partition in my Spark Job. I am using the following code to achieve this:
StructType rowType = new StructType();
rowType.add(DataTypes.createStructField("...
0
votes
0
answers
25
views
IntelliJ Scala warning with parameterized type in compiled JAR
I have Scala code with parameterized types like this:
val foo: SomeParameterizedType[_ <: Product with Serializable] =
new SomeParameterizedType[Foo](arguments)
val bar: SomeParameterizedType[_ &...
1
vote
0
answers
30
views
Unable to connect to MongoDB replicaset from scala mongodb driver
I am running mongodb replicaset in docker on localhost and I am able to connect to it from both command line and Mongodb Compass. When I try to connect from code I get Connection Refused. I am unable ...
0
votes
0
answers
28
views
Scala can't find imported file from same directory
I'm fairly new to Scala and am having trouble compiling my code. I have a project structure like this:
topdir/
  com/
    name/
      final/
        file1.scala
        file2.scala
Both files are in the same package ...
0
votes
0
answers
35
views
IntelliJ Scala 2.13.0 no longer compiles - Error compiling the sbt component 'compiler-bridge-2.13.0-66.0'
I am working on a hobby project on my desktop using Scala 2.13.0 and JDK 22. The project compiles and runs perfectly fine there. This project is also on my personal GitHub. Because I went on holiday ...
0
votes
0
answers
21
views
Scala Spark Dataframe creation from Seq of tuples doesn't work in Scala 3, but does in Scala 2
When trying to test something locally with Scala Spark, I noticed the following problem and was wondering what causes it, and whether there exists a workaround.
Consider the following build ...
-1
votes
0
answers
42
views
Using spark 3.4.1 lib in Java when extending StringRegexExpression to a java class
I am using Spark 3.4.1 in a Maven project where I have also configured Scala (2.13.8). I am trying to create a class Like.java in the project by extending Spark's StringRegexExpression
package com....
0
votes
0
answers
40
views
Scala project: sbt compile generates empty fs2-grpcs folder, missing proto-generated classes
I'm working on a Scala project that currently implements a simple REST API. I'm trying to add a new gRPC service to this project. However, I'm encountering an issue during compilation.
Issue:
When I ...
1
vote
1
answer
32
views
How to resolve: java.lang.NoSuchMethodError: 'scala.collection.Seq org.apache.spark.sql.types.StructType.toAttributes()'
Running a simple ETL PySpark job on Dataproc 2.2 with the job property spark.jars.packages set to io.delta:delta-core_2.12:2.4.0. Other settings are left at their defaults. I have the following config:
conf = (
...
1
vote
1
answer
53
views
Scala Fs2: Aggregate computation on infinite streams
I can't seem to understand how to perform aggregate computations on infinite streams. Taking an infinite stream of elements and performing a computation on each one individually is easy, but collecting ...
0
votes
1
answer
66
views
Azure Synapse Analytics - output not displayed
The following Scala code in Azure Synapse should print: Hello, World!. But instead, it prints: defined object Geeks.
Question: What could be the issue and how can we fix it?
object Geeks {
// Main ...
1
vote
1
answer
39
views
Scala failed to install on Windows 10
I used this site to install Scala on the latest version of Windows 10 Pro. The installer started and successfully completed the following steps (shown at the end below). At the end, the command window asked: ...
2
votes
1
answer
43
views
Scala f interpolator formatting for Java LocalDate
I'd like to be able to format strings based on java.time.LocalDate using the f interpolator, i.e. something like:
val dt = LocalDate.of(2024, 7, 28)
f"$dt:YYYY-MM-dd"
But this will give a compile ...
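A hedged sketch of two alternatives that do work (the variable name dt is taken from the question; whether the f interpolator's compile-time check accepts java.time values directly varies by Scala version, so this sidesteps the %t conversion in the interpolator):

```scala
import java.time.LocalDate
import java.time.format.DateTimeFormatter

val dt = LocalDate.of(2024, 7, 28)

// Safest route: format via DateTimeFormatter and interpolate with s"...":
val fmt = DateTimeFormatter.ofPattern("yyyy-MM-dd")
val iso = s"${dt.format(fmt)}"          // "2024-07-28"

// java.util.Formatter itself accepts java.time values at runtime, so
// String.format's %tF (ISO-8601 date) also works:
val viaFormat = String.format("%tF", dt) // "2024-07-28"
```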
1
vote
0
answers
38
views
Casting to a Member Type in Scala
I'm using the Apache POI library 5.2.3 in Scala to create a line chart in a PowerPoint presentation.
// createData returns XDDFChartData
var chartData = chart.createData(
ChartTypes.LINE,
...
0
votes
0
answers
24
views
Scalatest 3.0.8 PrivateMethodTester can't find private method named X
I have a base class that has a protected method called "validateOptPwd". This gets inherited in a test and then called via PrivateMethodTester.
val validateOptPwdMethod: PrivateMethod[Future[...
1
vote
1
answer
36
views
uPickle ReadWriter for joda DateTime fails
For a project, I need a JSON representation for an org.joda.time.DateTime. For JSON, I use uPickle. As there is no implicit ReadWriter[DateTime], I have to write it myself. After googling around, I ...
-1
votes
0
answers
41
views
How can I find if the current fiber is running in the primary or blocking thread pool [duplicate]
My app runs thousands of small tasks in parallel using foreachPar(){}.withParallelism(85). The app runs on a machine with 100 CPUs. Some tasks finish in a few milliseconds, while others finish in ...
1
vote
1
answer
27
views
Can I use same SparkSession in different threads
In my Spark app I use many temp views to read datasets and then use them in a huge SQL expression, like this:
for (view <- cfg.views)
  spark.read.format(view.format).load(view.path).createTempView(view....
-1
votes
2
answers
47
views
Need help to convert each element of a list to dictionary
I have a list in Scala, where the structure of the list is:
['A:abc', 'B:hgfff', 'C:khfas', ...] — it's a big list.
I want to convert it to a dictionary like:
d = {'A': 'abc', 'B': 'hgfff', 'C': 'khfas'}
Kindly ...
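Scala has no dict; an immutable Map is the equivalent. A minimal sketch using the sample values from the question, splitting each string on its first ':' only so values containing ':' stay intact:

```scala
val xs = List("A:abc", "B:hgfff", "C:khfas")

// Split each element at the first ':' into a key -> value pair,
// then collect the pairs into a Map (Scala's "dictionary"):
val d: Map[String, String] =
  xs.map { s =>
    val i = s.indexOf(':')
    s.take(i) -> s.drop(i + 1)
  }.toMap
```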
1
vote
0
answers
27
views
Spark Scala transformations
I have a Spark input dataframe like below:
Emp_ID  Cricket  Chess  Swim
11      Y        N      N
12      Y        Y      Y
13      N        N      Y
I need an output dataframe like below:
Hobbies  Emp_id_list
Cricket  11,12
Chess    12
Swim     12,13
Any way to ...
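In Spark itself this is typically an unpivot (e.g. the stack SQL function or explode) followed by groupBy with collect_list; the reshaping logic, sketched here with plain Scala collections so it stands alone (values taken from the tables above):

```scala
// Each row: Emp_ID plus the flags for (Cricket, Chess, Swim)
val rows = Seq(
  (11, "Y", "N", "N"),
  (12, "Y", "Y", "Y"),
  (13, "N", "N", "Y")
)
val hobbies = Seq("Cricket", "Chess", "Swim")

// Unpivot each row into (hobby, empId) pairs where the flag is "Y",
// then group the employee ids by hobby:
val result: Map[String, Seq[Int]] =
  rows
    .flatMap { case (id, cr, ch, sw) =>
      hobbies.zip(Seq(cr, ch, sw)).collect { case (h, "Y") => h -> id }
    }
    .groupMap(_._1)(_._2)
```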
0
votes
1
answer
34
views
How to provide a generic Circe Decoder for a Scala 3 Enum Values?
In my model I have a subset of Enum Values (using refined types). So I need to provide Decoders for these Enum Values.
I now have the following version that works (I simplified the example with only ...
0
votes
0
answers
49
views
Unable to mock Scala object
I have started working in Scala very recently. I have the below code in a main method that I'll need to mock in order to write test cases:
object SacSAAgentSoftwareLogParser extends SparkSessionCreation {
...
-1
votes
0
answers
26
views
udf to transform a json string into multiple rows based on first level of nesting
I am trying to transform a df based on the first-level nesting in the JSON string.
input dataframe
+------+------------------------------------+---------------------------------------------------------...
0
votes
1
answer
51
views
spark.sql() giving error : org.apache.spark.sql.catalyst.parser.ParseException: Syntax error at or near '('(line 2, pos 52)
I have a class LowerCaseColumn.scala where one function is defined as below:
override def registerSQL(): Unit = spark.sql(
"""
|CREATE OR REPLACE TEMPORARY ...
1
vote
0
answers
50
views
SAML Login with Scala application as SP with pac4j-saml
I have a Scala service which would act as the SP, and I would like to let users authenticate using SAML, for which I am using pac4j-saml version 6.0.2. The IdP that I am using is Keycloak.
I ...
-1
votes
0
answers
59
views
Scala/SBT/Java - When running app locally with java 21 getting InaccessibleObjectException
Hi all, I am using Scala version 2.12.18, sbt version 1.9.0, and Java JDK 21.0.3. When trying to run my application locally via
sbt run -Dconfig.resource=local-qa.conf
I am getting this error:
[...
0
votes
1
answer
51
views
Scala 3: Nonexhaustive match for Vector, but not for Seq
Using Scala 3.4.2:
Vector(1, 2, 3) match
case Vector() => println("empty")
case _ :+ last => println(s"last: $last")
gives me an (in my opinion incorrect) ...
0
votes
1
answer
29
views
Hot Reloading Scala.js
How do I hot reload a Scala.js application while in development?
I am manually reloading the Scala.js application during development. However, I need the ability to reload and immediately reflect the ...
1
vote
0
answers
47
views
Get compiler to notice exhaustive match on Vector
The following code defines an exhaustive match over a Vector, v.
def testExhaustiveness(v: Vector[Int]) = {
v match {
case Vector() => println("v is empty")
case ns :+ n => ...
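A sketch of one workaround for the warning these two Vector questions describe, based on the observation in the first of them that the same patterns over Seq do not trigger it (widening to Seq is an assumption, not the only fix; adding a catch-all case or : @unchecked are alternatives):

```scala
// The exhaustivity checker does not prove that Vector() plus :+
// covers every Vector, so it warns; for plain Seq it does not attempt
// the proof at all. Widening the scrutinee sidesteps the warning:
def describe(v: Vector[Int]): String =
  (v: Seq[Int]) match {
    case Seq()     => "empty"
    case _ :+ last => s"last: $last"
  }

println(describe(Vector()))         // empty
println(describe(Vector(1, 2, 3)))  // last: 3
```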
0
votes
1
answer
59
views
+50
How to create data-frame on rocks db (SST files)
We hold our documents in RocksDB. We will be syncing these RocksDB SST files to S3. I would like to create a dataframe on the SST files and later run an SQL query. When I googled, I was not able to find any ...
0
votes
0
answers
43
views
How to force pull in fs2.Stream
I created an fs2.Stream with:
topic <- Stream.eval(fs2.concurrent.Topic[F, Either[Throwable, AmqpMessage[Array[Byte]]]])
messages <- Stream.resource(topic.subscribeAwait(innerQueueSize))
...
2
votes
1
answer
100
views
Test/compile not finding lib/jar with Scala 3 (but 2.13 works)
I'm packaging jars from generated source code with an sbt-plugin and the jars are fine and on the classpath.
Now I have two identical minimal projects targeting Scala 2.13 and 3.3. With 2.13 I can run ...
1
vote
0
answers
48
views
Akka to Pekko Migration and Grpc
Hey all,
I'm doing an Akka to Pekko migration of a pet project (I'm learning Scala at the moment). So far everything has gone smoothly, but there is one thing that I can't get fixed: the gRPC plugin.
In my build....
0
votes
1
answer
39
views
Scala 3 binary compatibility issue
As I am moving to help my organization adopt Scala 3, I am trying to map out the potential challenges to look out for.
Generally, I was reading about binary compatibility changes with Scala 3 and ...
1
vote
1
answer
48
views
Scala Future strange behavior
What's wrong with this code? Why can I see only one output? What does the futureUserByName function return? onComplete doesn't work for it either. It should be just a simple Future, but it doesn't work.
import ...
0
votes
0
answers
22
views
Flattening nested json with back slash in apache spark scala Dataframe
{
"messageBody": "{\"task\":{\"taskId\":\"c6d9fb0e-42ba-4a3e-bd39-f2a32a6958c1\",\"serializedTaskData\":\"{\\\"clientId\\\":\\\&...
0
votes
0
answers
35
views
Jetty 12 with Scala "Unable to locate class corresponding to inner class entry for Scope"
After upgrading jetty-server from 11 to 12, the compilation (using mill as the build tool) fails with:
Unable to locate class corresponding to inner class entry for Scope in owner org.eclipse.jetty....
1
vote
1
answer
62
views
Scala 3 macros: "dynamically" instantiating singleton objects at compile-time
I'm trying to create a macro that makes use of some objects.
Suppose I have the following definitions:
trait Foo:
def doStuff(): Unit
// in other files
object Bar extends Foo:
def doStuff() = ......
0
votes
0
answers
76
views
Importing libraries with same names
I have two libraries that both use the same names for importing, but one has more functionality than the other. How do I make Scala choose the right library when importing? Currently, it's ...