Questions tagged [user-defined-functions]
A function provided by the user of a program or environment, most often in spreadsheet or database applications. Use [custom-functions-excel] for Excel and [custom-function] for Google Sheets. Also specify a programming-language tag ([google-apps-script], [javascript], [sql], [tsql], etc.) and a tag for the application ([excel], [google-spreadsheet], [sql-server], etc.).
user-defined-functions
4,967 questions
1 vote · 1 answer · 44 views
Snowflake unsupported subquery type for UDTF
In Snowflake, I'm trying to save a large amount of code and maintenance by creating a UDTF that shows flags for each row v based on the presence of rows in x or y matching some pattern.
The error I get ...
0 votes · 1 answer · 51 views
spark.sql() giving error : org.apache.spark.sql.catalyst.parser.ParseException: Syntax error at or near '('(line 2, pos 52)
I have a class LowerCaseColumn.scala where one function is defined as below:
override def registerSQL(): Unit = spark.sql(
"""
|CREATE OR REPLACE TEMPORARY ...
0 votes · 0 answers · 10 views
Plotting def function from research paper leads to errors
I am trying to plot a complex function (defined with def) from a research paper I read for a project, and I keep getting an error that the map function is not iterable. When I run the code by itself, it's fine, but ...
0 votes · 0 answers · 24 views
Glue job with Bedrock not running in parallel
I am writing a Glue job to process a pyspark dataframe using Bedrock which was recently added to boto3. The job will get sentiment from a text field in the dataframe using one of the LLMs in Bedrock, ...
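Remote model calls like this are I/O-bound, so a common workaround is to overlap the requests with a small thread pool inside each partition. A minimal sketch, with a stubbed scoring function standing in for the real boto3 `bedrock-runtime` call (the function names and wiring here are assumptions, not the asker's code):

```python
from concurrent.futures import ThreadPoolExecutor

def fake_bedrock_sentiment(text: str) -> str:
    """Stand-in for a boto3 bedrock-runtime invoke_model round-trip."""
    return "positive" if "good" in text else "negative"

def score_partition(texts, invoke=fake_bedrock_sentiment, workers=8):
    # Each Spark partition runs this once; the thread pool overlaps the
    # network round-trips instead of issuing them one row at a time.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(invoke, texts))  # pool.map preserves input order

# Under PySpark this would be wired up roughly as:
#   df.rdd.mapPartitions(lambda rows: score_partition([r.text for r in rows]))
```

Raising the DataFrame's partition count (`df.repartition(n)`) also matters here, since Glue otherwise runs one sequential loop per partition.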
1 vote · 1 answer · 28 views
How do I override auto formats in {tern} functions?
Considering {tern} and {rtables} package formatting functions, can you override auto formats in {tern} functions like count_patients_with_flags?
For example:
library(rtables)
library(tern)
library(...
1 vote · 3 answers · 94 views
Union the results of a loop that calls a function for each table row
I am pretty new to Azure Data Explorer (Kusto) queries. I have a stored function that takes a string as a parameter, does some querying around that input, and returns a data table.
QueryFunc(input: ...
0 votes · 3 answers · 87 views
How to use the Evaluate function in a UDF for an Excel array formula string longer than 255 characters
For some time I have used a UDF similar to the LAMBDA function, where the input formula is a string. This has always worked without any issue. Within the UDF, the formula string is processed with the Excel ...
0 votes · 0 answers · 67 views
UDF in Snowpark + DBT gives ModuleNotFoundError with group_by().apply_in_pandas()
I am using snowpark and DBT to do some data transformations and creating some views/table in Snowflake. My models have a structure like this one:
import pandas as pd
from snowflake.snowpark import ...
0 votes · 1 answer · 57 views
INSERT INTO SELECT subquery that uses a udf so that it returns 1 value
I'm attempting to create a new table while altering some of the data from my old table, using an INSERT INTO SELECT statement. I'm trying to take a field with numbers and letters and pull out the numbers ...
1 vote · 1 answer · 56 views
PySpark builtin functions to remove UDF
I have a column whose value is:
"{""ab"": 0.7220268151565864, ""cd"": 0.2681795338834256, ""ef"": 1.0, ""gh"": 1.0, ...
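The doubled double quotes in a value like that usually mean CSV-escaped JSON. A minimal sketch of the UDF-free parse (the sample value is shortened and hypothetical; the PySpark builtin route is noted in the comments):

```python
import json

# Hypothetical sample resembling the quoted column value; the doubled
# quotes are the CSV escape for embedded double quotes:
raw = '{""ab"": 0.72, ""cd"": 0.27, ""ef"": 1.0}'

# Undo the doubling, then parse as ordinary JSON:
parsed = json.loads(raw.replace('""', '"'))

# In PySpark the same two steps run as builtins, with no Python UDF:
#   from pyspark.sql.functions import regexp_replace, from_json
#   from pyspark.sql.types import MapType, StringType, DoubleType
#   clean = regexp_replace("col", '""', '"')
#   df = df.withColumn("m", from_json(clean, MapType(StringType(), DoubleType())))
```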
0 votes · 1 answer · 45 views
Convert a python UDF to pandas UDF to improve performance in PySpark
I have multiple functions in Python that I am using as UDFs in PySpark, but the problem is that my data is too big, and applying all these UDFs takes a long time to complete the transformation. The ...
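As a sketch of the usual conversion (the scoring function and column name here are invented for illustration): the row-at-a-time UDF becomes a function over whole pandas Series, which PySpark then feeds in Arrow batches via `pandas_udf` instead of one Python call per row:

```python
import pandas as pd

# A row-at-a-time Python UDF (hypothetical example of the kind being converted):
def label_row(score: float) -> str:
    return "high" if score > 0.5 else "low"

# The vectorized equivalent operates on a whole pandas Series per batch:
def label_series(scores: pd.Series) -> pd.Series:
    return scores.gt(0.5).map({True: "high", False: "low"})

# Under PySpark, the vectorized version is registered as a pandas UDF:
#   from pyspark.sql.functions import pandas_udf
#   label_udf = pandas_udf(label_series, returnType="string")
#   df = df.withColumn("label", label_udf("score"))
```

The logic is unchanged; only the unit of work grows from a scalar to a Series, which removes most of the per-row serialization overhead.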
0 votes · 0 answers · 32 views
Python 3.11 AEAD anonymization pickle serializer error
I am writing a program to anonymize some columns of a dataframe using the tink library, but I am getting a serialization error for the daead object in a UDF that I wrote.
I rewrote the whole program ...
0 votes · 0 answers · 49 views
Snowflake Unsupported subquery type cannot be evaluated in UDF
I am trying to create a function that takes an h3id as input and returns the zip5 that the h3id belongs to. In order to do so I have created this UDF:
CREATE FUNCTION get_zip_from_h3(h3_id ...
2 votes · 1 answer · 79 views
pyspark - explode a dataframe col, which contains json
I have a column 'user_contacts_attributes' in a Spark dataframe:
+------------------+-------------------------------------+
| user_name |user_contacts_attributes |
+----------------...
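A UDF-free sketch of that flattening (the sample value and schema are assumptions based on the excerpt): parse the JSON into a map, then explode it. The plain-Python core shows the shape of the result:

```python
import json

# Hypothetical value of user_contacts_attributes for one row:
raw = '{"phone": {"home": "111", "work": "222"}}'

# Flatten the nested map into (outer_key, key, value) rows,
# the shape that exploding a map column yields:
contacts = json.loads(raw)
rows = [(outer, k, v)
        for outer, inner in contacts.items()
        for k, v in inner.items()]

# PySpark equivalent without a UDF: parse, then explode the map:
#   from pyspark.sql.functions import from_json, explode
#   from pyspark.sql.types import MapType, StringType
#   schema = MapType(StringType(), MapType(StringType(), StringType()))
#   parsed = from_json("user_contacts_attributes", schema)
#   df.select("user_name", explode(parsed))  # yields key, value columns
```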
1 vote · 0 answers · 41 views
Using a Java UDF in PySpark raises a type exception
Step 1: Define a scala UDF:
import org.apache.spark.sql.api.java.UDF1
import scala.collection.mutable
class GetMidVal extends UDF1[mutable.WrappedArray[Double], Double] {
override def call(arr: ...