Questions tagged [awk]
A pattern-directed scanning and processing language.
8,038 questions
1 vote · 2 answers · 48 views
Need to make multiple multiline replacements in file
I have a file with multiple "paragraphs" like this:
<type>TRANS</type>
<attributes/>
<specification_method>rep_name</specification_method>
<trans_object_id/&...
0 votes · 0 answers · 55 views
Cleaning up full directories at once
Right now, I have a filesystem that fills up fairly regularly. The TL;DR is that I can't introduce better log rotation, and the files must be deleted regularly.
Let's say I have the following output ...
3 votes · 4 answers · 500 views
Cut and replace every Nth character on every row
I have predictable piped input, and I want to iterate over each row and change two characters. The character positions are 19 and 20 (on every row). The 19th character is a comma; I want to cut that. The ...
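The excerpt only shows the first of the two edits (removing the comma at position 19); a minimal sketch of that part, using a hypothetical fixed-width sample line, would splice the text around the 19th character with `substr`:

```shell
# Hypothetical sample row: 18 characters, a comma at position 19, then the rest.
line='ABCDEFGHIJKLMNOPQR,Xrest'

# Drop the 19th character by concatenating the text before and after it.
out=$(printf '%s\n' "$line" | awk '{ print substr($0, 1, 18) substr($0, 20) }')
printf '%s\n' "$out"
```

Whatever change is wanted at position 20 (cut off in the excerpt) could be done the same way, by adjusting the `substr` boundaries.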
-2 votes · 1 answer · 61 views
Regular single-column data to multiple rows
Source data:
KKK-SNMImsi: 444444
KKK-SNMUserProfileId: KKK-SNMDefaultAutomaticProfile
KKK-SNMMmeAddress: TDSGSN01
KKK-SNMMsisdn: 44235682436
KKK-SNMLocationState: LOCATED
KKK-SNMRoamingAllowed: FALSE
...
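The record boundaries are not shown in the excerpt; assuming each block begins with a `KKK-SNMImsi` line, one sketch collapses every block onto a single row (the second record below is invented for illustration):

```shell
# Hypothetical two-record input modeled on the sample above.
input='KKK-SNMImsi: 444444
KKK-SNMMsisdn: 44235682436
KKK-SNMImsi: 555555
KKK-SNMMsisdn: 55235682436'

out=$(printf '%s\n' "$input" | awk -F': ' '
    /^KKK-SNMImsi/ { if (NR > 1) printf "\n" }            # Imsi line starts a new row
    { printf "%s%s", (/^KKK-SNMImsi/ ? "" : " "), $2 }    # append the value to the row
    END { printf "\n" }')
printf '%s\n' "$out"
```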
0 votes · 3 answers · 58 views
Pass the output of a command to an awk variable
I am trying to normalise a data file using the number of lines in a previous version of the data file. After reading these questions, I thought this could work:
awk -v num=$(wc -l my_first_file.bed) '{...
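The usual pitfall here is that `wc -l my_first_file.bed` prints the count *and* the filename, and without quotes the filename becomes a separate word that awk then opens as an input file. Redirecting stdin makes `wc` print only the number, and quoting keeps the substitution a single word. A sketch with a throwaway three-line file:

```shell
# Stand-in for my_first_file.bed: a temporary file with 3 lines.
tmp=$(mktemp)
printf 'a\nb\nc\n' > "$tmp"

# Redirect stdin so wc emits only the count, and quote the substitution.
num_lines=$(wc -l < "$tmp")
out=$(awk -v num="$num_lines" 'BEGIN { print num + 0 }')   # num+0 strips any padding
printf '%s\n' "$out"

rm -f "$tmp"
```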
1 vote · 1 answer · 82 views
Print CSV columns according to user input
I have a .csv file with multiple lines with the following format:
¬Country¬,¬Year¬,¬Singer¬,¬Song Title¬
¬Japan¬,¬1999¬,¬Utada Hikaru¬,¬First Love¬
¬South Korea¬,¬1999¬,¬Lee Jung Hyun¬,¬Wa¬
...
I can ...
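A minimal sketch, assuming the `¬` characters act as quote marks, no field contains a comma, and the column number comes from user input (here hard-coded as 3 for illustration):

```shell
# One data row from the sample, and a hypothetical user-chosen column number.
input='¬Japan¬,¬1999¬,¬Utada Hikaru¬,¬First Love¬'
col=3

# Split on commas, strip the ¬ quote characters, print the requested field.
out=$(printf '%s\n' "$input" | awk -F',' -v c="$col" '{ gsub(/¬/, "", $c); print $c }')
printf '%s\n' "$out"
```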
-2 votes · 4 answers · 135 views
Getting output of /usr/bin/time in CSV format
I am using gawk to parse the output of macOS's /usr/bin/time into CSV format as shown below. The problem is that gawk is returning the 'involuntary context switches' value for 'voluntary context ...
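A likely cause of this symptom is a substring match: a pattern like `/voluntary context switches/` also matches the "involuntary context switches" line. Comparing the label field exactly avoids it. A sketch against two made-up lines in the shape of `/usr/bin/time -l` output:

```shell
# Hypothetical excerpt of /usr/bin/time -l output (counts are invented).
sample='        17  voluntary context switches
        25  involuntary context switches'

# $2 == "voluntary" matches only the wanted line; a regex /voluntary/ would match both.
out=$(printf '%s\n' "$sample" | awk '$2 == "voluntary" { print $1 }')
printf '%s\n' "$out"
```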
-4 votes · 3 answers · 116 views
Pattern match using awk
I have an input file as below
Col1|Col2|Col3|Col4
abc123|654565|abc|apple
abc123|654565|pir|orange
abc123|654565|val|plum
I want to select all the lines that have Col2=654565 and Col3=abc and print ...
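With `|` as the field separator, exact string comparisons on the two fields select the matching lines; a sketch using the sample data above:

```shell
input='Col1|Col2|Col3|Col4
abc123|654565|abc|apple
abc123|654565|pir|orange
abc123|654565|val|plum'

# Exact comparison avoids partial matches (e.g. a regex /abc/ would also hit abc123).
out=$(printf '%s\n' "$input" | awk -F'|' '$2 == "654565" && $3 == "abc"')
printf '%s\n' "$out"
```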
1 vote · 3 answers · 72 views
Bash: sum a number present in N lines before a given pattern
I have a kdenlive project file (which is basically a xml file) with a lot of text clips inside. I need to bulk edit the x coordinate of a certain text that appears multiple times. This is an example:
...
-1 votes · 1 answer · 52 views
How to omit/ignore/remove lines starting with local/ using awk
Variable _RESULT contains an output of pacinfo --verbose local/jre.
Using awk script (by Ed Morton - thanks to him!):
awk '{
match($0,/:[[:space:]]*/)
nextTag = substr($0,1,RSTART-1)
...
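Taking the title literally (the unwanted lines begin with `local/`), a guard clause placed before the rest of the script drops them early; a sketch with hypothetical pacinfo-style input:

```shell
# Invented input: one line starting with local/, two ordinary tag lines.
input='local/jre 21.0.2-1
Name: jre
Version: 21.0.2-1'

# The first rule skips matching lines before any other processing runs.
out=$(printf '%s\n' "$input" | awk '/^local\// { next }  { print }')
printf '%s\n' "$out"
```

In the larger script quoted above, the same `/^local\//{ next }` rule would simply go first.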
0 votes · 3 answers · 94 views
Using sed or awk, how can I delete a line whenever the next line begins with the same content followed by a slash?
Using sed or awk, how can I delete a line whenever the next line begins with the same content followed by a slash?
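One awk approach is to hold each line and print it only if the following line does not extend it with a slash; a sketch on invented input where `foo` should be dropped because `foo/bar` follows:

```shell
input='foo
foo/bar
baz
qux'

# prev holds the previous line; index(...) == 1 means the current line
# starts with prev followed by "/", so prev is suppressed.
out=$(printf '%s\n' "$input" | awk '
    NR > 1 { if (index($0, prev "/") != 1) print prev }
    { prev = $0 }
    END { print prev }')
printf '%s\n' "$out"
```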
0 votes · 1 answer · 56 views
Strip a line from a variable using awk or sed
There is a Zsh variable with contents like
$if(description-meta)$
<meta name="description" content="$description-meta$" />
$endif$
<title>$if(title-prefix)$$title-...
-2 votes · 1 answer · 61 views
Parse txt file on basis of occurrence of a tag in Linux
I am trying to parse a txt file containing XML "messages" in Linux, something like this:
<Document abc xyz .....> <hji> xyz </hji> </Document> <Document abc xyz ........
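Since each message ends at `</Document>`, one portable sketch scans the line and emits one message per closing tag (the input below is a completed, invented version of the truncated sample):

```shell
input='<Document abc xyz> <hji> xyz </hji> </Document> <Document def> <hji> uvw </hji> </Document>'

out=$(printf '%s\n' "$input" | awk '{
    s = $0
    while ((i = index(s, "</Document>")) > 0) {
        msg = substr(s, 1, i + 10)    # one message, through the 11-char closing tag
        sub(/^[ \t]+/, "", msg)       # trim leading blanks between messages
        print msg
        s = substr(s, i + 11)         # continue with the remainder of the line
    }
}')
printf '%s\n' "$out"
```

With gawk, setting `RS='</Document>'` would split the records directly, but multi-character RS is not guaranteed by POSIX awk.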
2 votes · 3 answers · 340 views
Extract Value from Kafka-Topics Command
Given the following output, how can I extract retention.ms's value of 3333?
Ultimately I'd like to print out topic has retention of 3333 milliseconds from the output of the kafka-topics.sh ... command:...
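A sketch using `match()` to pull the number out of a `retention.ms=...` setting; the sample line below is an invented stand-in modeled on `kafka-topics.sh --describe` output, not the asker's actual output:

```shell
# Hypothetical describe line; only the Configs part matters here.
sample='Topic: my-topic  PartitionCount: 1  Configs: retention.ms=3333,segment.bytes=1073741824'

out=$(printf '%s\n' "$sample" | awk 'match($0, /retention\.ms=[0-9]+/) {
    split(substr($0, RSTART, RLENGTH), kv, "=")   # kv[2] is the numeric value
    printf "topic has retention of %s milliseconds\n", kv[2]
}')
printf '%s\n' "$out"
```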
-1 votes · 2 answers · 104 views
How to de-duplicate blocks (timestamp+command) from bash history?
I'm working with a bash_history file containing blocks with the following format: #unixtimestamp\ncommand\n
Here's a sample of the bash_history file:
#1713308636
cat > ./initramfs/init << "...
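Assuming single-line commands and that the first occurrence of each command should be kept, a sketch pairs each `#timestamp` line with the command that follows it and filters duplicates with a `seen` array (the history below is invented; multi-line commands like the truncated heredoc above would need extra handling):

```shell
history='#1713308636
echo hello
#1713308700
echo hello
#1713308800
echo world'

# ts holds the pending timestamp; seen[$0]++ is true after the first occurrence.
out=$(printf '%s\n' "$history" | awk '
    /^#[0-9]+$/ { ts = $0; next }
    !seen[$0]++ { print ts; print $0 }')
printf '%s\n' "$out"
```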