Answer by Ole Tange for split file into N pieces with same name but different target directories
Parallelized with GNU Parallel:

    parallel -j30 -a sourcefile.txt --pipepart --block -1 cat '>'prog{#}/myfile.txt

This will run 30 jobs in parallel, splitting sourcefile.txt into one part per job and writing each part to prog{#}/myfile.txt, where {#} is the job's sequence number (1 through 30).
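If you want to confirm the result, a quick check (the prog1..prog30 directories and the myfile.txt name come from the question) is to compare the line counts of the pieces against the source:

    wc -l prog*/myfile.txt sourcefile.txt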
Answer by jthill for split file into N pieces with same name but different target directories
Sed version for fun:

    lines=$(wc -l <sourcefile.txt)
    perfile=$(( (lines+29)/30 ))   # see https://www.rfc-editor.org/rfc/rfc968.txt
    last=0
    sed -nf- sourcefile.txt <<EOD
    $(while let ...
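The here-document that generates the sed script is cut off above. A self-contained sketch of the same idea, built with a plain for loop instead of the let-based one, emits one "start,end w progN/myfile.txt" command per chunk (the prog directories and the myfile.txt name come from the question):

    lines=$(wc -l <sourcefile.txt)
    perfile=$(( (lines+29)/30 ))   # lines per chunk, rounded up
    sed -n "$(
      start=1
      for n in $(seq 1 30); do
        end=$(( start + perfile - 1 ))
        (( end > lines )) && end=$lines
        (( start > lines )) && break
        # each chunk becomes one "addr1,addr2 w file" command in the generated sed script
        printf '%d,%d w prog%d/myfile.txt\n' "$start" "$end" "$n"
        start=$(( end + 1 ))
      done
    )" sourcefile.txt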
Answer by αғsнιη for split file into N pieces with same name but different target directories
The awk only solution (N here equals 30 files):

    awk 'BEGIN{ cmd="wc -l <sourcefile.txt"; cmd|getline l; l=int((l+29)/30); close(cmd) }
         NR%l==1{trgt=sprintf("prog%d",((++c)))}
         {print ...
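The print action is truncated above. A complete command in the same shape, with the output path prog<N>/myfile.txt assumed from the question, would look roughly like:

    awk 'BEGIN{ cmd="wc -l <sourcefile.txt"; cmd|getline l; l=int((l+29)/30); close(cmd) }
         NR%l==1{ trgt=sprintf("prog%d", ++c) }      # switch to the next target every l lines
         { print > (trgt"/myfile.txt") }' sourcefile.txt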
Answer by ashishk for split file into N pieces with same name but different target directories
    #!/bin/bash
    # assuming the file is in the same folder as the script
    INPUT=large_file.txt
    # assuming the folder called "output" is in the same folder
    # as the script and there are folders that have the ...
Answer by RomanPerekhrest for split file into N pieces with same name but different target directories
split + bash solution:

    lines=$(echo "t=$(wc -l ./sourcefile.txt | cut -d' ' -f1); d=30; if(t%d) t/d+1 else t/d" | bc)
    split -l $lines ./sourcefile.txt "myfile.txt" --numeric-suffixes=1
    for f in ...
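The for loop that moves the pieces into place is truncated. One way to finish it, assuming the prog1..prog30 directories from the question (split names the pieces myfile.txt01 .. myfile.txt30 here):

    for f in myfile.txt[0-9][0-9]; do
        n=$(( 10#${f#myfile.txt} ))    # strip the prefix; 10# keeps 08/09 from being read as octal
        mv "$f" "prog${n}/myfile.txt"
    done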
Answer by Kramer for split file into N pieces with same name but different target directories
Steps:

Count the lines in the file:

    lines=$(wc -l < "${file}")

Divide by 30 to get the number of lines per output file (bash integer division rounds down, so add 29 first to round up):

    perFile=$(( (lines + 29) / 30 ))

Use split to divide the file into pieces of ${perFile} lines each and move every piece into its prog directory, as sketched below.
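Put together, a minimal sketch of those steps (the prog1..prog30 directories and the myfile.txt name come from the question; the part. prefix is just a placeholder):

    file=sourcefile.txt
    lines=$(wc -l < "${file}")
    perFile=$(( (lines + 29) / 30 ))        # lines per piece, rounded up
    split -l "${perFile}" "${file}" part.   # produces part.aa, part.ab, ...
    n=1
    for f in part.*; do                     # the glob expands in sorted order
        mv "$f" "prog${n}/myfile.txt"
        n=$(( n + 1 ))
    done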
split file into N pieces with same name but different target directories
I want to split sourcefile.txt, which contains 10000 lines (increasing every day), into 30 equal files. I have directories called prog1 to prog30, and I would like to save the split pieces into these directories, each as myfile.txt.