Running out of memory with MiKTeX
I am attempting to build a number of figures with TikZ. One keeps failing to build, and the log file indicates I'm running out of memory.
I've used
initexmf --edit-config-file pdflatex
initexmf --dump=pdflatex
to increase the available memory as far as I think I can. The config file now reads
pool_size=40000000
main_memory=50000000
extra_mem_bot=40000000
And yet the log file tells me
43000001 words of memory out of 43000000
What's frustrating is that I have built this figure before on an older computer with less memory, from cut-and-paste code, which makes me think I've screwed something up and let something run away.
Here's the code for the figure:
%%%%%%%%%%%%%%%%%%%%%%
\documentclass{singlecol-new}
%%%%%%%%%%%%%%%%%%%%%%
\usepackage{pgfplots}
\usepgfplotslibrary{groupplots}
\usetikzlibrary{pgfplots.groupplots}
\usetikzlibrary{plotmarks}
\usetikzlibrary{patterns}
\usetikzlibrary{calc}
\usepgfplotslibrary{external}
\usepackage[external]{tcolorbox}
\tcbset{
    external/prefix=\jobname-,
    external/safety=0mm,
    external/input source on error=false,
}
\pgfplotsset{compat = 1.12}
\tcbEXTERNALIZE
\tikzexternalize
%%%%%%%%%%%%%%%%%%%%%%
\begin{document}
%%%%%%%%%%%%%%%%%%%%%%
\begin{figure}[h!]
\centering
\begin{extikzpicture}[runs=2]{fig7}
\begin{axis}[
    height=8cm,
    width=8cm,
    xmin=0,
    xmax=10000,
    %legend style={draw=none},
    legend style={at={(0.9,0.4)}},
    xlabel = $\frac{V(A)}{Resource Cost}$ Ratio,
    ylabel = P(X),
    width=0.75\textwidth,
    y tick label style={
        /pgf/number format/.cd,
        fixed,
        fixed zerofill,
        precision=1,
        /tikz/.cd
    },
    x tick label style={
        /pgf/number format/.cd,
        fixed,
        fixed zerofill,
        precision=0,
        /tikz/.cd
    },
    scaled ticks=false,
]
\addplot+[black, mark=o, line join=round, mark repeat=1000] table[col sep=comma, y=Empirical, x=X]{CDFs.csv};
\addlegendentry{{\scriptsize Empirical Data}}
\addplot+[black, mark=x, line join=round, mark repeat=1000] table[col sep=comma, y=PT, x=X]{CDFs.csv};
\addlegendentry{{\scriptsize Extended Pearson-Tukey}}
\addplot+[black, mark=|, line join=round, mark repeat=1000] table[col sep=comma, y=SM, x=X]{CDFs.csv};
\addlegendentry{{\scriptsize Extended Swanson-Megill}}
\addplot+[black, mark=square, line join=round, mark repeat=1000] table[col sep=comma, y=BM, x=X]{CDFs.csv};
\addlegendentry{{\scriptsize Bracket Median}}
\end{axis}
\end{extikzpicture}
\caption{Elicited CDFs}
\label{CDFGraph}
\end{figure}
\end{document}
Obviously the data matters here, but I'm not sure how best to include it. It's 10,002 rows by 5 columns of floating-point numbers used to draw 4 line graphs; honestly, it's one of the simplest graphs in this paper.
The data file looks like this:
X Empirical PT BM SM
0 0 0.001 0.001 0.001
1 0 0.001 0.001 0.001
2 0 0.001 0.001 0.001
3 0 0.001 0.001 0.001
4 0 0.001 0.001 0.001
5 0 0.001 0.001 0.001
6 0 0.001 0.001 0.001
7 0 0.001 0.001 0.001
8 0 0.001 0.001 0.001
9 0 0.00101 0.001 0.001
10 0 0.00101 0.001 0.001
11 0 0.00101 0.001 0.001
12 0 0.00101 0.00101 0.00101
....
9990 1 1 1 1
9991 1 1 1 1
9992 1 1 1 1
9993 1 1 1 1
9994 1 1 1 1
9995 1 1 1 1
9996 1 1 1 1
9997 1 1 1 1
9998 1 1 1 1
9999 1 1 1 1
10000 1 1 1 1
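For reference (not part of the original figure code): one way to cut the memory pgfplots needs here is to let it thin the data as it reads it, using the each nth point filter, instead of loading all 10,000 rows per series. A minimal sketch, assuming the same CDFs.csv layout, as a drop-in replacement for one of the \addplot lines above:
\addplot+[black, mark=o, line join=round, mark repeat=100,
    each nth point=10,            % plot only every 10th row of the table
    filter discard warning=false  % silence warnings about the skipped rows
] table[col sep=comma, y=Empirical, x=X]{CDFs.csv};
With each nth point=10 only about 1,000 of the 10,000 rows per series are stored, so mark repeat is reduced to 100 to keep roughly the same number of marks.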
Tags: miktex, tikz-external, memory
asked Sep 7 '18 at 18:17 by jerH
One sure way to fix the problem is to do all your tikzpictures using standalone or externalize. If you run out of memory for a single graph, you are using too many data points.
– John Kormylo, Sep 8 '18 at 15:37
@JohnKormylo I am running them externalized, and though there are a lot of points (10,000 per series), I have built this graph before; I'm just not sure why it's running out of memory now. Other charts that gave me memory issues in the past are working fine, but this one never caused a problem before.
– jerH, Sep 10 '18 at 1:28
The data file, cdfs.csv, is available at pastebin.com/u/jerH
– jerH, Sep 10 '18 at 2:00
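A minimal sketch of the standalone route John Kormylo mentions (the file name fig7-standalone.tex and the single plotted series are placeholders, not from the question); each figure is compiled on its own, so its memory use never accumulates with the rest of the paper, and the resulting PDF is included with \includegraphics in the main document:
% fig7-standalone.tex -- compile separately, then \includegraphics{fig7-standalone.pdf}
\documentclass{standalone}
\usepackage{pgfplots}
\pgfplotsset{compat=1.12}
\begin{document}
\begin{tikzpicture}
  \begin{axis}[height=8cm, width=8cm, xmin=0, xmax=10000]
    \addplot+[black, mark=o, mark repeat=1000]
      table[col sep=comma, x=X, y=Empirical]{CDFs.csv};
  \end{axis}
\end{tikzpicture}
\end{document}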
2 Answers
Interestingly, I uninstalled and then re-installed the pgf package and this issue disappeared. Not sure if that counts as an "answer"...
answered Sep 19 '18 at 20:59 by jerH
No, I tried un/installing both pgf and pgfplots and it did not make a jot of difference. The only way it works for me is to use 25% of the .csv, giving 2008247 words of memory out of 3000000 (the default), so perhaps it could run, say, 4500 points max? Update: I tried that, and 4500 points gives 2938707 words of memory out of 3000000, so it's OK up to there, but 5000 failed. Not sure if it would be simpler to reduce the data to every second entry and bump the memory a little?
– KJO, Oct 20 '18 at 1:03
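A sketch of the "every second entry" idea from this comment done in the preamble rather than by editing the CSV (assuming the figure code from the question; the key names are standard pgfplots options, the specific values are illustrative):
\pgfplotsset{
  every axis plot/.append style={
    each nth point=2,             % keep every second row of the data file
    filter discard warning=false  % don't warn about the discarded rows
  }
}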
Having a need to bump memory in MiKTeX, I searched around and collected the following recommendations, which allowed me to revisit and run this plot.
> initexmf --edit-config-file=pdflatex
In Notepad (the editor that opens), change the following lines, or add them if there is no prior value:
main_memory=12000000
extra_mem_bot=99999999
font_mem_size=3000000
Save the file, then back at the prompt run
> initexmf --dump=pdflatex
If you get an error message, repeat with reduced values until --dump=pdflatex no longer errors (for speed use a "binary chop": halve the last difference).
Previously the points required
2008247 words of memory out of 3000000 (default) for 25%
2938707 words of memory out of 3000000 (default) for 45%
However, the final working log for the full 10000 points (100%) shows
793011 words of memory out of 12000000
approximately 10% of what I expected! (I guess it gets partially cleared during the run.)
answered Nov 13 '18 at 2:23, edited Nov 13 '18 at 2:30, by KJO