This code fails when I try to run it:
position_loc = glGetAttribLocation(shader, "position")
color_loc = glGetAttribLocation(shader, "color")
Here is the full code around it:
shader = OpenGL.GL.shaders.compileProgram(
    OpenGL.GL.shaders.compileShader(self.vertex_shader_source, GL_VERTEX_SHADER),
    OpenGL.GL.shaders.compileShader(self.fragment_shader_source, GL_FRAGMENT_SHADER))
position_loc = glGetAttribLocation(shader, "position")
color_loc = glGetAttribLocation(shader, "color")
glUseProgram(shader)
vbo = GLuint(0)
glGenBuffers(1, vbo)
glBindBuffer(GL_ARRAY_BUFFER, vbo)
glBufferData(GL_ARRAY_BUFFER, 72, (GLfloat * len(self.triangle))(* self.triangle), GL_STATIC_DRAW)
glVertexAttribPointer(0, 3, GL_FLOAT, GL_FALSE, 24, ctypes.c_void_p(0))
glEnableVertexAttribArray(0)
glVertexAttribPointer(1, 3, GL_FLOAT, GL_FALSE, 24, ctypes.c_void_p(12))
glEnableVertexAttribArray(1)
This is all the code aside from the shaders, which are in another question of mine. It works fine if I don't get the locations, but I was told I need to. If you could help me understand why it isn't working, that would be awesome.
The name argument needs to be a bytes object (a literal with the `b` prefix) rather than a `str`. This fails:
position_loc = glGetAttribLocation(shader, 'position')
color_loc = glGetAttribLocation(shader, 'color')
This works:
position_loc = glGetAttribLocation(shader, b'position')
color_loc = glGetAttribLocation(shader, b'color')
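As a minimal sketch (my addition, not part of the original answer): if the attribute names already live in your program as ordinary strings, you can encode them on the fly instead of hard-coding bytes literals:

```python
# Attribute names kept as ordinary strings elsewhere in the program
attribute_names = ["position", "color"]

# glGetAttribLocation expects bytes, so encode each name first
encoded_names = [name.encode("ascii") for name in attribute_names]

print(encoded_names)  # [b'position', b'color']
# e.g. position_loc = glGetAttribLocation(shader, encoded_names[0])
```

Once the locations are retrieved, they could also replace the hard-coded 0 and 1 in the glVertexAttribPointer/glEnableVertexAttribArray calls, so the Python side stays in sync with whatever locations the shader actually assigns.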
I am trying to render a gltf model with ash (vulkan) in rust.
I sent all my data to the gpu and I am seeing this:
Naturally my suspicion is that the normal data is wrong. So I checked with renderdoc:
Those seem ok, maybe the attributes are wrong?
All those normals seem like they add to 1, should be fine. Maybe my pipeline is wrong?
Seems like the correct format and binding (I am sending 3 buffers and binding one to binding 0, one to 1 and the last to 2, binding 1 has the normals).
The only thing I find weird is that if I go to the vertex input pipeline stage and look at the buffers:
This is what the buffer at index 1 shows:
This does not happen for the buffer at index 0 (positions), which also renders properly. So whatever is causing the normals to show up as hex codes here is likely the cause of the bug. But I have no idea why this is happening. As far as I can see, the pipeline and buffers were all set up properly.
You presumably want to use one separate buffer for each vertex attribute (aka a non-interleaved vertex buffer, SoA), but your VkVertexInputAttributeDescription::offset values [0, 12, 24] are what you would use for one vertex buffer interleaving all attributes (provided that their binding values point to one and the same VkVertexInputBindingDescription).
e.g.
// Interleaved:
// Buffer 0: |Position: R32G32B32_FLOAT, Normal: R32G32B32_FLOAT, Uv: R32G32B32_FLOAT|, * vertex count
VkVertexInputBindingDescription {
    .binding = 0,
    .stride = 12 * 3, // 3 `R32G32B32_FLOAT`s !
    .inputRate = VK_VERTEX_INPUT_RATE_VERTEX
};
// All attributes in the same `binding` == `0`
VkVertexInputAttributeDescription[3] {
    {
        .location = 0,
        .binding = 0,
        .format = VK_FORMAT_R32G32B32_SFLOAT,
        .offset = 0 // [0, 11] portion
    },
    {
        .location = 1,
        .binding = 0,
        .format = VK_FORMAT_R32G32B32_SFLOAT,
        .offset = 12 // [12, 23] portion
    },
    {
        .location = 2,
        .binding = 0,
        .format = VK_FORMAT_R32G32B32_SFLOAT,
        .offset = 24 // [24, 35] portion
    }
};
Your VkVertexInputBindingDescription[1].stride == 12 tells Vulkan that your vertex buffer 1 uses 12 bytes for each vertex, and your VkVertexInputAttributeDescription[1].offset == 12 says the normal value is at offset 12, which is out of bounds.
Same deal with your VkVertexInputAttributeDescription[2].offset == 24 overstepping (by a large amount) VkVertexInputBindingDescription[2].stride == 12.
To use one tightly-packed buffer per vertex attribute, you need to set your VkVertexInputAttributeDescription[n].offset values to 0, which looks something like:
// Non-interleaved:
// Buffer 0: |Position: R32G32B32_FLOAT|, * vertex count
// Buffer 1: |Normal: R32G32B32_FLOAT|, * vertex count
// Buffer 2: |Uv: R32G32B32_FLOAT|, * vertex count
VkVertexInputBindingDescription[3] {
    {
        .binding = 0,
        .stride = 12,
        .inputRate = VK_VERTEX_INPUT_RATE_VERTEX
    },
    {
        .binding = 1,
        .stride = 12,
        .inputRate = VK_VERTEX_INPUT_RATE_VERTEX
    },
    {
        .binding = 2,
        .stride = 12,
        .inputRate = VK_VERTEX_INPUT_RATE_VERTEX
    }
};
// Each attribute in its own `binding` == `location`
VkVertexInputAttributeDescription[3] {
    {
        .location = 0,
        .binding = 0,
        .format = VK_FORMAT_R32G32B32_SFLOAT,
        .offset = 0 // Whole [0, 11]
    },
    {
        .location = 1,
        .binding = 1,
        .format = VK_FORMAT_R32G32B32_SFLOAT,
        .offset = 0 // Whole [0, 11]
    },
    {
        .location = 2,
        .binding = 2,
        .format = VK_FORMAT_R32G32B32_SFLOAT,
        .offset = 0 // Whole [0, 11]
    }
};
Also worth noting is the comment line // vertex stride 12 less than total data fetched 24 that RenderDoc generates in the Buffer Format section. It detects when a vertex attribute description oversteps its binding description's stride:
if(i + 1 == attrs.size())
{
  // for the last attribute, ensure the total size doesn't overlap stride
  if(attrs[i].byteOffset + cursz > stride && stride > 0)
    return tr("// vertex stride %1 less than total data fetched %2")
        .arg(stride)
        .arg(attrs[i].byteOffset + cursz);
}
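The same check can be sketched in plain Python (my paraphrase of the RenderDoc logic above, not code from either project), applied to the asker's setup:

```python
# A plain-Python sketch of RenderDoc's overlap check: flag a vertex
# attribute whose data overruns its binding's stride.
def check_last_attribute(offset, size, stride):
    """Return a warning string if the last attribute oversteps the stride."""
    if offset + size > stride and stride > 0:
        return f"// vertex stride {stride} less than total data fetched {offset + size}"
    return None

# The buggy setup: binding 1 has stride 12, but the normal attribute
# was given offset 12 with a 12-byte R32G32B32 format.
print(check_last_attribute(offset=12, size=12, stride=12))
# → // vertex stride 12 less than total data fetched 24

# The fixed setup: offset 0 in its own tightly-packed buffer.
print(check_last_attribute(offset=0, size=12, stride=12))
# → None
```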
I want to display subscripts in the labels of a bar plot. The labels are the keys of the dictionary data below. I know how to do this with LaTeX, but I need to display the keys as they are in the dictionary. When I use the following script, it just displays an empty box instead of the subscript.
import numpy as np
import matplotlib.pyplot as plt
data = {'CO₆': 15,
'DO₄': 144,
'EO₈': 3,
'FaO₉': 1,
'GO₅': 7,
'Ha₆': 5}
f, ax = plt.subplots(figsize = (40, 4))
bin = np.arange(len(data.keys()))
ax.bar(data.keys(), data.values(), color='brown', align = "center", width = 0.3);
plt.xticks(rotation='vertical');
ax.xaxis.set_tick_params(labelsize = 32);
ax.yaxis.set_tick_params(labelsize = 32);
plt.xlim(-0.5, bin.size-0.5);
The font that you are using most likely does not contain glyphs for those Unicode subscript characters.
Try changing the font; this one works for me:
plt.rcParams['font.sans-serif'] = ['DejaVu Sans']
To use a Serif font:
plt.rcParams['font.family'] = 'serif'
plt.rcParams['font.serif'] = ['DejaVu Serif']
The font that was being used: https://www.fontspace.com/abeezee-font-f30774
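If changing fonts is not an option, another approach (my own suggestion, beyond the original answer) is to translate the Unicode subscript digits into Matplotlib mathtext before setting the labels; mathtext renders subscripts regardless of whether the font has dedicated subscript glyphs:

```python
# Unicode subscript digits U+2080..U+2089; index in the string == digit value
SUBSCRIPTS = "₀₁₂₃₄₅₆₇₈₉"

def to_mathtext(label):
    """Rewrite e.g. 'CO₆' as 'CO$_{6}$' so mathtext renders the subscript."""
    out = []
    for ch in label:
        idx = SUBSCRIPTS.find(ch)
        if idx >= 0:
            out.append(f"$_{{{idx}}}$")  # wrap the digit in a mathtext subscript
        else:
            out.append(ch)
    return "".join(out)

print(to_mathtext("CO₆"))   # CO$_{6}$
print(to_mathtext("FaO₉"))  # FaO$_{9}$
```

The converted labels would then be passed to ax.set_xticklabels (or used in place of the raw dictionary keys when calling ax.bar).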
from PIL import Image, ImageDraw, ImageFont
import numpy as np

font = ImageFont.truetype(*path/to/font*, 300)
font.getsize("1\n\r0\n\r9")  # returns (1080, 280), which is wrong!
image = np.full(shape=(1, 1, 3), fill_value=0, dtype=np.uint8)
image = Image.fromarray(image, mode="RGB")
draw = ImageDraw.Draw(image)
draw.multiline_textsize(text="1\n\r0\n\r9", font=font, spacing=0)  # returns (180, 837), which is correct
Why are the results different? What am I missing?
So the main error was that for multiline text we should use:
PIL.ImageDraw.ImageDraw.multiline_text(xy, text, fill=None, font=None, anchor=None, spacing=0, align="left", direction=None, features=None, language=None)
In addition, .getsize() returned a height that is a little too big. The height that worked for me was:
font.getmask(digit).size[1]
which is the same as:
font.getsize(digit)[1] - font.getoffset(digit)[1]
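One more observation of mine, beyond the original answer: the sample text separates lines with "\n\r", so a stray carriage return ends up at the start of every line after the first, which can further distort measurements made by single-line APIs such as .getsize():

```python
text = "1\n\r0\n\r9"

# Splitting on "\n" shows the stray "\r" carried into each later line
print(text.split("\n"))   # ['1', '\r0', '\r9']

# Plain "\n" separators avoid the stray characters entirely
clean = "1\n0\n9"
print(clean.split("\n"))  # ['1', '0', '9']
```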
I've generated a line chart using Apache POI. There are 400 values on the X axis, and the sheer number of tick marks makes some values unclear. Therefore, I need to remove the tick marks on the X axis. Is there any way to remove them?
My code is as follows.
Drawing drawing = sheet4.createDrawingPatriarch();
ClientAnchor anchor = drawing.createAnchor(0, 0, 0, 0, 1, 1, 17, 22);
Chart chart = drawing.createChart(anchor);
ChartLegend legend = chart.getOrCreateLegend();
legend.setPosition(LegendPosition.RIGHT);
LineChartData data = chart.getChartDataFactory().createLineChartData();
ChartAxis bottomAxis = chart.getChartAxisFactory().createCategoryAxis(AxisPosition.BOTTOM);
ValueAxis leftAxis = chart.getChartAxisFactory().createValueAxis(AxisPosition.LEFT);
leftAxis.setCrosses(AxisCrosses.AUTO_ZERO);
ChartDataSource<Number> xs = DataSources.fromNumericCellRange(sheet1, new CellRangeAddress(1, 380, 0, 0));
ChartDataSource<Number> ys1 = DataSources.fromNumericCellRange(sheet1, new CellRangeAddress(1, 380, 1, 1));
ChartDataSource<Number> ys2 = DataSources.fromNumericCellRange(sheet1, new CellRangeAddress(1, 380, 3, 3));
ChartDataSource<Number> ys3 = DataSources.fromNumericCellRange(sheet1, new CellRangeAddress(1, 380, 4, 4));
ChartDataSource<Number> ys4 = DataSources.fromNumericCellRange(sheet1, new CellRangeAddress(1, 380, 8, 8));
LineChartSeries series1 = data.addSeries(xs, ys1);
series1.setTitle("Value 1");
LineChartSeries series2 = data.addSeries(xs, ys2);
series2.setTitle("Value 2");
LineChartSeries series3 = data.addSeries(xs, ys3);
series3.setTitle("Value 3");
LineChartSeries series4 = data.addSeries(xs, ys4);
series4.setTitle("Value 4");
chart.plot(data, bottomAxis, leftAxis);
XSSFChart xssfChart = (XSSFChart) chart;
CTPlotArea plotArea = xssfChart.getCTChart().getPlotArea();
plotArea.getLineChartArray()[0].getSmooth();
CTBoolean ctBool = CTBoolean.Factory.newInstance();
ctBool.setVal(false);
plotArea.getLineChartArray()[0].setSmooth(ctBool);
for (CTLineSer ser : plotArea.getLineChartArray()[0].getSerArray()) {
ser.setSmooth(ctBool);
}
For newer POI versions, we can remove the markers using:
series1.setMarkerStyle(MarkerStyle.NONE);
Additional Info:
In addition to that, it supports different marker styles:
MarkerStyle.CIRCLE
MarkerStyle.DASH
MarkerStyle.DIAMOND
MarkerStyle.DOT
MarkerStyle.PICTURE
MarkerStyle.PLUS
MarkerStyle.SQUARE
MarkerStyle.STAR
MarkerStyle.TRIANGLE
I'm using library version 4.1.2.
I am trying to set the name of these WTForms fields without having to change the names of the variables themselves. Any help is appreciated.
class AddProblemForm(Form):
problem_point_value_data = [[10, 20, 30], [40, 50, 60], [70, 80, 90, 100], [120, 140, 160, 180, 200]]
problemName = StringField('problemName', _name="problemName", validators=[validators.DataRequired(), validators.Length(min=2, max=64)])
problem_description = PageDownField('problemDescription', validators=[validators.DataRequired(), validators.Length(min=10, max=512)])
flag = StringField('flag', validators=[validators.DataRequired(), validators.Length(min=5, max=64)])
problem_difficulty = SelectField('problemDifficulty', choices=[(x, x) for x in range(4)], coerce=int)
problem_points = SelectField('problemPoints', choices=[(x, diff) for x,diff in zip(range(len(app.config['POINTS'])), app.config['POINTS'])], coerce=int)
problem_category = SelectField('problemCategory', choices=[(x, cat) for cat,x in zip(app.config['CATEGORIES'], range(len(app.config['CATEGORIES'])))], coerce=int)
problem_solution = TextAreaField('problemSolution', validators=[validators.DataRequired()])
The problem here (and yes, that pun is absolutely intended!) is that you have defined a field with problemName = StringField(). This field is not called anything other than 'problemName'.
The text label is a different matter; as Moses has commented, your HTML can display anything.
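To illustrate why the attribute name becomes the field's name, here is a stripped-down sketch of the binding mechanism (a toy metaclass of my own, not WTForms' actual implementation): the form's metaclass walks the class attributes and hands each field the attribute name it was assigned to.

```python
class Field:
    """Toy stand-in for a WTForms field."""
    def __init__(self, label):
        self.label = label
        self.name = None  # filled in when the owning form class is created

class FormMeta(type):
    """Toy form metaclass: binds each Field to its attribute name."""
    def __new__(mcs, clsname, bases, attrs):
        for attr_name, value in attrs.items():
            if isinstance(value, Field):
                value.name = attr_name  # the variable name becomes the field name
        return super().__new__(mcs, clsname, bases, attrs)

class DemoForm(metaclass=FormMeta):
    problemName = Field('Problem Name')

print(DemoForm.problemName.name)   # problemName
print(DemoForm.problemName.label)  # Problem Name
```

The label, by contrast, is just data stored on the field, which is why it can say anything without affecting the submitted form data's keys.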