WebGL wind and GPU programming. Lecture at FrontTalks 2018

To draw complex graphics on web pages there is the Web Graphics Library, abbreviated WebGL. Interface developer Dmitry Vasilyev spoke about GPU programming from a front-end developer's point of view: what WebGL is, and how this technology solved the problem of visualizing large amounts of weather data.
— I develop interfaces at the Yekaterinburg office of Yandex. I started in the Sports group: we built sports special projects for the World Hockey and Football Championships, the Olympics, the Paralympics and other big events. I also worked on a special search results page dedicated to the new Sochi highway.

(Link on the slide — Ed.)

We looked at people's reviews, and people liked it. All kinds of memes began to appear, and there were great pictures when Moscow got flooded. Since everyone liked the format, we decided to move on and devote the next forecast to the wind.

Services that show a wind forecast already exist; here are a couple of especially prominent ones.

If we consider that we also need to move the particles and add some beauty, like drawing trails behind them, we can assume that we have a budget of at most 40 ms per frame, to keep the animation smooth at a minimum of 25 frames per second.

The problem is that on the CPU each particle would be processed sequentially. What if we process them in parallel?

The visual difference between a CPU and a GPU was shown by the MythBusters at a conference. They rolled out a machine with a paintball marker mounted on it whose task was to draw a smiley in one color. In about 10 seconds it drew a picture like this. (Video link — Ed.)
Then the guys rolled out a cannon, representing the GPU, which painted the Mona Lisa in a couple of shots. That is roughly how the computing speed of a CPU and a GPU differs.

WebGL was invented to take advantage of these capabilities in the browser.

What is it? With that question I went to the internet. Having added a couple of words about particle animation and wind, I found a couple of articles.

(Links on the slide: the first, the second — Ed.)

One of them is a demo by Vladimir Agafonkin, an engineer at Mapbox, who made exactly this — wind on WebGL — and referred to the blog of Chris Wellons, who described how to move and store particles on the GPU.

We take it and copy it, expecting this result, where the particles move smoothly.

We try to understand the code, improve it, and again get an unsatisfactory result. We dig even deeper — and get rain instead of wind.

There are frameworks for working with WebGL, but almost all of them are aimed at 3D objects. We don't need those 3D features: we only need to draw a particle and move it. So we decided to do everything by hand.

We settled on the old, proven WebGL 1, which has good support everywhere except Opera Mini, which nobody needs.

WebGL is a thing of two parts.
There is the JS side, which sets up the state for the programs running on the graphics card, and there are the components that run directly on the graphics card.

Let's start with JS. WebGL is just another context of the canvas element. Moreover, when you obtain this context, it is not just an object that gets allocated — hardware resources are allocated too. If we run something beautiful on WebGL in the browser and then decide to play Quake, it is quite possible that those resources will be taken away, the context will be lost, and your whole program will break.

Therefore, when working with WebGL, you also need to listen for context loss and be able to restore it. That is why I emphasized that there is an init step.

In code it looks like this. A shader has a section for declaring variables that are set from the outside, from JS — their type and name are specified — and a main section that executes the code needed on each iteration.

The vertex shader is in most cases expected to set the gl_Position variable to a coordinate in four-dimensional space: x, y, z and the w component, which you don't really need to understand at the moment.

The fragment shader is expected to set the color of a particular pixel. In this example, our pixel color is sampled from an attached texture.

To move this into JS, it is enough to wrap the shader source code in variables.

The fragment shader operates in a range from 0 to 1,
and colors here are given as four components: R, G, B, Alpha. You may have come across a similar color notation in CSS if you use percentages.

Moving on to JS. We assign the same shaders to variables and declare a function that will create these shaders: a shader is created, the source is loaded into it, and then it is compiled.

We make two shaders, vertex and fragment.

After that we create a program and attach the already compiled shaders to it. We link the program, because shaders can exchange variables with each other, and at this stage the types of the variables exchanged by the shaders are checked for consistency.

Then we tell WebGL to use this program.

Next we create the list of vertices that we want to render. WebGL has an interesting peculiarity for some kinds of data: to change data of a particular type, you need to set a global editing target, ARRAY_BUFFER, and then upload something to that target. There is no explicit assignment of data to a variable — everything is done by enabling a certain context.

You must also set the rules for reading from this buffer. Notice that we specified an array of six elements, but the program needs to be told that each vertex consists of two components of type float; this is done in the last line.

(Link on the slide — Ed.)
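As a rough sketch of the steps just described — the function names and shader sources here are mine, not taken from the talk's slides:

```javascript
// Minimal WebGL 1 pipeline: two shaders, a program, a vertex buffer.
// All names (vertexSource, compileShader, createProgram, uploadVertices)
// are illustrative.

const vertexSource = `
attribute vec2 a_position;                     // set from JS per vertex
void main() {
  gl_Position = vec4(a_position, 0.0, 1.0);    // x, y, z, w
}`;

const fragmentSource = `
precision mediump float;
void main() {
  gl_FragColor = vec4(1.0, 1.0, 1.0, 1.0);     // R, G, B, Alpha in 0..1
}`;

// Create a shader, load the source into it, then compile.
function compileShader(gl, type, source) {
  const shader = gl.createShader(type);
  gl.shaderSource(shader, source);
  gl.compileShader(shader);
  if (!gl.getShaderParameter(shader, gl.COMPILE_STATUS)) {
    throw new Error(gl.getShaderInfoLog(shader));
  }
  return shader;
}

// Attach both shaders and link; the types of shared variables are
// checked at the link step.
function createProgram(gl, vertexSrc, fragmentSrc) {
  const program = gl.createProgram();
  gl.attachShader(program, compileShader(gl, gl.VERTEX_SHADER, vertexSrc));
  gl.attachShader(program, compileShader(gl, gl.FRAGMENT_SHADER, fragmentSrc));
  gl.linkProgram(program);
  gl.useProgram(program);
  return program;
}

// Upload six floats (three 2D vertices) through the ARRAY_BUFFER target,
// then explain the layout: two float components per vertex.
function uploadVertices(gl, program) {
  const vertices = new Float32Array([-1, -1, 1, -1, 0, 1]);
  gl.bindBuffer(gl.ARRAY_BUFFER, gl.createBuffer());
  gl.bufferData(gl.ARRAY_BUFFER, vertices, gl.STATIC_DRAW);
  const loc = gl.getAttribLocation(program, 'a_position');
  gl.enableVertexAttribArray(loc);
  gl.vertexAttribPointer(loc, 2, gl.FLOAT, false, 0, 0);
}
```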
If we complicate the example slightly, we can make the color depend on the cursor position. The fps stays sky-high the whole time.

To draw particles all over the world, you need to know the wind speed at every point of this world.

To let the map zoom and pan, you need to create containers that correspond to the current position of the map.

To move the particles themselves, you need to come up with a data format that can be updated by the graphics processor, implement the drawing, and loop the drawing itself.

We pass all the data through a texture. We use two channels to encode the horizontal and vertical speeds, where zero wind speed corresponds to the middle of the color range, that is, around 128. Since the speed can be negative or positive, the value is set relative to the middle of the range.

This is the picture we get.

To put it on the map, it has to be cut into tiles. To attach the image to the map, we use the standard Yandex.Maps Layer tool, in which we define the address the cut tiles are served from, and add this layer to the map.

(Link on the slide — Ed.)
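A plain-JS sketch of that two-channel speed encoding. The clamp value of 50 m/s and the helper names are my assumptions for illustration; the talk does not give exact limits:

```javascript
// Encode one signed wind-speed component into a 0..255 color channel,
// with zero speed mapped to the middle of the range (~128).
// MAX_SPEED is an assumed clamp value, not a figure from the talk.
const MAX_SPEED = 50; // m/s

function encodeSpeed(speed) {
  const clamped = Math.max(-MAX_SPEED, Math.min(MAX_SPEED, speed));
  return Math.round((clamped / MAX_SPEED) * 127 + 128);
}

function decodeSpeed(channel) {
  return ((channel - 128) / 127) * MAX_SPEED;
}
```

One channel of the texture holds the horizontal component, the other the vertical one, so one RGBA texel can describe the wind vector at a point.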
We get a picture where this unpleasant green color is the encoded wind speed.

Next, we need a place in which to draw the animation itself, and this place has to follow the coordinates of the map, its movements and other actions.

By default you would expect to use a Layer, but the map's Layer creates a canvas from which it immediately grabs the 2D context. If we take that canvas, which already has a context of another type, and ask it for a GL context, we end up with null. As soon as we touch it, the program crashes.

(Link on the slide — Ed.)
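The pitfall can be made explicit with a small guard (the function name is mine): a canvas only ever hands out one kind of context, so asking a canvas that already has a 2D context for a WebGL one returns null.

```javascript
// A canvas hands out only one kind of context. If getContext('2d') has
// already been called on it, getContext('webgl') returns null, and any
// method call on that null crashes the program.
function getGLContext(canvas) {
  const gl = canvas.getContext('webgl') || canvas.getContext('experimental-webgl');
  if (!gl) {
    throw new Error('WebGL context unavailable: the canvas may already have a 2D context');
  }
  return gl;
}
```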
Therefore we used a Pane — these are containers for layers — added our own canvas there, and took the context we needed from it.

To place the particles on the screen and be able to move them, we store the particle positions in a texture.

How does it work? For optimization a square texture is created, so the size of its side is known.

In the shader it looks like this: we read the particle index, the texture with the current particle positions, and the size of the texture's side. Then we determine the x, y coordinates of this particle in the texture, read the value there, and decode it. What is this magic: rg / 255 + ba?

For the particle positions we use two channels per coordinate. A color channel holds a value from 0 to 255, and on a 1080-pixel screen we could not place a particle at an arbitrary position, because a single channel only reaches 255 pixels. So in one channel we store how many times the particle has passed 255 pixels, and in the second channel we store the exact remainder.

Then the vertex shader converts these values into its working space, from -1 to 1, and places the point on the display.

To simply look at our particles, it is enough to paint them white. GLSL has a bit of sugar: if we declare a variable's type and pass it a single constant, that constant is broadcast across all four components, for example.
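The rg / 255 + ba trick above can be checked in plain JS (the helper names are mine). Per coordinate, one byte stores the coarse part and another the fine remainder, giving roughly 16 bits of precision:

```javascript
// Split a normalized coordinate (0..1) into two bytes: a coarse byte
// (how many 1/255 steps have been passed) and a fine byte (the remainder).
// In the shader this decodes as rg / 255.0 + ba, with each channel read
// back as a float in 0..1. Helper names are illustrative.
function encodeCoord(pos) {
  const scaled = pos * 255;
  const coarse = Math.floor(scaled);                 // one channel
  const fine = Math.round((scaled - coarse) * 255);  // the other channel
  return [fine, coarse];
}

function decodeCoord([fine, coarse]) {
  // Channels arrive in the shader normalized to 0..1, i.e. divided by 255.
  return (fine / 255) / 255 + coarse / 255;
}
```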
Having run such a program, we see a set of identical squares. Let's try to add some beauty to them.

We calculate the distance from the pixel being drawn to the center of the point. If it exceeds half of the point's space, we don't show the pixel.

We get a more varied picture.

Next, these things need to be moved somehow. Since WebGL 1 cannot compute anything on its own or work with data directly, we use the ability to render programs into special components — framebuffers.

A framebuffer can have, for example, a texture bound to it, and that texture can then be updated. If no framebuffer is bound, drawing goes to the screen by default.

By switching the output from one position texture to the other, we can update them in turn and then use them for drawing.

In this example we calculate the predicted position of the particle and a random position, and choose one or the other depending on the reset factor.

(Links on the slide: the first, the second, the third, the fourth — Ed.)

To understand what was on the last slide, you can read these articles. The first one gives a huge boost in understanding what WebGL provides, what it consists of, and how not to make mistakes in it. And Khronos — the group that develops the standard — has a description of all the functions.

The last point of our task is to draw a trail behind the particles.
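The update pass just described — a predicted position versus a random respawn, chosen by a reset factor — can be sketched outside the shader in plain JS. The dropRate value and the function name are my assumptions:

```javascript
// One update step for a single particle, mirroring the logic of the
// update shader: either advance the particle by the wind vector, or
// respawn it at a random position. The dropRate value is illustrative,
// not a number from the talk.
const dropRate = 0.003;

function updateParticle(pos, wind, random = Math.random) {
  const predicted = { x: pos.x + wind.u, y: pos.y + wind.v };
  const respawn = { x: random(), y: random() };
  // The shader makes the same choice branchlessly with step()/mix().
  return random() < dropRate ? respawn : predicted;
}
```

Respawning a small fraction of particles each frame keeps the field evenly covered: without it, particles pile up in places where the wind converges.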
To do this, just as with the position textures, we use two screen textures: we draw the current picture, slightly increasing its transparency, overlay the new positions of the particles on top, then again increase the transparency of this image and draw the new positions over it once more.

We get this animated trail.

If you compare the full WebGL drawing cycle with drawing the same points on a 2D canvas, you can see a big gap in speed. Drawing 64 thousand points on a 2D canvas takes on average 25 ms, while WebGL blocks the main thread for 0.3 ms. That is a difference of about a hundred times.

So using WebGL lets you block the main thread less, and this is especially important when working with a map, where the responsiveness of the map itself matters a lot.

All developers are probably used to using the browser console: putting breakpoints and console.logs and seeing what happens inside. WebGL, though, is a black box.

But there are tools that let you work with it. For example, Firefox has a built-in shader editor tab, where you can find the WebGL canvases on the page, extract the programs from them, extract the shaders from those, and change values on the fly. Here, for example, the color of the dots turns from white to blue live.

The second tool that makes life much easier is the Spector.js browser extension.
It also captures the canvas with the WebGL context and lets you see all the operations performed on that canvas, their timings, and the variables passed in.

In total, in a week of work we got a ready-made solution for the wind from scratch. I hope I managed to tell you what kind of technology WebGL is, what it consists of, and to give a real example of its use in production. That's all.