Using shaders to compute lighting?

Whatthefuck
Party member
Posts: 106
Joined: Sat Jun 21, 2014 3:45 pm

Using shaders to compute lighting?

Post by Whatthefuck »

Hey everyone.

I'm working on a game and I had a question about lighting - is it possible to compute it on the GPU using shaders?
Right now the way it works is it builds a sprite batch and performs a draw call.
It works great and is very fast (mainly because I'm obsessed with optimization and always go to great lengths to find more efficient ways of doing the same task). Even though computing the lighting on the CPU is very fast, I'd still like to know whether it's possible to do it on the GPU.
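Roughly, the CPU pass works like this (a simplified sketch with made-up names, not my actual code; each tile is tinted by the average brightness of its 3x3 neighborhood):

[code]
-- Sketch only: `map`, `tile.brightness`, `quad` and `tileSize` are placeholders.
local function tileBrightness(map, x, y)
    local sum, count = 0, 0
    for dy = -1, 1 do
        for dx = -1, 1 do
            local row = map[y + dy]
            local tile = row and row[x + dx]
            if tile then
                sum = sum + tile.brightness
                count = count + 1
            end
        end
    end
    return count > 0 and sum / count or 0
end

local function rebuildBatch(batch, map, quad, tileSize)
    batch:clear()
    for y = 1, #map do
        for x = 1, #map[y] do
            local b = tileBrightness(map, x, y)
            batch:setColor(b, b, b, 1) -- colors are 0..1 in LÖVE 11, 0..255 in 0.9
            batch:add(quad, (x - 1) * tileSize, (y - 1) * tileSize)
        end
    end
end
[/code]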

Here's what it looks like at the moment:

[screenshot: the game's current CPU-computed lighting]

Thanks in advance.
Zilarrezko
Party member
Posts: 345
Joined: Mon Dec 10, 2012 5:50 am
Location: Oregon

Re: Using shaders to compute lighting?

Post by Zilarrezko »

Yes, LÖVE has support for shaders. It uses a variant of GLSL 1.20 that differs a bit from the standard: some variable names are different, and there's a dedicated entry function for each shader type (fragment and vertex).
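For example, a minimal fragment shader in LÖVE looks like this (the `effect` function replaces GLSL's `main`, and `Texel` replaces `texture2D`):

[code]
// LÖVE calls effect() once per fragment. `color` is the current
// love.graphics.setColor value, `texture` the Image being drawn,
// `texture_coords` its UVs and `screen_coords` the pixel position.
vec4 effect(vec4 color, Image texture, vec2 texture_coords, vec2 screen_coords)
{
    vec4 pixel = Texel(texture, texture_coords);
    return pixel * color; // default behavior: just draw the texture
}
[/code]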

However, I'm having trouble understanding it... somewhat. It feels like beating a dead horse, trying to get an answer on how to work around LÖVE's differences for normals, per-vertex data and the like. I was trying to do something very similar to what's in that picture; I believe it was per-fragment lighting.
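The sort of per-fragment lighting I was attempting boils down to something like this (a rough sketch, not working code from a project; `light_pos` and `light_radius` are made-up uniforms you'd send from Lua, and LÖVE spells `uniform` as `extern`):

[code]
extern vec2 light_pos;     // light position in screen pixels (sent from Lua)
extern float light_radius; // falloff radius in pixels

vec4 effect(vec4 color, Image texture, vec2 texture_coords, vec2 screen_coords)
{
    vec4 pixel = Texel(texture, texture_coords);
    float dist = length(screen_coords - light_pos);
    float brightness = clamp(1.0 - dist / light_radius, 0.0, 1.0);
    return vec4(pixel.rgb * brightness, pixel.a) * color;
}
[/code]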

There's a good tutorial series that got me started learning about shaders. Sadly, (a) he never really finished it, and (b) it doesn't teach you how to work around LÖVE's differences from standard GLSL.

You can find LÖVE's shader variables here. But since that never really helped me learn how they correspond to standard GLSL variables, I used this.

I've had pretty much no real luck with it, and getting people to help seems to be less successful than asking about a subject where someone can just say "there's a library for that".

As for what you're asking, is it possible to do it on the GPU? That's exactly what shaders do.

Sorry I can't be more helpful, but shaders seem to be a difficult thing to get into, at least for me and my ability to understand what slime says [url=https://love2d.org/forums/viewtopic.php?f=4&t=78250]here[/url]. But yeah, LÖVE shaders. There's also a nice little forum post of everyone showing off their skills here, which you've probably already seen.
Ranguna259
Party member
Posts: 911
Joined: Tue Jun 18, 2013 10:58 pm
Location: I'm right next to you

Re: Using shaders to compute lighting?

Post by Ranguna259 »

Check out this light and shadow lib if you don't feel like reinventing the wheel :P

EDIT: I don't think this lib produces the light effect shown in the picture you posted, though. I recommend looking into the links Zilarrezko posted; it shouldn't be hard to produce the effect you want.
Whatthefuck
Party member
Posts: 106
Joined: Sat Jun 21, 2014 3:45 pm

Re: Using shaders to compute lighting?

Post by Whatthefuck »

Ranguna259 wrote: Check out this light and shadow lib if you don't feel like reinventing the wheel :P

EDIT: I don't think this lib produces the light effect shown in the picture you posted, though. I recommend looking into the links Zilarrezko posted; it shouldn't be hard to produce the effect you want.
Did you guys even read what I posted in the OP? The image in the OP shows the lighting I have in the game right now, and I'm asking whether it's possible to compute that lighting on the GPU.
Ranguna259
Party member
Posts: 911
Joined: Tue Jun 18, 2013 10:58 pm
Location: I'm right next to you

Re: Using shaders to compute lighting?

Post by Ranguna259 »

...it shouldn't be hard to produce the effect you want in GLSL.
Sorry, I forgot those two words. More words: you can, use [wiki]Shader[/wiki]s.
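The bare minimum wiring looks roughly like this ("lighting.glsl" and `spriteBatch` are placeholder names):

[code]
-- Minimal sketch of hooking a shader into the draw loop.
local lightShader

function love.load()
    lightShader = love.graphics.newShader("lighting.glsl")
end

function love.draw()
    love.graphics.setShader(lightShader) -- everything drawn now goes through it
    love.graphics.draw(spriteBatch)
    love.graphics.setShader()            -- back to the default shader
end
[/code]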
Zilarrezko
Party member
Posts: 345
Joined: Mon Dec 10, 2012 5:50 am
Location: Oregon

Re: Using shaders to compute lighting?

Post by Zilarrezko »

Alright, sorry; you said something about a sprite batch and that got me confused (sad, right?).

Shaders already run on the GPU when computing. If they ran on CPUs, everything would be fairly slow, because CPUs aren't really made for per-pixel video work like shading.
Whatthefuck
Party member
Posts: 106
Joined: Sat Jun 21, 2014 3:45 pm

Re: Using shaders to compute lighting?

Post by Whatthefuck »

Ranguna259 wrote:
...it shouldn't be hard to produce the effect you want in GLSL.
Sorry, I forgot those two words. More words: you can, use [wiki]Shader[/wiki]s.
That's all I needed to know. Code-wise, would it be pretty much the same as what I do on the CPU right now, which is to check the 9 nearby blocks and then calculate the average brightness value?
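In other words, something like this on the GPU (just a sketch of what I have in mind; `brightness_map` would be a hypothetical texture holding one brightness value per block, and `texel` the size of one of its texels in UV space, both sent from Lua):

[code]
extern Image brightness_map; // hypothetical: one texel per block
extern vec2 texel;           // size of one texel in UV coordinates

vec4 effect(vec4 color, Image texture, vec2 texture_coords, vec2 screen_coords)
{
    // Average the 3x3 neighborhood, assuming the brightness map is
    // mapped over the same UVs as the sprite being drawn.
    float sum = 0.0;
    for (int dy = -1; dy <= 1; dy++)
        for (int dx = -1; dx <= 1; dx++)
            sum += Texel(brightness_map, texture_coords + vec2(dx, dy) * texel).r;
    float brightness = sum / 9.0;
    return Texel(texture, texture_coords) * vec4(vec3(brightness), 1.0) * color;
}
[/code]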
Ranguna259
Party member
Posts: 911
Joined: Tue Jun 18, 2013 10:58 pm
Location: I'm right next to you

Re: Using shaders to compute lighting?

Post by Ranguna259 »

There are no "blocks" in shaders; when you program in GLSL you are essentially computing a value for every pixel on the screen. Check Zilarrezko's post to get started. Shaders are no easy thing, so be prepared :P
Whatthefuck
Party member
Posts: 106
Joined: Sat Jun 21, 2014 3:45 pm

Re: Using shaders to compute lighting?

Post by Whatthefuck »

Ranguna259 wrote: There are no "blocks" in shaders; when you program in GLSL you are essentially computing a value for every pixel on the screen. Check Zilarrezko's post to get started. Shaders are no easy thing, so be prepared :P
I know. However, it's possible to send tables to the GPU and do the same calculations there, if I'm not mistaken, right?

Also, how expensive would it be to send a table containing 256 indexes?
Basically, what I'm wondering is whether it's possible to do something in the vein of GPGPU in LÖVE.
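For instance, I imagine packing the values into an image and sending that (untested sketch; `blockLight` and `shader` are placeholder names):

[code]
-- Pack 256 per-block values into a 256x1 ImageData, then send the
-- resulting Image to the shader as a texture.
local data = love.image.newImageData(256, 1)
for i = 0, 255 do
    local b = blockLight[i + 1]     -- hypothetical table of 256 values in 0..1
    data:setPixel(i, 0, b, b, b, 1) -- components are 0..1 in LÖVE 11, 0..255 in 0.9
end
local lightTex = love.graphics.newImage(data)
lightTex:setFilter("nearest", "nearest") -- one texel per block, no smoothing
shader:send("brightness_map", lightTex)
[/code]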
Ranguna259
Party member
Posts: 911
Joined: Tue Jun 18, 2013 10:58 pm
Location: I'm right next to you

Re: Using shaders to compute lighting?

Post by Ranguna259 »

I don't think GLSL accepts tables that have more than 4 indexes (a vec4).
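(Though I might be wrong there; GLSL 1.20 does have uniform arrays, so something like this might work. Untested sketch:)

[code]
-- In the shader:  extern float brightness[256];
-- From Lua, Shader:send may be able to fill the whole array at once:
local values = {}
for i = 1, 256 do values[i] = i / 256 end -- dummy data
shader:send("brightness", unpack(values))
[/code]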