 Post subject: Requesting New Features
PostPosted: Tue, 04-09-07, 18:38 GMT 
Site Admin

Joined: Fri, 31-08-07, 7:01 GMT
Posts: 4514
Location: Hamburg, Germany
Yes, please: which further features do we need?

Bye Fridger

 Post subject:
PostPosted: Tue, 25-09-07, 20:53 GMT 
Site Admin

Joined: Fri, 31-08-07, 7:01 GMT
Posts: 4514
Location: Hamburg, Germany
A number of scientific data sets for planetary textures are released in the form of smaller data tiles (e.g. Mars, ...). It seems to me that an important extension of the present F-TexTools set could be a flexible tool for merging the tiles together into ONE texture.

Please let me know your ideas and feature requests!

Thanks,
Bye Fridger

 Post subject:
PostPosted: Tue, 25-09-07, 21:18 GMT 
Site Admin

Joined: Thu, 30-08-07, 22:52 GMT
Posts: 2726
Location: France, South, not far from Montpellier
t00fri wrote:
A number of scientific data sets for planetary textures are released in the form of smaller data tiles (e.g. Mars, ...). It seems to me that an important extension of the present F-TexTools set could be a flexible tool for merging the tiles together into ONE texture...


That would be gorgeous!

Also, as pointed out by DW, in a year or so we should try not to forget the new Moon data from the Japanese mission...


 Post subject:
PostPosted: Fri, 28-09-07, 2:14 GMT 

Joined: Tue, 04-09-07, 2:32 GMT
Posts: 430
Location: South Korea
I've been having a hard time trying to find a suitable set of Mars HRSC tiles to download.
Somehow the official data is organized by "orbit number", which does not appear to map to geographical coordinates. I've also tried the HRSC online image viewer (which does provide coordinate-referenced data), but it doesn't seem to let me batch-download, say, all tiles.

So maybe an improvement for the texture tools might be, say, a list of download URLs in the documentation, or some sort of script that offers a regularly updated library of URLs (hosted on the CM server), so that users can download image data more conveniently.


 Post subject:
PostPosted: Sun, 30-09-07, 12:12 GMT 

Joined: Tue, 04-09-07, 2:32 GMT
Posts: 430
Location: South Korea
This isn't exactly a feature request for F-TexTools or nmtools per se but..
Wouldn't it be nice if the nvidia texture tools (which convert png etc -> dds) also accepted stdin input so the output from F-TexTools could just be piped in? This way, we could have raw data -> pow2/half -> tiles -> dds all in one go. Sounds good, no?


 Post subject:
PostPosted: Sun, 30-09-07, 12:33 GMT 
Site Admin

Joined: Fri, 31-08-07, 7:01 GMT
Posts: 4514
Location: Hamburg, Germany
dirkpitt wrote:
This isn't exactly a feature request for F-TexTools or nmtools per se but..
Wouldn't it be nice if the nvidia texture tools (which convert png etc -> dds) also accepted stdin input so the output from F-TexTools could just be piped in? This way, we could have raw data -> pow2/half -> tiles -> dds all in one go. Sounds good, no?


DW,

oh yes, it sounds good, and I have been thinking about this option all along ;-) . From my point of view, there are essentially two significant 'CON' arguments at this time, besides the many 'PROS':

1) Without Ignacio C. implementing the STDIN/STDOUT option himself, we would have to do it ourselves, but we would then lose the "off-the-shelf" advantage with respect to the nvidia-tools. Given the rapid development of the nvidia-tools, this seemed a bit early to me. So perhaps we can just convince Ignacio ...

2) Even if Ignacio implements STDIN/STDOUT redirection, we still need to pass the different tile names tx_i_j.dds to nvcompress more than 2048 times, /automatically/. This is again easy if we incorporate the nvidia code into the F-TexTools and the NmTools in the form of a compression library. But if not, it looks problematic without extensive (OS-dependent) shell scripting.

Do you have a workaround for this?
In any case, I am all for discussing the PROS and CONS further!

Cheers,
Fridger

 Post subject:
PostPosted: Mon, 01-10-07, 7:05 GMT 

Joined: Tue, 04-09-07, 2:32 GMT
Posts: 430
Location: South Korea
t00fri wrote:
2) Even if Ignacio implements STDIN/STDOUT redirection, we still need to pass the different tile names tx_i_j.dds to nvcompress more than 2048 times, /automatically/.


Very good point.. It might be possible, though, to run nvcompress from within nmtiles by exec'ing nvcompress and establishing a pipe for each tile. While this would require nmtiles/F-TexTools to know the path of the nvcompress tool and its command-line options (and even those could be specified via, say, an environment variable), it would not require compiling in the nv tools as a library, and it would not require any shell scripting. This method would even work without modifying nvcompress to accept stdin (nmtiles writes a temporary PNG, nvcompress converts it to DDS, and nmtiles deletes the temporary PNG afterwards).
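
For illustration, here is a minimal C sketch of the temporary-file variant described above: for each tile, write the PNG, invoke an external nvcompress binary, then delete the PNG again. The NVCOMPRESS environment variable, the tile-name pattern and the -bc1 flag are assumptions made for this sketch only, not anything the existing tools do; a real integration would also have to take care of path quoting on Windows.

Code:
        /* Sketch only: compress one tile by invoking an external nvcompress,
         * as proposed above.  nmtiles would first write the tile into png_name. */
        #include <stdio.h>
        #include <stdlib.h>

        static int compress_tile(const char *nvcompress, int i, int j)
        {
            char png_name[64], dds_name[64], cmd[512];

            snprintf(png_name, sizeof png_name, "tx_%d_%d.png", i, j);
            snprintf(dds_name, sizeof dds_name, "tx_%d_%d.dds", i, j);

            /* ... here nmtiles would write the tile data into png_name ... */

            /* -bc1 is only a placeholder for whatever compression flag is wanted */
            snprintf(cmd, sizeof cmd, "\"%s\" -bc1 %s %s",
                     nvcompress, png_name, dds_name);
            if (system(cmd) != 0)
                return -1;

            remove(png_name);               /* delete the temporary PNG again */
            return 0;
        }

        int main(void)
        {
            /* the nvcompress location could come from an environment variable */
            const char *nvc = getenv("NVCOMPRESS");
            if (!nvc)
                nvc = "nvcompress";
            return compress_tile(nvc, 0, 0) ? 1 : 0;
        }

With real STDIN/STDOUT support in nvcompress, the temporary file and the remove() call would disappear and the tile data could be piped directly.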


 Post subject:
PostPosted: Mon, 01-10-07, 7:15 GMT 
Site Admin

Joined: Fri, 31-08-07, 7:01 GMT
Posts: 4514
Location: Hamburg, Germany
dirkpitt wrote:
t00fri wrote:
2) Even if Ignacio implements STDIN/STDOUT redirection, we still need to pass the different tile names tx_i_j.dds to nvcompress more than 2048 times, /automatically/.


Very good point.. It might be possible, though, to run nvcompress from within nmtiles by exec'ing nvcompress and establishing a pipe for each tile. While this would require nmtiles/F-TexTools to know the path of the nvcompress tool and its command-line options (and even those could be specified via, say, an environment variable), it would not require compiling in the nv tools as a library, and it would not require any shell scripting. This method would even work without modifying nvcompress to accept stdin (nmtiles writes a temporary PNG, nvcompress converts it to DDS, and nmtiles deletes the temporary PNG afterwards).


Of course, that could be the elegant solution... except that we have to realize it in a cross-platform manner. Let's think more about this option; I really like it.

Cheers,
Fridger

 Post subject:
PostPosted: Fri, 28-03-08, 9:19 GMT 

Joined: Sat, 15-03-08, 12:41 GMT
Posts: 6
I was thinking it should be fairly easy to convert a bump map to a 16 (or 8) bit elevation map.
If you say that the average height of a map is gray value 128, which corresponds to an elevation of 0, it should be possible to calculate the corresponding values, where black (0) is -65536 and white is +65536.

After converting the bump map to a binary elevation map, you can convert this elevation map to a normal map with the nmtools.

This could be handy if you would like, for instance, to convert an existing Moon bump map or DEM to a normal map.
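
For what it's worth, a minimal C sketch of the mapping described above, assuming raw 8-bit grayscale samples on stdin and signed 16-bit little-endian samples on stdout. The (gray - 128) * 256 scale is only an assumption (a signed 16-bit value can hold -32768 .. +32767, not +/-65536), and a real tool would have to match whatever byte order and vertical scale the nmtools actually expect; on Windows, stdin/stdout would also have to be switched to binary mode.

Code:
        /* Sketch only: map 8-bit grayscale bump values to signed 16-bit
         * "elevations" with gray 128 -> 0.  Scale and byte order are
         * assumptions, not the nmtools' actual conventions. */
        #include <stdio.h>

        int main(void)
        {
            int c;
            while ((c = getchar()) != EOF) {
                int elev = (c - 128) * 256;              /* -32768 .. +32512 */
                unsigned u = (unsigned)elev & 0xffffu;   /* two's-complement bits */
                putchar(u & 0xff);                       /* low byte  */
                putchar((u >> 8) & 0xff);                /* high byte */
            }
            return 0;
        }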


 Post subject:
PostPosted: Fri, 28-03-08, 18:21 GMT 
Site Admin

Joined: Fri, 31-08-07, 7:01 GMT
Posts: 4514
Location: Hamburg, Germany
CAP-Team wrote:
I was thinking it should be fairly easy to convert a bump map to a 16 (or 8) bit elevation map.
If you say that the average height of a map is gray value 128, which corresponds to an elevation of 0, it should be possible to calculate the corresponding values, where black (0) is -65536 and white is +65536.

After converting the bump map to a binary elevation map, you can convert this elevation map to a normal map with the nmtools.

This could be handy if you would like, for instance, to convert an existing Moon bump map or DEM to a normal map.



Cap-Team,

your arguments are beside the point. The main challenge is that normal maps have to be generated from VERY smooth elevation or bump maps. Only for "baby" resolutions like 2k (for which my tools are really a tremendous overkill) could you sensibly use 8-bit grayscale bump or elevation maps as input. Of course I have a conversion tool, 'png2bin' (the inverse of bin2png), as part of my F-TexTools. So if you really don't believe me, take a grayscale .PNG elevation map, e.g. of Mars, convert it to 8-bit bin format and calculate a normal map with my nmtools. But you will be disappointed, because of the noisiness.

You cannot cheat in the way you propose and make 16-bit binary data from 8-bit data ;-) . The existing Moon elevation maps (based on DATA) are ALL of pretty bad quality. So no matter what you do, they are NO fun. ;-)

DEMs you can write out without quality compromise into 16-bit binary format with the help of the ISIS3 tools. So they're fine.

F.


 Post subject:
PostPosted: Tue, 01-04-08, 13:31 GMT 

Joined: Sat, 15-03-08, 12:41 GMT
Posts: 6
I already tried this, but the NMS executable only reads 16-bit height maps, not 8-bit height maps.
I thought it was fun at least to try this.

Is there any way to force the nms tools to interpret an 8-bit bin file as such (and not as a 16-bit bin file)?


 Post subject:
PostPosted: Tue, 01-04-08, 16:56 GMT 
Site Admin

Joined: Fri, 31-08-07, 7:01 GMT
Posts: 4514
Location: Hamburg, Germany
CAP-Team wrote:
I already tried this, but the NMS executable only reads 16-bit height maps, not 8-bit height maps.
I thought it was fun at least to try this.

Is there any way to force the nms tools to interpret an 8-bit bin file as such (and not as a 16-bit bin file)?


Of course, but this is definitely NOT a target for nmtools applications. The nmtools are exclusively designed for producing highest-quality normal maps from scientific raw data of very high resolution.

There must be many low-quality tools out there that can handle what you are interested in.

F.


 Post subject:
PostPosted: Sun, 06-04-08, 0:18 GMT 

Joined: Tue, 01-04-08, 23:47 GMT
Posts: 3
Some tools that would be useful:

* An application "pngsize" that returns the width of a PNG file as an error level (0 = error (too small, invalid aspect ratio or other problem), 1 = 1024..2047, 2 = 2048..4095, etc.), i.e. error level = int(log2(width) - log2(512)). A sketch of such a tool is given at the end of this post.
* An application "png2pow2", which is like tx2pow2, except that it works with PNG files. There is no need to specify the width or height with this application, because these are encoded in the PNG. It outputs the binary format.

If implemented, these would make it possible to write a universal script that can work with any size of PNG file, instead of only with those that fall within a certain size range. At present, a weakness of the F-TexTools package is that the size of the source texture must be known in advance. With these tools we could overcome this limitation: we can test the value of pngsize and then branch to the appropriate section of the script that deals with PNG files of that particular size.
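
Here is a minimal C sketch of what such a "pngsize" could look like: it reads the width directly from the IHDR chunk and returns the proposed error level as the exit status. The name and behaviour simply follow the proposal above (the aspect-ratio check is omitted); nothing like this exists in F-TexTools yet.

Code:
        /* Hypothetical "pngsize": exit status 0 = error / too small,
         * 1 = width 1024..2047, 2 = 2048..4095, ... as proposed above. */
        #include <stdio.h>
        #include <string.h>

        int main(int argc, char *argv[])
        {
            unsigned char hdr[24];   /* 8-byte signature + length + "IHDR" + width */
            unsigned long width;
            int level = 0;
            FILE *f;

            if (argc != 2 || !(f = fopen(argv[1], "rb")))
                return 0;
            if (fread(hdr, 1, sizeof hdr, f) != sizeof hdr ||
                memcmp(hdr + 12, "IHDR", 4) != 0) {
                fclose(f);
                return 0;
            }
            fclose(f);

            /* the width is stored big-endian at offsets 16..19 (PNG spec, IHDR) */
            width = ((unsigned long)hdr[16] << 24) | ((unsigned long)hdr[17] << 16) |
                    ((unsigned long)hdr[18] << 8)  |  (unsigned long)hdr[19];

            while (width >= (1024UL << level))   /* int(log2(width) - log2(512)) */
                ++level;
            return level;
        }

A script would then test the exit status (%ERRORLEVEL% under DOS, $? under a Unix shell) and branch accordingly.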


 Post subject:
PostPosted: Sun, 06-04-08, 9:57 GMT 
Site Admin

Joined: Fri, 31-08-07, 7:01 GMT
Posts: 4514
Location: Hamburg, Germany
bdm,

sorry, but the intention of the tools is not to come up with yet another all-round set for general image manipulation. The tools are basically for handling HIGHEST quality binary raw data from the scientific archives.

The elegant pipe mechanism of my "toolbox" makes something like "png2pow2" entirely superfluous. Here is how you do it:

Let 'input.png' be an RGB .png texture of size 5400x2700, say.

Then you type at the prompt:

png2bin < input.png | tx2pow2 3 5400 | bin2png 3 4096 > output.png

output.png is a 4096x2048 .png texture with compression level 6.


And if you want to know the sizes beforehand, install ImageMagick and you can have a (terribly slow) utility (identify) that tells you everything about the texture concerned.

Here is what you would get without any options applied:
> identify input.png

input.png PNG 5400x2700 5400x2700+0+0 DirectClass 8-bit 13.9479mb 1.440u 0:03

If you want to read out the texture size in a script, 'identify' can do this, too. Except, for monster textures for which my tools are designed (!), it will take longer than all the rest of the job ;-) . I used 'identify' in my virtualtex script that MANY people have used in the past.

LINUX:
----------
texturesize=`identify -format "%wx%h" $1`
texturewidth=`identify -format "%w" $1`
textureheight=`identify -format "%h" $1`

WINDOWS:
----------------
> identify -format %w input.png
5400
You can store the output size in a variable.

You can also easily determine the nearest power-of-two size in your script, once you know the texturesize. Probably you need a more intelligent command shell than the stupid DOS shell:

This is for the UNIX zsh:
Code:
        #
        power2=1
        while (( power2 < texturewidth )); do
                (( power2 <<= 1 ))
        done

The shell must know the bit shift operator <<, however. Otherwise, you simply multiply by a factor of two each time...

Anyway, for such extremely /simple/ script tasks one would not want to code a /specialized/ compiled program, simultaneously for 3 operating systems ... ;-) .

Really, I don't see why one has to spoil users to the extent that they don't need to know the initial size of their textures! Ignorance hurts ;-) . Without using 'identify', all you need to input to the script is the original texture size. The rest can be traced automatically in the script. Also, everyone uses these scripts basically only ONCE (in a while), not regularly... Hence typing in a number is not all that much of a pain ...

F.


 Post subject:
PostPosted: Wed, 09-04-08, 1:40 GMT 

Joined: Tue, 01-04-08, 23:47 GMT
Posts: 3
t00fri wrote:
bdm,

sorry, but the intention of the tools is not to come up with yet another all-round set for general image manipulation. The tools are basically for handling HIGHEST quality binary raw data from the scientific archives.

It doesn't matter what they are designed for. Often users find ways of applying software tools in ways the designers didn't expect. With just a little work, these tools can be incorporated into a script that can split up a texture of a fixed size into several levels of textures, all placed in the correct level directories, complete with a CLX file as well. If I can do that using the limited capabilities of a DOS script, it's not hard to do it in a shell script as well.

Now, we don't really need to figure out the width of the image; other tools can do this as well. But the F-TexTools also require users to know the number of bytes per pixel, and this is not easy to find out.

t00fri wrote:
And if you want to know the sizes beforehand, install ImageMagick and you can have a (terribly slow) utility (identify) that tells you everything about the texture concerned.

Here is what you would get without any options applied:
> identify input.png

input.png PNG 5400x2700 5400x2700+0+0 DirectClass 8-bit 13.9479mb 1.440u 0:03

If you want to read out the texture size in a script, 'identify' can do this, too. Except, for monster textures for which my tools are designed (!), it will take longer than all the rest of the job ;-) . I used 'identify' in my virtualtex script that MANY people have used in the past.

We shouldn't need to install ImageMagick just to get access to the identify tool.

We should be able to write a simple utility that interrogates the IHDR chunk in the PNG file to retrieve the height and width. We also need the capability to retrieve the colour type and convert it to the "channels" value required in many places in the suite. Knowing the colour type is crucial to the correct operation of the F-TexTools suite, but unless I missed something in the documentation, at present the only way to determine it with the F-TexTools suite is by trial and error. And when you're working with humongous texture files that take a while to process, users may get frustrated if they need to try more than once. In fairness, there are likely to be only two values to try: "3" and "4".

If such a utility is provided - let's call it "pnginfo" - it would not use the arcane conventions of "identify", but could instead be tailored to the conventions of F-TexTools. It needs to output the height, width and bytes per pixel to stdout.

Users can then cut-and-paste the information into other parts of the script. Instead of guessing the correct values, they can call pnginfo and get that info directly.

Suppose pnginfo had the following output format:
Width: 5400 Height: 2700 Channels: 3

Then we can do this:
png2bin < input.png | tx2pow2 `pnginfo input.png | cut ...` `pnginfo input.png | cut ...` | bin2png `pnginfo input.png | cut ...` 4096 > output.png

(I forget the parameters for the cut command but hopefully the intention is clear)

Or, if you use a decent shell that supports variables, we can assign the width and number of channels to variables, and reference the variables instead of using `pnginfo input.png | cut ...` each time.

t00fri wrote:
Really I don't see why one has to spoil users to an extent that they don't need to know the initial size of their textures!

But what if the users don't know the initial size, or, more to the point, the number of channels used by the texture? While "identify" provides the width of the texture in an obvious format, I doubt many people would be able to use it to work out how many bytes are used per pixel. The one weakness of the F-TexTools script is the need to pass the number of channels to most of the commands. How do we work out this number from a PNG file?

I can manage without having the size of the PNG output; after all, I can read the size from the IHDR chunk in the PNG file itself with a hex editor if I need to. However, figuring out the correct value for bytes per pixel is a bit more arcane, and it may be helpful to provide a simple utility to determine it, or at least to document briefly how this value is calculated from the bit depth and colour type bytes.

Reference:
http://www.libpng.org/pub/png/spec/1.2/ ... tml#C.IHDR
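
For what it's worth, here is a minimal C sketch of such a "pnginfo". It reads the width, height, bit depth and colour type directly from the IHDR chunk (offsets as in the spec linked above) and derives the channel count from the colour type; bytes per pixel would then be channels * bit depth / 8. The name and the one-line output format simply follow the proposal in this thread; nothing like it ships with F-TexTools.

Code:
        /* Hypothetical "pnginfo": print width, height, bit depth and channel
         * count of a PNG, read directly from its IHDR chunk.
         * Colour type -> channels: 0 grayscale=1, 2 RGB=3, 3 palette=1,
         * 4 gray+alpha=2, 6 RGBA=4. */
        #include <stdio.h>
        #include <string.h>

        int main(int argc, char *argv[])
        {
            static const unsigned char sig[8] = {137, 80, 78, 71, 13, 10, 26, 10};
            unsigned char buf[26];    /* signature + length + "IHDR" + 10 IHDR bytes */
            unsigned long w, h;
            int depth, ctype, channels;
            FILE *f;

            if (argc != 2 || !(f = fopen(argv[1], "rb"))) {
                fprintf(stderr, "usage: pnginfo file.png\n");
                return 1;
            }
            if (fread(buf, 1, sizeof buf, f) != sizeof buf ||
                memcmp(buf, sig, 8) != 0 || memcmp(buf + 12, "IHDR", 4) != 0) {
                fprintf(stderr, "%s: not a PNG file\n", argv[1]);
                fclose(f);
                return 1;
            }
            fclose(f);

            w = ((unsigned long)buf[16] << 24) | ((unsigned long)buf[17] << 16) |
                ((unsigned long)buf[18] << 8)  |  (unsigned long)buf[19];
            h = ((unsigned long)buf[20] << 24) | ((unsigned long)buf[21] << 16) |
                ((unsigned long)buf[22] << 8)  |  (unsigned long)buf[23];
            depth = buf[24];
            ctype = buf[25];

            switch (ctype) {
                case 0:  channels = 1; break;    /* grayscale         */
                case 2:  channels = 3; break;    /* RGB               */
                case 3:  channels = 1; break;    /* palette-indexed   */
                case 4:  channels = 2; break;    /* grayscale + alpha */
                case 6:  channels = 4; break;    /* RGB + alpha       */
                default: channels = 0; break;    /* unknown           */
            }

            printf("Width: %lu Height: %lu Depth: %d Channels: %d\n", w, h, depth, channels);
            return 0;
        }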

