Assembler programming VEX


Postby akoch » Tue Oct 24, 2006 9:44 pm

It appears that the limits set in the .mm file are obeyed only by the C compiler, not by the simulator.

I had intended to have my students write a few short programs directly in VEX assembler, before advancing to the compiler. However, even excessively wide instructions (forgetting the ;; ) do not cause errors.

Additionally, it would be pedagogically useful to switch the simulator to EQ mode to demonstrate the effect of user-visible latencies. In the current simulation model of the LEQ machine, all simulated latencies appear to be 0 (compare results or memory reads are immediately available). While this is correct for an LEQ model, it does not serve to make the point described above: Everything works correctly even without xnops.
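To make the latency point concrete for students, here is a toy Python model of the two behaviors. This is not the VEX simulator; the 2-cycle latency, register names, and the RegFile API are invented purely for illustration.

```python
# Toy model of exposed-latency (EQ) vs. zero-latency (LEQ-as-simulated)
# register semantics. NOT the VEX simulator; all values are made up.

class RegFile:
    """Register file where writes may commit only after a visible latency."""

    def __init__(self, eq_mode, latency=2):
        self.eq_mode = eq_mode
        self.latency = latency
        self.values = {}        # reg -> committed value
        self.pending = []       # (commit_cycle, reg, value)
        self.cycle = 0

    def write(self, reg, value):
        if self.eq_mode:
            # EQ: the new value is invisible until `latency` cycles pass.
            self.pending.append((self.cycle + self.latency, reg, value))
        else:
            # LEQ simulated with 0 latency: visible immediately.
            self.values[reg] = value

    def read(self, reg):
        return self.values.get(reg, 0)

    def xnop(self, n=1):
        """Advance n cycles, committing any writes whose latency elapsed."""
        for _ in range(n):
            self.cycle += 1
            still_pending = []
            for commit_cycle, reg, value in self.pending:
                if commit_cycle <= self.cycle:
                    self.values[reg] = value
                else:
                    still_pending.append((commit_cycle, reg, value))
            self.pending = still_pending

eq = RegFile(eq_mode=True)
eq.write("$r1", 42)
too_early = eq.read("$r1")   # stale value (0): the 2-cycle latency is exposed
eq.xnop(2)                   # the xnops the students should not forget
on_time = eq.read("$r1")     # now 42

leq = RegFile(eq_mode=False)
leq.write("$r1", 42)
immediate = leq.read("$r1")  # 42 right away
```

In the EQ model the early read returns the stale value until enough xnops have elapsed, while the zero-latency simulation makes the new value visible at once, which is exactly why forgotten xnops go unnoticed.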

Is there any way to achieve this? Or is the current tool chain simply not set up for experiments at the assembler level? Or am I simply misunderstanding the documentation?

Any help would be appreciated :-)
akoch
 
Posts: 2
Joined: Tue Oct 24, 2006 6:27 pm

Re: Assembler programming VEX

Postby frb » Thu Oct 26, 2006 2:50 pm

akoch wrote:It appears that the limits set in the .mm file are obeyed only by the C compiler, not by the simulator.

Indeed you are right. The compiled simulator is completely agnostic to the resource constraints: it is basically a functional simulator that counts cycles as instructed by the compiler.

akoch wrote:I had intended to have my students write a few short programs directly in VEX assembler, before advancing to the compiler. However, even excessively wide instructions (forgetting the ;; ) do not cause errors.

Yes, see my point above. The CS is dumb: it just does what it's told without checking. Adding simple checking wouldn't be hard; the complication is making sure that it agrees with the compiler on what it's checking. It would also make it impossible to mix libraries compiled for one machine with code compiled for another (every machine-model change would require recompiling the entire libc -- not advisable...). Bottom line: the problem is not the checking, it's integrating it into the rest of the environment.

akoch wrote:Additionally, it would be pedagogically useful to switch the simulator to EQ mode to demonstrate the effect of user-visible latencies. In the current simulation model of the LEQ machine, all simulated latencies appear to be 0 (compare results or memory reads are immediately available). While this is correct for an LEQ model, it does not serve to make the point described above: Everything works correctly even without xnops.

Is there any way to achieve this? Or is the current tool chain simply not set up for experiments at the assembler level? Or am I simply misunderstanding the documentation?


Checking latencies in the CS would be much harder, although I agree it would be pedagogically useful. Again, the CS relies on the compiler not violating any constraints. At some point we did have a "latency warning" check, but if I recall correctly we turned it off because it was pretty hard to deal with latencies across branches and join points.

Given that you're talking about an academic setting, maybe you could assign a project to write an assembler checker that checks for resource oversubscription and latency violations within basic blocks. It should be reasonably easy to do with the assembler-parser template that comes with the toolchain examples.
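A minimal sketch of such a checker in Python, assuming a simplified bundle syntax (one op per line, ;; closing each bundle); the ISSUE_WIDTH and LATENCY values are made up, and a real version would read them from the .mm machine description and use the assembler-parser template mentioned above.

```python
# Sketch of the suggested student project: flag bundles wider than the
# issue width, and registers read before their producer's latency elapsed.
# The syntax, ISSUE_WIDTH, and LATENCY table are illustrative assumptions.

ISSUE_WIDTH = 4                           # assumed machine width
LATENCY = {"ldw": 2, "add": 1, "cmp": 1}  # assumed visible latencies

def parse_op(line):
    """Assumed form: 'op dst = src1, src2' (ops without '=' have no dest)."""
    tokens = line.replace(",", " ").split()
    if "=" in tokens:
        eq = tokens.index("=")
        return tokens[0], tokens[eq - 1], tokens[eq + 1:]
    return tokens[0], None, []

def check(asm):
    errors, ready, bundle, cycle = [], {}, [], 0
    for raw in asm.splitlines():
        line = raw.split("#")[0].strip()       # drop comments and blanks
        if not line:
            continue
        if line == ";;":                       # end of bundle: check it
            if len(bundle) > ISSUE_WIDTH:
                errors.append(f"cycle {cycle}: {len(bundle)} ops "
                              f"exceed width {ISSUE_WIDTH}")
            for op, dst, srcs in bundle:       # reads see pre-bundle state
                for src in srcs:
                    if ready.get(src, 0) > cycle:
                        errors.append(f"cycle {cycle}: {src} read before ready")
            for op, dst, srcs in bundle:       # then record the new results
                if dst is not None:
                    ready[dst] = cycle + LATENCY.get(op, 1)
            bundle, cycle = [], cycle + 1
        else:
            bundle.append(parse_op(line))
    return errors

program = """
add $r1 = $r2, $r3
ldw $r4 = 0[$r5]
;;
add $r6 = $r4, $r1    # $r4 read one cycle too early (ldw latency 2)
;;
;;
add $r7 = $r4, $r1    # fine by now
;;
"""
problems = check(program)
```

Running `check` on the sample program reports exactly one latency violation (the early read of `$r4`); handling branches and join points is what makes the full problem hard, as noted above, so the sketch deliberately stops at basic blocks.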

-- Paolo
frb
 
Posts: 62
Joined: Thu Nov 12, 2009 3:44 pm

