Consider the following extract from a real-world program:
DO i = 1, n
  b(i) = a(i)
  a(i-1) = a(i-1) + b(i)
  a(i) = a(i) + b(i)
  a(i+1) = a(i+1) + b(i)
END DO
print *,(a(i),i=1,n)
The system of equations obtained after applying the normalization and
detection module at pseudo-depth 0 is:
x[i] =
case
i | i=1 : a[1]+a[2] ;
i | i>=2 :
Scan( i' | 0<=i'<=n , ( [1] ),
+, a[i'+1], a[1] ) [i] ;
esac ;
print
case
i | i=1 : a[1] + a[1] + x[1] ;
i | 2<=i<=n-1 : x[i-1] + x[i-1] + x[i] ;
i | i=n : x[n-1] + x[n-1] ;
esac ;
A scan hidden by manipulations of the elements of the array a
is thus detected.
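As a sanity check (our own, not part of the tool's output), the loop and the normalized system can be transcribed into Python and compared on a small input. All names here (n, a0, x, loop_result, eqn_result) are ours; indices follow the Fortran convention, with a dimensioned a(0..n+1):

```python
n = 6
a0 = [3, 1, 4, 1, 5, 9, 2, 6]    # initial contents of a(0..n+1), arbitrary

# --- direct transcription of the loop ---
a = a0[:]
b = [0] * (n + 1)                # b(1..n)
for i in range(1, n + 1):
    b[i] = a[i]
    a[i - 1] = a[i - 1] + b[i]
    a[i] = a[i] + b[i]
    a[i + 1] = a[i + 1] + b[i]
loop_result = a[1:n + 1]         # what `print *,(a(i),i=1,n)` would print

# --- the normalized system ---
# x[i] (i >= 1) is a prefix sum: the Scan accumulates a[i'+1] with
# initial value a[1], so x[i] = a[1] + a[2] + ... + a[i+1].
x = [0] * n                      # x(1..n-1)
acc = a0[1]                      # the Scan's initial value a[1]
for i in range(1, n):
    acc = acc + a0[i + 1]        # accumulate a[i'+1]
    x[i] = acc

eqn_result = ([a0[1] + a0[1] + x[1]]                               # i = 1
              + [x[i - 1] + x[i - 1] + x[i] for i in range(2, n)]  # 2<=i<=n-1
              + [x[n - 1] + x[n - 1]])                             # i = n

assert loop_result == eqn_result
```

The check confirms the intuition behind the detection: b(i) is itself a running sum of the original a(1..i), so every printed value is a combination of at most three consecutive prefix sums.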
Results for other examples (mostly from the Argonne benchmarks
[3]) can be found in [19]. Of the 18 loops classified as reductions
in the Argonne benchmarks, all but one can be handled by our software
(in the remaining case, the DFG analysis fails because of the presence
of a goto in the main loop).
The triply nested loop of matrix multiplication, for instance,
normalizes to the single equation:
c[i,j] = Scan( i',j',k' | 1<=i'<=n, 1<=j'<=n, 0<=k'<=n ,
( [0 0 1] ), +, a[i',k'] * b[k',j'], 0 )[i,j,n] ;
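Reading the Scan above as a running sum of the products a[i,k']*b[k',j] along k' (direction [0 0 1], initial value 0), and taking its value at k' = n, gives the ordinary matrix product. A small Python transcription of ours (0-based indices, names ours):

```python
n = 3
a = [[1, 2, 0], [0, 1, 3], [4, 0, 1]]
b = [[2, 0, 1], [1, 1, 0], [0, 2, 2]]

c = [[0] * n for _ in range(n)]
for i in range(n):
    for j in range(n):
        acc = 0                        # the Scan's initial value
        for k in range(n):
            acc += a[i][k] * b[k][j]   # running sum: the Scan at depth k
        c[i][j] = acc                  # the Scan taken at its last point

# c now holds the plain matrix product a * b
```

Note that the intermediate values of acc are exactly the entries of the three-dimensional Scan; only the last one along k' is kept in c.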
Moreover, it seems probable, though this remains to be investigated,
that most variants of this code (e.g., all those obtained by permuting
the loops) will normalize to the same or to very similar equations.
It thus seems that we are at the threshold of being able to recognize
simple algorithms from linear algebra.