## For donkeys. No parking

This is the gate of a small computer shop in Almaty, Kazakhstan.

## Matlab — Scilab survival phrasebook

This is a one-page dictionary in case you need to switch between Matlab and Scilab.

| Matlab | Scilab | Function |
|---|---|---|
| `acot(A)` | `atan((1) ./A)` | Inverse cotangent |
| `acoth(A)` | `atanh((1) ./A)` | Inverse hyperbolic cotangent |
| `acsc(A)` | `asin((1) ./A)` | Inverse cosecant |
| `acsch(A)` | `asinh((1) ./A)` | Inverse hyperbolic cosecant |
| `all` | `and` | Test whether all elements are nonzero |
| `angle(A)` | `atan(imag(A),real(A))` | Phase angle |
| `any` | `or` | Test whether any element is nonzero |
| `asec(A)` | `acos((1) ./A)` | Inverse secant |
| `asech(A)` | `acosh((1) ./A)` | Inverse hyperbolic secant |
| `atan2(y,x)` | `atan(y,x)` | Four-quadrant inverse tangent |
| `[T,Ab]=balance(A)` | `[Ab,T]=balanc(A)` | Diagonal scaling to improve eigenvalue accuracy |
| `blkdiag` | `sysdiag` | Construct block diagonal matrix from input arguments |
| `cot` | `cotg` | Cotangent |
| `cputime` | `timer()` | Elapsed CPU time |
| `csc(A)` | `(1) ./sin(A)` | Cosecant |
| `csch(A)` | `(1) ./sinh(A)` | Hyperbolic cosecant |
| `date` | `date()` | Current date string |
| `dos` | `unix_g` | Execute a system command and return the result |
| `eig` | `spec`; `bdiag` | Find eigenvalues and eigenvectors |
| `eval` | `evstr`; `execstr` | Execute a string containing an instruction |
| `fclose` | `mclose` | Close one or more open files |
| `feof` | `meof` | Test for end-of-file |
| `ferror` | `mclearerr`; `merror` | Query about errors in file input or output |
| `fft(A[,...])` | `fft(A,-1[,...])` | Discrete Fourier transform |
| `fgetl` | `mgetl` | Read line(s) from file, discard newline character |
| `fgets` | `fgetstr` | Read line from file, keep newline character |
| `fliplr(A)` | `A(:,$:-1:1)` | Flip matrix in left/right direction |
| `flipud(A)` | `A($:-1:1,:)` | Flip matrix in up/down direction |
| `fopen` | `mopen` | Open a file or obtain information about open files |
| `frewind(fid)` | `mseek(0,fid)` | Move the file position indicator to the beginning of an open file |
| `fseek` | `mseek` | Set file position indicator |
| `ftell` | `mtell` | Get file position indicator |
| `hankel` | `hank` | Hankel matrix |
| `ifft(A[,...])` | `fft(A,1[,...])` | Inverse discrete Fourier transform |
| `iscell(A)` | `typeof(A)=="ce"` | Determine if input is a cell array |
| `ischar(A)` | `type(A)==10` | Determine if item is a character array |
| `ishandle(A)` | `type(A)==9` | Determine if values are valid graphics object handles |
| `isinteger(A)` | `type(A)==8` | Detect whether an array has integer data type |
| `isscalar(A)` | `sum(length(A))==1` | Determine if input is a scalar |
| `isstr(A)` | `type(A)==10` | Determine if item is a character array |
| `isstruct(A)` | `typeof(A)=="st"` | Determine if input is a structure array |
| `isunix` | `(getos() <> "Windows")` | Determine if Unix version |
| `ispc` | `(getos() == "Windows")` | Determine if PC (Windows) version |
| `kron(A,B)` | `A .*. B` | Kronecker tensor product |
| `lookfor` | `apropos` | Search for specified keyword in all help entries |
| `lower(str)` | `convstr(str,"l")` | Convert string to lower case |
| `mod` | `pmodulo` | Modulus after division |
| `nargin` | `argn(2)` | Number of function input arguments |
| `nargout` | `argn(1)` | Number of function output arguments |
| `null` | `kernel` | Null space of a matrix |
| `num2str` | `string` | Number to string conversion |
| `ones(size(A))` | `ones(A)` | Create an array of all ones |
| `otherwise` | `else` | Default part of switch/select statement |
| `pause` | `xpause` | Halt execution temporarily |
| `prod(A,1)` | `prod(A,"r")` | Product of array elements |
| `rand(A)` | `rand(A[,"uniform"])` | Uniformly distributed random numbers and arrays |
| `randn(A)` | `rand(A,"normal")` | Normally distributed random numbers and arrays |
| `realmax` | `number_properties("huge")` | Largest positive floating-point number |
| `realmin` | `number_properties("tiny")` | Smallest positive floating-point number |
| `rem(X,Y)` | `X-fix(X./Y).*Y` | Remainder after division |
| `reshape` | `matrix` | Reshape array |
| `strcmp(str1,str2)` | `str1==str2` | Compare strings |
| `strfind` | `strindex` | Find one string within another |
| `strrep` | `strsubst` | String search and replace |
| `switch` | `select` | Switch among several cases based on expression |
| `tic` | `tic()` | Start a stopwatch timer |
| `toc` | `toc()` | Read the stopwatch timer |
| `unix` | `unix_g` | Execute a UNIX command and return the result |
| `upper(str)` | `convstr(str,"u")` | Convert string to upper case |
| `end` (index) | `$` | Last index |
| `eps` | `%eps` | Floating-point relative accuracy |
| `i`; `j` | `%i` | Imaginary unit |
| `pi` | `%pi` | Ratio of a circle’s circumference to its diameter |
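
For instance, here are a few of these equivalences tried directly in a Scilab session (the matrix `A` is just an arbitrary example):

```
A = [1 2 3; 4 5 6];
B = A(:, $:-1:1);       // Matlab: fliplr(A)
C = A($:-1:1, :);       // Matlab: flipud(A)
K = A .*. eye(2, 2);    // Matlab: kron(A, eye(2))
s = string(%pi);        // Matlab: num2str(pi)
```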

## Our homemade 40/80 meter antenna

Recently, we made an antenna for the 40 and 80 meter bands in our radio club, UN9GWA.

Here’s the video (in Russian, but radio amateurs will understand what is going on):

The antenna is a closed loop of copper wire, a little bit longer than 80 meters.

The exact length was adjusted so that the internal tuner of our Icom IC-7600 was able to tune it for the 40 and 80 meter bands.

The antenna is stretched between our balcony on the ninth floor and the roofs of two neighboring buildings.

Because the wire is thin, the bandwidth is not large, and the antenna is not tuned for the whole 80 m band.

However, its length is adjustable: we soldered several terminals, half a meter apart, so we can change the length, which shifts the resonant frequency within the band. This is what we are doing on the balcony in the video, during our antenna party, where we also made a fruit salad and cookies with marshmallow and chocolate, melting them under the sun.

Making antennas is fun. Seeing them work is even more fun!

Best wishes from our radio club!

73 de UN9GWA

## Logistic regression in Scilab

Let’s create some random data that are split into two different classes, ‘class 0’ and ‘class 1’.

We will use these data as a training set for logistic regression.

```
// Generate 100 random points in [0, b0] x [0, b0] and label them
// by which side of the line x1 + x2 = b0 they fall on
b0 = 10;
t = b0 * rand(100,2);
t = [t 0.5+0.5*sign(t(:,2)+t(:,1)-b0)];

// Randomly flip the labels of points within distance b of the boundary
b = 1;
flip = find(abs(t(:,2)+t(:,1)-b0)<b);
t(flip,$) = grand(length(t(flip,$)),1,"uin",0,1);

// Split the points by class and plot them
t0 = t(find(t(:,$)==0),:);
t1 = t(find(t(:,$)==1),:);

clf(0); scf(0);
plot(t0(:,1),t0(:,2),'bo')
plot(t1(:,1),t1(:,2),'rx')
```

The data from the two classes overlap slightly. The amount of overlap is controlled by the parameter `b` in the code.

We want to build a classification model that estimates the probability that a new data point belongs to class 1.

First, we separate the data into features and results:

```
x = t(:, 1:$-1); y = t(:, $);

[m, n] = size(x);
```

Then we add the intercept column to the feature matrix:

```
// Add intercept term to x
x = [ones(m, 1) x];
```

The logistic regression hypothesis is defined as:

h(θ, x) = 1 / (1 + exp(−θᵀx))

Its value is the probability that the data with features x belong to class 1.

The cost function in logistic regression is

J = [−yᵀ log(h) − (1−y)ᵀ log(1−h)] / m

where `log` is the “element-wise” logarithm, not a matrix logarithm.
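
Under the definitions above, the hypothesis and cost can be packaged as one small Scilab function; the name `logistic_cost` is a label chosen for this sketch, not something from the original code:

```
// Hypothesis h(theta, x) and cost J, as defined above
function [J, h] = logistic_cost(theta, x, y)
    m = size(x, 1);                          // number of training examples
    h = ones(m, 1) ./ (1 + exp(-x * theta)); // element-wise sigmoid
    J = (-y' * log(h) - (1 - y)' * log(1 - h)) / m;
endfunction
```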

If we use the gradient descent algorithm, then the update rule for θ is

θ ← θ − α∇J = θ − α xᵀ(h − y) / m

The code is as follows:

```
// Initialize fitting parameters
theta = zeros(n + 1, 1);

// Learning rate and number of iterations
a = 0.01;
n_iter = 10000;

for iter = 1:n_iter do
    z = x * theta;
    h = ones(z) ./ (1 + exp(-z));
    theta = theta - a * x' * (h - y) / m;
    J(iter) = (-y' * log(h) - (1 - y)' * log(1 - h)) / m;
end
```

Now, the classification can be visualized:

```
// Display the result

disp(theta)

u = linspace(min(x(:,2)),max(x(:,2)));

clf(1);scf(1);
plot(t0(:,1),t0(:,2),'bo')
plot(t1(:,1),t1(:,2),'rx')
plot(u,-(theta(1)+theta(2)*u)/theta(3),'-g')
```

Looks good.
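
With the fitted `theta`, a new point can be classified by evaluating the hypothesis at that point; this is a sketch, with the point (5, 6) chosen arbitrarily:

```
// Probability that a new point (5, 6) belongs to class 1
x_new = [1, 5, 6];                  // the leading 1 is the intercept term
p = 1 / (1 + exp(-x_new * theta));
class_1 = bool2s(p >= 0.5);         // predicted class: 0 or 1
```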

The graph of the cost at each iteration is:

```
// Plot the convergence graph

clf(2);scf(2);
plot(1:n_iter, J');
xtitle('Convergence','Iterations','Cost')
```