Implementing minimization of logical functions by the Quine-McCluskey method with an incomplete input set

This article is, to some extent, a continuation of my earlier article on the minimization of logical functions by the Quine-McCluskey method. It dealt with completely defined logical functions (this was not stated explicitly, but only implied). In reality, such a case is rare enough and occurs only when the number of input variables is small. Partially (incompletely) defined logical functions are those whose values are given only for a part Q of the complete set P = 2^N of possible combinations (terms) of their N arguments (variables), that is, Q < P. This situation occurs in practice in most applications of logical-function optimization algorithms. Indeed, if the number of input variables is, say, N = 30, which is an ordinary case, for example, in the financial markets, then the input training sample would have to contain on the order of

2^30 ≈ 10^9

elementary unique terms. Not every organization, even a very large one, possesses such an array of data, let alone private individuals; this is already the sphere of BigData, data centers, and so on.
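To illustrate the scale of the problem, here is a minimal sketch (not from the original article) that prints the size of the complete term set P = 2^N for several variable counts:

```csharp
using System;

// Size of the complete set of terms P = 2^N for several variable counts N.
foreach (int n in new[] { 3, 10, 20, 30 })
{
    ulong p = 1UL << n;  // 2^n
    Console.WriteLine($"N = {n,2}: P = 2^{n} = {p:N0} terms");
}
// For N = 30 the complete set already exceeds a billion terms,
// which is why a fully defined input set is rarely available.
```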

Therefore, in practice, the logical functions to be minimized will most often not be defined completely, simply due to the lack of the necessary amount of accumulated data or for various other objective reasons (for example, there is not enough space to store it). The question arises of "bypassing" this difficulty by adapting an algorithm that works with a fully defined set of terms, such as the one from my previous article (here).

The approximate logic of its operation is as follows:

1. The source table is taken as the current transformed table TP; the set of coverage rows is initially empty.

2. In the current table, the column with the smallest number of ones is selected. Among the rows containing a one in this column, the one with the largest number of ones is chosen. This row is included in the coverage, and the current table is shortened by crossing out all columns in which the selected row has a one.

3. If there are columns in the table that are not yet crossed out, step 2 is executed again; otherwise, the coverage is constructed. Note: when counting the number of ones in a row, only the ones in columns that have not been crossed out are counted.

This algorithm works quickly enough and gives a result close to optimal.
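The steps above amount to a greedy set-cover heuristic. Here is a minimal self-contained sketch (the names `GreedyCover`, `table`, `demo` are mine, not from the article): rows of the table are candidate implicants, columns are input terms, and `table[r][c] == true` means implicant r covers term c.

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Greedy covering heuristic: repeatedly pick the rarest uncovered column,
// then the row covering the most still-uncovered columns among rows
// that have a one in it, until every column is crossed out.
static List<int> GreedyCover(bool[][] table)
{
    int rows = table.Length, cols = table[0].Length;
    var uncovered = new HashSet<int>(Enumerable.Range(0, cols));
    var cover = new List<int>();

    while (uncovered.Count > 0)
    {
        // Column with the smallest number of ones (the rarest term).
        int bestCol = uncovered
            .OrderBy(c => Enumerable.Range(0, rows).Count(r => table[r][c]))
            .First();

        // Among rows with a one in that column, the row with the most
        // ones in the columns that are not yet crossed out.
        int bestRow = Enumerable.Range(0, rows)
            .Where(r => table[r][bestCol])
            .OrderByDescending(r => uncovered.Count(c => table[r][c]))
            .First();

        cover.Add(bestRow);

        // Cross out every column in which the selected row has a one.
        uncovered.RemoveWhere(c => table[bestRow][c]);
    }
    return cover;
}

// Example: 3 implicants covering 4 terms; rows 0 and 1 suffice.
bool[][] demo =
{
    new[] { true,  true,  false, false },
    new[] { false, false, true,  true  },
    new[] { false, true,  true,  false },
};
Console.WriteLine(string.Join(", ", GreedyCover(demo).OrderBy(x => x)));
```

Like the algorithm in the article, this is a heuristic: it runs quickly but does not guarantee a minimal cover, only one close to it.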

To test the operation of the algorithm, it is proposed to use the test function TestQuineMcCluskeyRandomPart, which randomly selects from the entire set of 2^N possible terms only a specified fraction 0 < dPart ≤ 1.

The test function `TestQuineMcCluskeyRandomPart`:

```csharp
public static void TestQuineMcCluskeyRandomPart(int iVariableAmount, double dPart = 1)
{
    if (dPart <= 0) throw new ArgumentException("The dPart parameter must be greater than 0 and less than or equal to 1");
    if (dPart > 1) dPart = 1;

    // Obtain the original terms
    ulong iTotalCombines = (ulong)1 << iVariableAmount;
    LinkedList<byte[]> pTrueCol = new LinkedList<byte[]>();
    LinkedList<byte[]> pFalseCol = new LinkedList<byte[]>();
    HashSet<ulong> pUsedTerms = new HashSet<ulong>();
    Random rnd = new Random();
    byte[] buf = new byte[8];
    while (pUsedTerms.LongCount() < (iTotalCombines * dPart))
    {
        rnd.NextBytes(buf);
        ulong iCurValue = (ulong)BitConverter.ToInt64(buf, 0) % iTotalCombines;
        if (pUsedTerms.Contains(iCurValue)) continue;
        pUsedTerms.Add(iCurValue);
        byte[] sLine = new byte[iVariableAmount];
        for (int i = 0; i < iVariableAmount; i++)
        {
            sLine[i] += (byte)(iCurValue % 2);
            iCurValue >>= 1;
        }
        if (rnd.Next(2) != 0)
        {
            pTrueCol.AddLast(sLine);
        }
        else
        {
            pFalseCol.AddLast(sLine);
        }
    }

    // Start the training
    DateTime DtStart = DateTime.Now;
    Console.WriteLine("Start - " + DtStart.ToLongTimeString());
    Quine_McCluskey Logic = new Quine_McCluskey();
    Logic.Start(pTrueCol, pFalseCol);
    DateTime DtEnd = DateTime.Now;
    Logic.PrintResult();
    Console.WriteLine("Start - " + DtStart.ToLongTimeString());
    Console.WriteLine("Completion - " + DtEnd.ToLongTimeString());
    TimeSpan Elapsed = DtEnd - DtStart;
    Console.WriteLine("Duration - " + String.Format("{0:00}:{1:00}:{2:00}",
        Elapsed.Hours, Elapsed.Minutes, Elapsed.Seconds));

    // Get the results
    int iErrorsCounter = 0;
    foreach (byte[] kvp in pTrueCol)
    {
        if (Logic.Result.Calculate(kvp) != true) iErrorsCounter++;
    }
    foreach (byte[] kvp in pFalseCol)
    {
        if (Logic.Result.Calculate(kvp) != false) iErrorsCounter++;
    }
    Console.WriteLine("Number of input terms = " + pUsedTerms.Count);
    Console.WriteLine("Number of disjunctive terms = " + Logic.Result.Terms.Count);
    Console.WriteLine("Number of errors = " + iErrorsCounter);
    Console.ReadLine();
}
```

As a result of running the test function, the number of terms in the resulting minimal disjunctive normal form and the number of errors in covering the initial set of terms are reported.

In conclusion, I would like to note that in practice this implementation of the algorithm has proved to be an effective and reliable means of minimizing logical functions specified by an incomplete (truncated) set of terms. As a disadvantage, it can be noted that the gluing (Skleivanie) function must, at each iteration of the algorithm, check each virtual term against the entire list of initial terms for cover errors, which leads to significant time costs when the number of input terms is large.


**Author:** weber
**Publication date:** 26-09-2018, 19:24
**Category:** Machine learning / Algorithms / Data Mining / C#
**Comments:** 6 | **Views:** 296