This post is an *introduction* to some *important* and *unsolved*
problems in **Computer Science** and the reasons why they matter.
I’m assuming you know a bit about the
**Turing machine**: in short,
it’s an **abstract machine** which can simulate any *algorithm* we can write
in any *programming language* under the current model of *computation* (so, for example, on
your computer).

The matters in this post are quite hard to explain, so if you want me to elaborate more, just ask in the comments.

In **Computer Science**, scientists divide the set of all **problems**
into subsets based on their **complexity**.
Since every **problem** can be restated as a **decision problem**, let’s consider only those.

**Decision problems** are those whose *output* is either **YES** or **NO**.

The most known sets of **decision problems** are **P**, **NP** and **PSPACE**,
and these are the sets I want to describe.

## P problems

**Problems** in **P** are those which can be solved in **polynomial time** by a
**deterministic Turing machine**.

Intuitively, for a problem in **P** it’s possible to write an
*algorithm* which will use **polynomial time** to solve it.
Some common **P** problems are:

- greatest common divisor
- minimum in list
- shortest path problem

Here is an example of the *finding minimum in a list* problem solved in the Erlang language:
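A minimal sketch of such a solution (the module name `min_list` is my own choice; `min_helper`, `H` and `CurrentMin` are the names used in the explanation that follows):

```erlang
-module(min_list).
-export([min/1]).

%% Finding the minimum of a non-empty list in linear time:
%% start the accumulator with the head, then walk the tail once.
min([H | T]) ->
    min_helper(T, H).

%% When the list is empty, the accumulator holds the minimum.
min_helper([], CurrentMin) ->
    CurrentMin;
%% Otherwise compare the head H with the accumulator CurrentMin
%% and recurse on the tail with the smaller of the two.
min_helper([H | T], CurrentMin) when H < CurrentMin ->
    min_helper(T, H);
min_helper([_H | T], CurrentMin) ->
    min_helper(T, CurrentMin).
```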

We can see that it uses **linear time** (O(n)), because we *iterate* in the `min_helper`
function over the input list exactly **once**. It just picks the **head** `H`
of the list: if it’s less than the current minimum, it puts it in the accumulator `CurrentMin`.
Then it does the same for the **tail** of the list. When we finally reach an empty list,
it returns the minimum stored in the `CurrentMin` accumulator.

## NP problems

**NP** (**N**ondeterministic **P**olynomial time) **problems** are those for which we can **verify** whether a solution is *correct* in
**polynomial time**.

In other words, to solve **NP problems** in **polynomial time** *for sure*, we
would need a **non-deterministic Turing machine**
(such a machine exists only in theory). This also means
that when solving a problem in **NP** using a **deterministic Turing machine** we *might* need
**exponential time**. Be aware that all **P** problems are also in the **NP** set, because
we can use the actual solving method as **verification**.
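To see how cheap *verification* can be, here is a sketch in Erlang that checks a claimed solution of the **boolean satisfiability problem** in a single linear pass (the CNF encoding as a list of clauses of `{Var, Polarity}` literals is my own choice):

```erlang
-module(verify).
-export([satisfies/2]).

%% A CNF formula is a list of clauses; a clause is a list of literals.
%% A literal {Var, true} is positive, {Var, false} is negated.
%% Assignment is a map from variable names to booleans.
%% Checking that Assignment satisfies Formula touches each literal
%% at most once, so it runs in polynomial (here: linear) time.
satisfies(Formula, Assignment) ->
    lists:all(fun(Clause) ->
        lists:any(fun({Var, Polarity}) ->
            maps:get(Var, Assignment) =:= Polarity
        end, Clause)
    end, Formula).
```

So even though *finding* a satisfying assignment may be hard, *checking* one is easy: this is exactly the asymmetry that defines **NP**.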

The most known **NP** problems are:

- traveling salesman problem - very common in real life!
- boolean satisfiability problem (**SAT**) - very important for AI
- vertex cover problem

These problems are **NP**-complete, which means that they are in the **NP** set and
**every problem** in **NP** is **not harder** than any of these problems.
Assuming **P** ≠ **NP**, **none** of the **P** problems are **NP**-complete.

**NP**-complete problems are *usually* solved in **exponential** time, but keep in mind
that this is not part of their *definition*: some can also be solved in **subexponential** time.

## PSPACE problems

**PSPACE** problems are those which can be solved by a **deterministic
Turing machine** using **polynomial space** (memory).

Note that this time we are talking about **space** (**memory**) instead of time!
Some examples:

- Quantified Boolean formula problem (**QBF**)
- Reversi (game)
- Checkers (game)
- Mahjong (game)
- Chess and Go (when we have a
**constraint** on the number of moves; otherwise they are proven to be even harder than **PSPACE**)

## P versus NP? What about PSPACE?

Currently scientists have proven that between the **P**, **NP** and **PSPACE** sets of problems there is a relation:

**P** ⊆ **NP** ⊆ **PSPACE**

**Inclusion** can be intuitively understood as a *being easier* relation. So any problem in **P** is
*easier* than any problem which is in **NP** and not in **P**. By *easier* I intuitively mean that
we can write an algorithm to solve this problem which will use *less time* to give a solution.
For example, algorithms which solve problems in **P** can take, say, **O(n)** time
(for example: finding the minimum in a list), while those
which solve **NP**-complete problems usually need **exponential** time.

For me, at first sight it wasn’t obvious why **NP** ⊆ **PSPACE**. But when I realized
that you can iterate for **exponential** time using only **polynomial space**, it became clear
that **PSPACE** problems are definitely not easier than problems in **NP**.
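Here is a sketch of that idea in Erlang: a brute-force **SAT** search that may take up to 2^N steps but keeps only one assignment (so **O(N)** space) in memory at any time (the CNF encoding as clauses of `{Var, Polarity}` literals is my own choice):

```erlang
-module(brute).
-export([exists_sat/2]).

%% Tries all 2^N assignments of the N variables, but stores only
%% a single counter and a single assignment at a time:
%% exponential time, polynomial space.
exists_sat(Vars, Formula) ->
    N = length(Vars),
    try_assignments(0, 1 bsl N, Vars, Formula).

try_assignments(K, Max, _Vars, _Formula) when K >= Max ->
    false;
try_assignments(K, Max, Vars, Formula) ->
    %% Decode the integer K into one assignment: bit I of K
    %% gives the value of the I-th variable.
    Indexed = lists:zip(Vars, lists:seq(0, length(Vars) - 1)),
    Assignment = maps:from_list(
        [{V, (K bsr I) band 1 =:= 1} || {V, I} <- Indexed]),
    case satisfies(Formula, Assignment) of
        true  -> true;
        false -> try_assignments(K + 1, Max, Vars, Formula)
    end.

%% Polynomial-time check: every clause needs at least one literal
%% agreeing with the assignment.
satisfies(Formula, Assignment) ->
    lists:all(fun(Clause) ->
        lists:any(fun({Var, Polarity}) ->
            maps:get(Var, Assignment) =:= Polarity
        end, Clause)
    end, Formula).
```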

We still don’t know whether these **two relations** are strict *inclusions* or *equalities*
(there are some papers on equality,
for example claiming **NP** = **PSPACE**, but there is a lot of discussion
around them).
If there is ever a proof
that **P** = **NP**, then all our **cryptography** (**SSL**) will be **unsafe**.
A proof of equality would also have a
big impact on **Artificial Intelligence** algorithms, but I want to cover that in
upcoming posts.

### To be continued…

Next time I will describe some important problems for **automated theorem proving**
like **QBF** and **SAT**. The last one is also important for **AI**. I’m thinking about
writing my own **SAT**-solver.

If you want me to cover something specific about this stuff, or to elaborate more, don’t hesitate to leave a comment!
