I have a somewhat interesting query.
I may have oversimplified the example, but I'll do my best to describe my problem.
I am building a very simple implementation of a wiki from scratch. Everything was going well until I realized I need cycle detection to prevent endless loops of data populating the page and, eventually, overflowing the stack.
The database structure is basic. It's more intricate than what is shown here, but for the purposes of this post the two columns below are all we need.
The Content field is straightforward: it stores the content of the page or WikiPart, which can contain links to other parts and includes of other parts. Links are shown as [[n]] and includes as {{n}}.
+----+------------------+
| id | Content          |
+----+------------------+
| 1  | see {{2}} here   |
+----+------------------+
| 2  | {{1}} here [[4]] |
+----+------------------+
| 4  | {{1}}            |
+----+------------------+
$html_for_screen = readData($this->Content);

function readData($wikipage) {
    $str = "";
    // Convert any wiki links ([[n]]) to HTML links
    $wikipage = Converter::convertWikink($wikipage);
    // Get ALL include ({{n}}) matches into an array
    $wiki_inc = RegEx::getMatches($wikipage);
    // Iterate through the matches
    foreach ($wiki_inc as $wiki) {
        // Recurse into each included part.
        // I assume this is where I would eventually run into trouble
        // with infinite loops.
        $str .= readData($wiki);
    }
    return $str;
}
The question: how would I prevent wiki parts from endlessly including each other, e.g. WikiPart 1 includes WikiPart 2, but WikiPart 2 includes WikiPart 1?
The parser (the readData() function above) would just keep recursing forever.
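One idea I had is to carry the ids already on the current include path down through the recursion and stop when an id shows up a second time. A minimal sketch of that, assuming a hypothetical WikiPart::load($id) that fetches the Content for a given id (my real lookup is different) and reusing my Converter/RegEx helpers from above:

function readData($id, array $seen = array()) {
    // If this id is already on the current include path we have a cycle,
    // so bail out instead of recursing forever.
    if (in_array($id, $seen)) {
        return "<em>cyclic include of part $id skipped</em>";
    }
    $seen[] = $id;

    // Hypothetical lookup of the row's Content by id.
    $wikipage = WikiPart::load($id);

    // Convert any wiki links ([[n]]) to HTML links.
    $wikipage = Converter::convertWikink($wikipage);

    $str = "";
    // Recurse into every {{n}} include, carrying the ids seen so far.
    foreach (RegEx::getMatches($wikipage) as $inc_id) {
        $str .= readData($inc_id, $seen);
    }
    return $str;
}

Since PHP passes arrays by value, each branch of the recursion gets its own copy of $seen, so the same part could still be included twice in different branches; only genuine cycles along a single include path are blocked. Is something like this a reasonable approach, or is there a more standard technique for this?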
regards