Subdivision methods for solving polynomial equations


SUBDIVISION METHODS FOR SOLVING POLYNOMIAL EQUATIONS (B. Mourrain, J.P. Pavone, 2009)

ABSTRACT The purpose of this talk is to present a new algorithm for solving a system of polynomial equations in a domain of R^n. It uses a powerful reduction strategy based on a univariate root finder

using the Bernstein basis representation and Descartes' rule of signs.

MOTIVATION Solving a system of polynomial equations is a subject that has been studied extensively throughout history, and it has practical applications in many different fields: physics, computer graphics, mechanics, the film and game industry, robotics, financial information processing, bioinformatics, coding, signal and image processing, computer vision, dynamics and flow, and many, many more. We will briefly review two examples of such uses to emphasize the importance of solving these problems efficiently, quickly and precisely.

EXAMPLE #1 - GPS Our GPS device receives a satellite signal with the following known information packet: the satellite identity (serial number k), the satellite coordinates in space relative to the center of the Earth, and the point in time at which the information was transmitted, relative to an agreed global clock.

EXAMPLE #1 - GPS We want to find our position in space (x, y, z). In addition, suppose that the point in time at which we receive the information from all the satellites is unknown and is denoted by t. We can then express the distance between the satellite and the device in two ways: from the coordinates, and from the time difference between transmission and reception.

EXAMPLE #1 - GPS Each satellite therefore gives us one equation in four unknowns. If we subtract the first satellite's equation from each of the others, we obtain a simple linear system in four variables, so from five satellites we can compute a solution of a simple linear system (a sketch of the equations is given below). Of course, our location changes within a fraction of a second, and in practice the computation relies on a larger number of satellites. Hence, we must be able to solve large systems of equations quickly.
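As a rough sketch of the equations behind this example (the satellite position (x_k, y_k, z_k), its transmission time t_k and the signal speed c are our notation, not given on the slides), each satellite k constrains the receiver by

    \sqrt{(x - x_k)^2 + (y - y_k)^2 + (z - z_k)^2} \;=\; c\,(t - t_k), \qquad k = 1, \dots, 5.

Squaring these equations and subtracting the first one from each of the others cancels the common quadratic terms x^2 + y^2 + z^2 - c^2 t^2, leaving four equations that are linear in the unknowns (x, y, z, t).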

EXAMPLE #2 - COMPUTER GRAPHICS Computer graphics is an area that deals with methods for creating and processing digital visual content by computer. In order to create a "smooth" view of different objects, a display with a high refresh rate is required, meaning that any change of the scene and/or of the viewer's perspective should be displayed within a fraction of a second. A refresh rate of 50 or 60 frames per second creates a display that looks completely smooth.

EXAMPLE #2 - COMPUTER GRAPHICS A common method for representing objects in computer graphics is to assemble them from polygons. Each polygon is represented by an equation, and we need to be able to find their intersections (that is, to solve a system of equations), for example in order to emphasize certain parts of an object.

INTRODUCTION As stated, solving a system of polynomial equations is the

basis of many geometric problems. In the solution discussed today, we exploit the properties of polynomial representations in the Bernstein basis to deduce, easily, information on the corresponding real functions in a domain D of R^n.

INTRODUCTION In previous lectures we dealt extensively

with the Bernstein representation. It is known to be more numerically stable than the monomial basis. Extensive use of these curves followed the work of the French engineer Pierre Bezier, who used them in 1962 to design Renault car bodies. Their properties (control points, convex hull, etc.), combined with reduction methods, explain the variety of algorithms proposed for solving univariate equations. The situation in the multivariate case has not been studied as extensively. Two main subfamilies coexist:

a family based on subdivision techniques, and a family based on reduction approaches.

SUBDIVISION TECHNIQUES The subdivision approaches use an exclusion test whose result is either "no solution exists" or "there may be a solution". If the result is that there is no solution, we reject the given domain; otherwise, we subdivide the domain. We continue this process until a certain criterion is met (the size of the domain, or something much more elaborate).

These methods may require many iterations, especially in the presence of multiple roots; however, the price of one iteration is significantly lower than in the second approach.

REDUCTION APPROACHES The power of this approach lies in its ability to focus on the parts of the domain where the roots are located. A reduction cannot completely replace subdivision, because it is not always possible to reduce the given domain, but reduction significantly decreases the number of iterations, which of course has a drastic effect on performance.

BERNSTEIN POLYNOMIAL REPRESENTATION So far we have dealt with Bernstein polynomials on the interval [0,1]. Let us now look at the general representation of a polynomial on an arbitrary interval [a, b]. Every univariate polynomial of degree d can be represented as follows:

f(x) = Σ_{i=0}^{d} b_i B_d^i(x; a, b), where, for each i = 0, ..., d, the Bernstein basis polynomials are B_d^i(x; a, b) = C(d, i) (x - a)^i (b - x)^{d-i} / (b - a)^d, and b = (b_0, ..., b_d) is the vector of control coefficients of f on [a, b].
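To make the change of representation concrete, here is a minimal C++ sketch (ours, not from the slides) that converts the monomial coefficients c_0, ..., c_d of a polynomial on [0,1] into its Bernstein coefficients, using the identity x^k = Σ_{i≥k} (C(i,k)/C(d,k)) B_d^i(x; 0, 1); the function names are illustrative only.

    #include <vector>

    // Binomial coefficient C(n, k), computed with doubles (adequate for small degrees).
    static double binom(int n, int k) {
        double r = 1.0;
        for (int j = 1; j <= k; ++j) r = r * (n - k + j) / j;
        return r;
    }

    // Convert monomial coefficients c[0..d] (f(x) = sum c[k] x^k) to Bernstein
    // coefficients b[0..d] on [0,1], via b[i] = sum_{k<=i} C(i,k)/C(d,k) * c[k].
    std::vector<double> monomialToBernstein(const std::vector<double>& c) {
        const int d = static_cast<int>(c.size()) - 1;
        std::vector<double> b(d + 1, 0.0);
        for (int i = 0; i <= d; ++i)
            for (int k = 0; k <= i; ++k)
                b[i] += binom(i, k) / binom(d, k) * c[k];
        return b;
    }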

LEMMA #1 - DESCARTES' RULE OF SIGNS The number of real roots of f in ]a, b[ is bounded by the number of sign changes V(b) of its Bernstein coefficient sequence b = (b_0, ..., b_d), and is equal to it modulo 2. As a consequence, if V(b) = 0 there is no root in ]a, b[, and if V(b) = 1 there is exactly one root in ]a, b[.

AN EXAMPLE OF THE LEMMA, IN STANDARD MONOMIAL REPRESENTATION Take p(x) = x^3 + x^2 - x - 1. The sequence of coefficient signs is (+, +, -, -), hence there

is one positive root. For the negative roots we look at p(-x) = -x^3 + x^2 + x - 1, whose sequence of coefficient signs is (-, +, +, -), so there are two negative roots counted with multiplicity. Indeed, p(x) = (x - 1)(x + 1)^2 and the roots are -1, with multiplicity 2, and 1. A sign-counting sketch is given below.
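As an illustrative sketch (our own, with hypothetical names), the sign-change count used in the lemma can be computed directly on a coefficient vector, ignoring zero coefficients:

    #include <vector>

    // Number of sign changes in a coefficient sequence, skipping zeros.
    // By Descartes' rule, this bounds the number of roots (positive roots for
    // monomial coefficients, roots in ]a,b[ for Bernstein coefficients).
    int signChanges(const std::vector<double>& coeffs) {
        int changes = 0;
        int lastSign = 0;                      // 0 until a nonzero coefficient is seen
        for (double c : coeffs) {
            if (c == 0.0) continue;
            int s = (c > 0.0) ? 1 : -1;
            if (lastSign != 0 && s != lastSign) ++changes;
            lastSign = s;
        }
        return changes;
    }

    // Example: p(x) = x^3 + x^2 - x - 1 has coefficients (-1, -1, 1, 1) in
    // increasing degree; signChanges returns 1, so p has exactly one positive root.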

If we extend the representation to the multivariate case, any polynomial f in the variables x_1, ..., x_n, of degree d_k in x_k, can be represented on a box D = [a_1, b_1] × ... × [a_n, b_n] as follows: f(x_1, ..., x_n) = Σ_{i_1=0}^{d_1} ... Σ_{i_n=0}^{d_n} γ_{i_1,...,i_n} B_{d_1}^{i_1}(x_1; a_1, b_1) ··· B_{d_n}^{i_n}(x_n; a_n, b_n), where the γ_{i_1,...,i_n} are the control coefficients of f on D.

DEFINITION For any polynomial f as above

and any direction j = 1, ..., n, we define the enveloping univariate polynomials
m_j(f; x_j) = Σ_{i_j=0}^{d_j} ( min over i_k, k ≠ j, of γ_{i_1,...,i_n} ) B_{d_j}^{i_j}(x_j; a_j, b_j),
M_j(f; x_j) = Σ_{i_j=0}^{d_j} ( max over i_k, k ≠ j, of γ_{i_1,...,i_n} ) B_{d_j}^{i_j}(x_j; a_j, b_j).

The picture illustrates the projection of the control points onto one coordinate direction and the enveloping univariate polynomials.
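A minimal sketch (ours; the names and the coefficient layout are assumptions) of how these enveloping coefficients can be computed for a bivariate polynomial with Bernstein coefficients gamma[i1][i2]: for the projection onto x_1 we keep, for each i1, the min and max over i2.

    #include <vector>
    #include <algorithm>

    // Bernstein coefficients of a bivariate polynomial, gamma[i1][i2], i1 = 0..d1, i2 = 0..d2.
    struct Envelope {
        std::vector<double> lower;  // Bernstein coefficients of m_1(f; x1)
        std::vector<double> upper;  // Bernstein coefficients of M_1(f; x1)
    };

    // Project onto the x1-direction: for each i1, keep the min and max of the
    // coefficients over the other index. The resulting univariate Bernstein
    // polynomials m_1 and M_1 bound f from below and above on the domain.
    Envelope projectOnX1(const std::vector<std::vector<double>>& gamma) {
        Envelope e;
        for (const auto& row : gamma) {
            auto mm = std::minmax_element(row.begin(), row.end());
            e.lower.push_back(*mm.first);
            e.upper.push_back(*mm.second);
        }
        return e;
    }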

PROJECTION LEMMA For any u = (u_1, ..., u_n) in D and any j = 1, ..., n, we have m_j(f; u_j) ≤ f(u) ≤ M_j(f; u_j).

PROJECTION LEMMA - PROOF First, recall that we have previously shown that, for any k = 1, ..., n, Σ_{i_k=0}^{d_k} B_{d_k}^{i_k}(x_k; a_k, b_k) = 1, and that each B_{d_k}^{i_k} is non-negative on [a_k, b_k]. And then:
f(u) = Σ_{i_j} ( Σ_{i_k, k ≠ j} γ_{i_1,...,i_n} Π_{k ≠ j} B_{d_k}^{i_k}(u_k; a_k, b_k) ) B_{d_j}^{i_j}(u_j; a_j, b_j) ≥ Σ_{i_j} ( min over i_k, k ≠ j, of γ_{i_1,...,i_n} ) B_{d_j}^{i_j}(u_j; a_j, b_j) = m_j(f; u_j),
since the inner sum is a convex combination of the coefficients γ_{i_1,...,i_n}.

In the same way we can show the other direction of the bound, f(u) ≤ M_j(f; u_j), using the maximum instead of the minimum.

COROLLARY For any root u = (u_1, ..., u_n) of the equation f = 0 in the domain D, we have μ_j ≤ u_j ≤ μ̄_j, where μ_j (resp. μ̄_j) is either a_j (resp. b_j) or a root of m_j(f; x_j) or M_j(f; x_j) in [a_j, b_j]. In particular, there is no root of f = 0 in D if m_j(f; x_j) > 0 or M_j(f; x_j) < 0 on [a_j, b_j].

UNIVARIATE ROOT SOLVER Our approach relies on an effective univariate root finder. Common methods for approximating a root are based on bisection: we split the interval into two sub-intervals and select the sub-interval containing the

root. The split point can be chosen in several ways:

METHOD #1 - IN THE BERNSTEIN BASIS, USING THE DE CASTELJAU ALGORITHM This is a recursive algorithm that allows us to evaluate (and subdivide) a polynomial given in the Bernstein basis. The algorithm is based on the formula b_i^{(k)}(t) = (1 - t) b_i^{(k-1)}(t) + t b_{i+1}^{(k-1)}(t), with b_i^{(0)} = b_i: the coefficients at a given level are obtained as barycenters of two consecutive coefficients of the previous level. We repeat the process until we reach a single point, b_0^{(d)}(t), which is the value of the polynomial at the parameter t.

METHOD #1 - IN THE BERNSTEIN BASIS, USING THE DE CASTELJAU ALGORITHM C# implementation (a sketch is given below). The algorithm requires O(d^2) arithmetic operations.
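The original C# code is not reproduced in this transcript; the following is a minimal C++ sketch of the same de Casteljau evaluation (function and variable names are ours):

    #include <vector>

    // Evaluate a polynomial given by its Bernstein coefficients b[0..d] on [a, bEnd]
    // at the point x, using the de Casteljau recurrence. O(d^2) operations.
    double deCasteljau(std::vector<double> b, double a, double bEnd, double x) {
        const double t = (x - a) / (bEnd - a);   // map x to the parameter t in [0,1]
        for (std::size_t level = 1; level < b.size(); ++level)
            for (std::size_t i = 0; i + level < b.size(); ++i)
                b[i] = (1.0 - t) * b[i] + t * b[i + 1];
        return b.empty() ? 0.0 : b[0];
    }

In a subdivision solver, the intermediate values of the recurrence are also kept, since they form the Bernstein coefficients of the two halves of the interval split at t.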

METHOD #2 - IN THE MONOMIAL BASIS, USING HORNER'S METHOD For the polynomial p(x) = c_d x^d + ... + c_1 x + c_0 and some point x_0, we perform the following steps: h_d = c_d, then h_k = h_{k+1} x_0 + c_k for k = d-1, ..., 0, and we get h_0 = p(x_0).

EXAMPLE - HORNER'S METHOD The polynomial

of the example is evaluated at the point x_0 = 3, and the scheme yields p(3) = 5.

EXAMPLE - HORNER'S METHOD From the algorithm we obtain additional important information: if we divide p(x) by (x - x_0), the remainder of the division is the last coefficient on the bottom row of the scheme, and the other

coefficients represent a polynomial of degree one less, which is the quotient of the division. In our example, the quotient q(x) satisfies p(x) = (x - 3) q(x) + 5.

HORNER'S METHOD C implementation (a sketch is given below). The algorithm requires O(d) arithmetic operations (d multiplications and d additions).
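The original C code is likewise not reproduced here; a minimal C++ sketch of Horner evaluation that also returns the quotient coefficients (names are ours) could look like this:

    #include <vector>

    // Evaluate p(x) = c[d] x^d + ... + c[0] at x0 by Horner's scheme, O(d) operations.
    // On return, q holds the coefficients of the quotient of p(x) by (x - x0),
    // and the returned value is the remainder p(x0).
    double horner(const std::vector<double>& c, double x0, std::vector<double>& q) {
        const int d = static_cast<int>(c.size()) - 1;
        q.assign(d, 0.0);                    // the quotient has degree d - 1
        double h = c[d];
        for (int k = d - 1; k >= 0; --k) {
            q[k] = h;                        // the intermediate values are the quotient coefficients
            h = h * x0 + c[k];
        }
        return h;                            // h = p(x0)
    }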

HOW TO USE EVALUATION METHODS FOR ROOT FINDING For a polynomial p with roots r_1, ..., r_d we apply the following algorithm: find one root r_1 by another method started from an initial guess (for example Newton-Raphson); use Horner's scheme to divide p by (x - r_1), obtaining a quotient of degree d - 1; go back to the first step with this quotient, and repeat until all the roots have been found. A sketch of this deflation loop is given below.
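A minimal sketch of this Newton-plus-deflation loop (ours; the tolerances, iteration caps and names are illustrative assumptions, and it assumes the remaining roots stay real and reachable from the guess):

    #include <cmath>
    #include <vector>

    // p(x) = c[d] x^d + ... + c[0]; evaluate p and p' at x with one Horner-style pass.
    static void evalWithDerivative(const std::vector<double>& c, double x,
                                   double& p, double& dp) {
        p = 0.0; dp = 0.0;
        for (int k = static_cast<int>(c.size()) - 1; k >= 0; --k) {
            dp = dp * x + p;
            p  = p * x + c[k];
        }
    }

    // Newton-Raphson from an initial guess, then deflation by the root found,
    // repeated until the polynomial is reduced to a constant.
    std::vector<double> findRoots(std::vector<double> c, double guess = 0.0) {
        std::vector<double> roots;
        while (c.size() > 1) {
            double x = guess;
            for (int it = 0; it < 100; ++it) {           // Newton-Raphson iteration
                double p, dp;
                evalWithDerivative(c, x, p, dp);
                if (std::fabs(dp) < 1e-300) break;       // nearly flat tangent: stop
                double step = p / dp;
                x -= step;
                if (std::fabs(step) < 1e-12) break;
            }
            roots.push_back(x);
            // Deflate: divide c by (x - root) with the Horner/synthetic-division scheme.
            std::vector<double> q(c.size() - 1);
            double h = c.back();
            for (int k = static_cast<int>(c.size()) - 2; k >= 0; --k) {
                q[k] = h;
                h = h * x + c[k];
            }
            c = q;
        }
        return roots;
    }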

METHOD #3 - A SECANT-LIKE METHOD A two-point method in which we draw a line between two points on the graph; the position where this secant intersects the x-axis becomes our new point, and we repeat the step iteratively. C implementation (a sketch is given below).
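Again, the original C code is not reproduced; a minimal C++ sketch of a secant iteration (names and tolerances are ours):

    #include <cmath>
    #include <functional>

    // Secant iteration: repeatedly replace the older point by the intersection
    // of the secant line through (x0, f(x0)) and (x1, f(x1)) with the x-axis.
    double secant(const std::function<double(double)>& f, double x0, double x1,
                  double tol = 1e-12, int maxIter = 100) {
        double f0 = f(x0), f1 = f(x1);
        for (int it = 0; it < maxIter && std::fabs(x1 - x0) > tol; ++it) {
            if (f1 == f0) break;                         // flat secant: cannot continue
            double x2 = x1 - f1 * (x1 - x0) / (f1 - f0);
            x0 = x1; f0 = f1;
            x1 = x2; f1 = f(x1);
        }
        return x1;
    }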

METHOD #4 Computing iteratively the first intersection of the convex hull of the control polygon with the x-axis, and subdividing the polynomial representation at this point.

METHOD #5 - NEWTON-LIKE METHOD IN THE MONOMIAL BASIS A single-point method in which we split the domain at the point where the tangent cuts the x-axis. If there is no such point, we split in the middle of the interval.

Experiments on polynomials with random roots show the superiority of the Horner and Newton iterations (Methods 2 and 5). These methods allowed a very large number of equations to be solved at the required precision (the experiments were performed on an Intel Pentium 4 2.0 GHz processor), and the Newton iteration outperforms the Horner-based method in simple situations.

MULTIVARIATE ROOT FINDING Now we discuss a system f = (f_1, ..., f_s) of s polynomial equations in n variables, with rational coefficients. We are looking for an approximation of the real roots of f = 0 in the domain D = [a_1, b_1] × ... × [a_n, b_n], within a prescribed precision ε.

GENERAL SCHEMA OF THE SUBDIVISION ALGORITHM Step #1 - applying a preconditioning step to the equations; Step #2 - reducing the domain; Step #3 - if the reduction ratio is too small, splitting the domain. A sketch of this main loop is given below.
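A minimal C++ skeleton of that loop (entirely ours: the Box type, the precondition/reduce/exclude helpers and the thresholds are hypothetical placeholders standing in for the steps described on the slides):

    #include <algorithm>
    #include <functional>
    #include <vector>

    struct Box { std::vector<double> lo, hi; };          // the domain D = [a1,b1] x ... x [an,bn]

    static double width(const Box& d) {
        double w = 0.0;
        for (std::size_t j = 0; j < d.lo.size(); ++j) w = std::max(w, d.hi[j] - d.lo[j]);
        return w;
    }

    // One step of the solver on a domain; `out` collects boxes small enough to report.
    // precondition, reduce and exclude stand for the steps of the talk; reduce is
    // assumed to return the fraction of the domain it managed to remove.
    void solve(Box d, double eps, double minRatio,
               const std::function<void(Box&)>& precondition,
               const std::function<double(Box&)>& reduce,
               const std::function<bool(const Box&)>& exclude,
               std::vector<Box>& out) {
        if (exclude(d)) return;                  // sign test: no root can lie in this box
        precondition(d);                         // Step #1: global or local preconditioner
        double ratio = reduce(d);                // Step #2: shrink [a_j,b_j] via m_j, M_j roots
        if (width(d) < eps) { out.push_back(d); return; }
        if (ratio > minRatio) {                  // reduction was effective: iterate again
            solve(d, eps, minRatio, precondition, reduce, exclude, out);
            return;
        }
        // Step #3: split in the direction of largest width and recurse on both halves.
        std::size_t j = 0;
        for (std::size_t k = 1; k < d.lo.size(); ++k)
            if (d.hi[k] - d.lo[k] > d.hi[j] - d.lo[j]) j = k;
        double mid = 0.5 * (d.lo[j] + d.hi[j]);
        Box left = d, right = d;
        left.hi[j] = mid; right.lo[j] = mid;
        solve(left,  eps, minRatio, precondition, reduce, exclude, out);
        solve(right, eps, minRatio, precondition, reduce, exclude, out);
    }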

We start from polynomials with exact rational coefficients, convert them into the Bernstein basis using exact arithmetic, and round their coefficients up and down to the closest machine-precision floating-point numbers. The preconditioning, reduction and subdivision steps are then performed on the enveloping polynomials obtained in this way. All along the algorithm, the enveloping polynomials are upper and lower bounds of the actual polynomials.
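A minimal sketch of that rounding step (ours; it assumes the exact coefficient has already been evaluated to the nearest double, and only illustrates the directed rounding with std::nextafter):

    #include <cmath>
    #include <limits>
    #include <utility>

    // Given a coefficient already rounded to the nearest double, return a pair
    // (lower, upper) of machine numbers guaranteed to enclose the exact rational
    // value (one ulp down, one ulp up).
    std::pair<double, double> encloseCoefficient(double nearest) {
        double lo = std::nextafter(nearest, -std::numeric_limits<double>::infinity());
        double hi = std::nextafter(nearest,  std::numeric_limits<double>::infinity());
        return {lo, hi};
    }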

GENERAL SCHEMA OF THE SUBDIVISION ALGORITHM Step #1 - applying a preconditioning step to the equations; Step #2 - reducing the domain; Step #3 - if the reduction ratio is too small,

splitting the domain.

PRECONDITIONER First, we transform the equation system by multiplying it with a matrix of size s × s. Such a transformation may increase the degree of some equations, and it may turn a sparse system into a dense one. To avoid this, we might prefer a partial preconditioner applied to subsets of equations sharing a subset of the variables. For simplicity, let us suppose that the polynomials are already expressed in the Bernstein basis; we will now discuss two types of transformations for Step #1 (preconditioning the equations): a global transformation and

a local straightening.

GLOBAL TRANSFORMATION A typical difficult situation appears when two of the functions have similar graphs in the domain D. A way to avoid such a situation is to transform these equations in order to increase the differences between them.

GLOBAL TRANSFORMATION For an s × s matrix u = (u_{i,j}), let u·f denote the transformed system whose i-th equation is Σ_j u_{i,j} f_j. In order to choose this transformation, we define a norm on the space of polynomials spanned by the Bernstein basis B in the following way: ||f||^2 = Σ_i c_i(f)^2, where c(f) = (c_i(f)) is the vector of control coefficients of the function f in the basis B. This norm on the vector space of polynomials generated by the basis B is associated with a scalar product, which we denote by <·|·>: <f|g> = Σ_i c_i(f) c_i(g).

GLOBAL TRANSFORMATION Goal: improve the angle between the polynomials, seen as vectors, by creating a system that is orthogonal for <·|·>.

PROPOSITION

Let Q = (<f_i | f_j>)_{1 ≤ i,j ≤ s} and let E be a matrix of unitary eigenvectors of Q. Then ~f = E^T f is a system of polynomials which are orthogonal for the scalar product <·|·>.

PROOF Let ~f = E^T f. Then the matrix of scalar products is (<~f_i | ~f_j>) = E^T Q E = diag(σ_1, ..., σ_s), where σ_1, ..., σ_s are the positive eigenvalues of Q. This shows that the system ~f is orthogonal for the scalar product <·|·> (and becomes orthonormal after each ~f_i is divided by sqrt(σ_i)).
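A minimal sketch of this global preconditioner (ours; it represents each polynomial by its flattened control-coefficient vector and leans on the Eigen library's SelfAdjointEigenSolver, which is an assumed dependency, not something stated in the talk):

    #include <Eigen/Dense>

    // Rows of C are the control-coefficient vectors c(f_1), ..., c(f_s).
    // Returns the coefficient vectors of the transformed system ~f = E^T f,
    // where E holds unitary eigenvectors of the Gram matrix Q = C C^T.
    Eigen::MatrixXd globalPrecondition(const Eigen::MatrixXd& C) {
        Eigen::MatrixXd Q = C * C.transpose();             // Q(i,j) = <f_i | f_j>
        Eigen::SelfAdjointEigenSolver<Eigen::MatrixXd> es(Q);
        Eigen::MatrixXd E = es.eigenvectors();             // orthonormal columns
        return E.transpose() * C;                          // coefficients of E^T f
    }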

The picture illustrates the impact of the global preconditioner on two bivariate functions whose graphs are very close to each other before the preconditioning, and which are well separated after this preconditioning step.

LOCAL STRAIGHTENING

We consider square systems, for which s = n. Since we are going to use the projection lemma, the interesting situations for reduction steps are those in which the zero level-sets of the functions are orthogonal to the coordinate directions. We illustrate this remark with two cases:

LOCAL STRAIGHTENING In case (a), the reduction based on the corollary of

the projection lemma will be of no use: the projections of the graphs cover the whole intervals. In case (b), a good reduction strategy will yield a good approximation of the roots. The idea of this preconditioner is to transform the system so as to be close to case (b). We transform the system f locally into the system ~f = J_f^{-1}(u_0) f, where J_f(u_0) = (∂f_i/∂x_j(u_0)) is the Jacobian matrix of f at a point u_0 of the domain (for instance its center). A direct computation shows that, locally (in a neighborhood of u_0), the zero level-set of ~f_i is orthogonal to the x_i-axis.
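A minimal sketch of this local straightening for a square bivariate system (ours; the coefficient layout and the closed-form 2 × 2 inverse are assumptions made for brevity, and the Jacobian is assumed invertible):

    #include <array>
    #include <vector>

    // Control-coefficient vectors of f1 and f2 (same basis, same length) and the
    // Jacobian J of (f1, f2) evaluated at a point u0 (e.g. the center of the box).
    // Returns the coefficients of ~f = J^{-1} f, whose level sets near u0 are
    // orthogonal to the coordinate axes.
    std::array<std::vector<double>, 2>
    localStraighten(const std::vector<double>& c1, const std::vector<double>& c2,
                    const std::array<std::array<double, 2>, 2>& J) {
        const double det = J[0][0] * J[1][1] - J[0][1] * J[1][0];
        const double inv[2][2] = {{ J[1][1] / det, -J[0][1] / det},
                                  {-J[1][0] / det,  J[0][0] / det}};
        std::array<std::vector<double>, 2> out;
        for (int i = 0; i < 2; ++i) {
            out[i].resize(c1.size());
            // A linear combination of polynomials combines their coefficient vectors.
            for (std::size_t k = 0; k < c1.size(); ++k)
                out[i][k] = inv[i][0] * c1[k] + inv[i][1] * c2[k];
        }
        return out;
    }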

GENERAL SCHEMA OF THE SUBDIVISION ALGORITHM Step #1 - applying a preconditioning step to the equations; Step #2 - reducing the domain; Step #3 - if the reduction ratio is too small, splitting the domain.

REDUCING THE DOMAIN A possible reduction strategy is: find the first and the last roots of the enveloping polynomials m_j(f; x_j) and M_j(f; x_j) in the given interval [a_j, b_j], and

reduce the domain to the interval [μ_j, μ̄_j] delimited by these roots, as we proved with the corollary of the projection lemma. An effective search for these roots results in a fast and efficient process.

REDUCING THE DOMAIN Another approach is IPP, the Interval Projected Polyhedron method. This method is based on the convex hull

property: we reduce the search domain for the root to the interval where the convex hull of the control polygon cuts the x-axis.

The following picture shows the improvement of the first approach compared to IPP, which can lead to a significant change in the behavior of the algorithm.

GENERAL SCHEMA OF THE SUBDIVISION ALGORITHM Step #1 - applying a preconditioning step to the equations; Step #2 - reducing the domain; Step #3 - if the reduction ratio is too small, splitting the domain.

SUBDIVISION STRATEGY We check the signs of the control coefficients of the polynomials f_i on the domain. If, for one of the nonzero polynomials f_i, its vector of control coefficients has no sign change, then D does not contain any root and should be excluded. Otherwise, we split the domain in half in the direction j where

b_j - a_j is maximal and larger than a given minimal size.

SUBDIVISION STRATEGY Another variant of the method divides the domain only in the directions j for which b_j - a_j is larger than the prescribed size and for which the coefficients of m_j(f; x_j) are not all positive and those of M_j(f; x_j) are not all negative. This method allows us to accept domains that are better suited to the geometry of the roots, but a post-processing step for

gluing together connected domains may be required.

GOOD TO KNOW: The algorithm we presented was implemented in the C++ library synaps: http://www-sop.inria.fr/teams/galaad-static/

EXPERIMENTS Our objective in these experiments is to evaluate the impact of reduction approaches compared with subdivision techniques, with and without preconditioning. The methods that we compare are the following: sbd stands for subdivision; rd stands for reduction; sbds stands for subdivision using the global preconditioner; rds stands for reduction using the global preconditioner; rdl stands for reduction using local straightening.

For each method and example, we present the following data: the number of iterations of the process, the number of subdivision steps, the time in milliseconds on an Intel Pentium 4 2.0 GHz with 512 MB of RAM, the number of domains computed, and the size of the result.

Details: The most interesting characteristic for us is the number of iterations. Simple roots are found using bisection; changing this choice can improve the running time, but it will not change the number of iterations. The subdivision rules are based on splitting

along the variable with the largest range.

RESULTS Implicit curve intersection problems defined by bihomogeneous polynomials: Case #1, Case #2.

RESULTS Intersection of two curves: Case #3, Case #4.

RESULTS

Case #5 - self-intersection; Case #6 - rational curves.

ANALYSIS OF RESULTS

The superiority of the reduction approaches was observed, mainly of reduction with the local strategy (rdl), which converged in the lowest number of iterations. In complex cases (case 4, for example), the reduction and subdivision methods with preconditioning (sbds, rdl, rds) perform better than the classical reduction or subdivision (rd, sbd).

ANALYSIS OF RESULTS In most examples, only rds and rdl provide a good answer. In these examples, the maximal difference between the coefficients of the upper and the lower enveloping polynomials (which indicates potential instability during the computations) remains much smaller than the precision we expect on the roots. Since our reduction principle is based on projection,

difficulties may arise when we increase the number of variables in our systems.

Another experiment presents problems in 6 variables that come from the robotics community. In this experiment, we keep the three reduction methods rd, rds and rdl, and add combinations of them: reduction before and after local straightening (rd + rdl), reduction before and after global preconditioning (rd + rds), and all three techniques (rd + rdl + rds).

Combining the reductions is an improvement, and the global preconditioner tends to be better than local straightening. If we look at the first table, we see that the numbers of iterations of rdl and of rd are close, but this is not the case for rds. The reason is that rdl uses local information (a Jacobian evaluation) while rds uses global information computed from all the coefficients of the system.

These experiments first show that both subdivision and reduction methods are poor solutions if they are not preconditioned, but that, with the same preconditioner, reduction will most of the time beat subdivision.

CONCLUSION We presented algorithms for solving polynomial equations based on subdivision methods. The innovation of this approach consists of two main additions: the integration of preconditioning steps that

transform the system so as to achieve convergence in fewer iterations, and an improved use of reduction strategies. We have shown how to improve the reduction strategy by relying on efficient univariate root-finding methods within the multivariate solver.

CONCLUSION In the past, articles in the field have argued that the reduction strategy is not interesting because these methods do not prevent many subdivisions.

In our experiments it can be clearly seen that reduction may save many subdivisions. We emphasize that, despite the many advantages of the reduction strategy, the experiments demonstrate that it is better to use a preconditioned subdivision than a pure, unpreconditioned reduction. In conclusion, we need to understand what kind of problem we are dealing with and adapt the best strategy to solve it with the tools we

have acquired today.
