In this work, a derivative-free conjugate gradient method for large-scale symmetric nonlinear equations is proposed. The basic idea combines the Newton direction of the conjugate gradient method with the quasi-Newton approach from the work of Andrei, using a standard secant equation. The aim is to reduce the number of iterations, the CPU time, and the number of function evaluations. The search direction is obtained within the usual conjugate gradient framework via a new non-monotone line-search procedure. The proposed scheme was implemented in MATLAB, and the computational results on the set of test problems show that the algorithm substantially outperforms the known conjugate gradient methods of Andrei. The derivative-free nature of the proposed method gives it an advantage on relatively large-scale problems, since it avoids computing the Jacobian or its inverse. The computed parameter β_k improves the efficiency of the algorithm by reducing the function values significantly. Compared with some existing methods, the numerical results on the given benchmark test problems show that the proposed algorithm is practically effective: it is efficient in terms of computation time and competitive in the number of iterations, and is therefore well suited to solving systems of nonlinear equations.
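To make the overall structure of such a scheme concrete, the following is a minimal sketch of a derivative-free conjugate gradient iteration for F(x) = 0 with a non-monotone backtracking line search on the merit function f(x) = ||F(x)||²/2. It is not the method proposed here: the β_k formula (a Polak-Ribière-type ratio computed from residuals in place of gradients), the Zhang-Hager-style non-monotone reference value, the safeguard on the direction, and all constants are illustrative assumptions chosen only to show how the residual replaces derivative information.

```python
import numpy as np

def dfcg_solve(F, x0, max_iter=500, tol=1e-6, sigma=1e-4, rho=0.5, eta=0.85):
    """Sketch of a derivative-free CG iteration for F(x) = 0.
    beta_k, the line-search rule, and the safeguard below are illustrative
    placeholders, not the exact quantities proposed in the paper."""
    x = np.asarray(x0, dtype=float)
    Fx = F(x)
    d = -Fx                       # initial direction: residual (no Jacobian used)
    C, Q = 0.5 * Fx @ Fx, 1.0     # non-monotone reference value and its weight
    for k in range(max_iter):
        if np.linalg.norm(Fx) <= tol:
            return x, k
        # non-monotone backtracking on f(x) = ||F(x)||^2 / 2
        alpha = 1.0
        while True:
            x_new = x + alpha * d
            F_new = F(x_new)
            f_new = 0.5 * F_new @ F_new
            if f_new <= C - sigma * alpha**2 * (d @ d) or alpha < 1e-10:
                break
            alpha *= rho
        # illustrative PR-type beta using residuals in place of gradients
        y = F_new - Fx
        beta = max((F_new @ y) / (Fx @ Fx), 0.0)
        d_trial = -F_new + beta * d
        # safeguard: fall back to the residual direction if the trial
        # direction is not sufficiently aligned with -F_new
        if F_new @ d_trial > -1e-3 * (F_new @ F_new):
            d_trial = -F_new
        d = d_trial
        # Zhang-Hager-style update of the non-monotone reference value
        Q_new = eta * Q + 1.0
        C = (eta * Q * C + f_new) / Q_new
        Q = Q_new
        x, Fx = x_new, F_new
    return x, max_iter

# small illustrative symmetric system: F(x) = x + 0.3*sin(x) - 1
if __name__ == "__main__":
    F = lambda x: x + 0.3 * np.sin(x) - 1.0
    sol, iters = dfcg_solve(F, np.zeros(5))
    print(iters, np.linalg.norm(F(sol)))
```

Because only residual values F(x) are evaluated, the sketch reflects the main advantage claimed above: no Jacobian (or Jacobian inverse) is ever formed, so the per-iteration cost stays linear in the problem dimension.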