Search results for: equivalent linear approach
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 17257

16507 [Keynote Talk]: Ultrasound Assisted Synthesis of ZnO of Different Morphologies by Solvent Variation

Authors: Durata Haciu, Berti Manisa, Ozgur Birer

Abstract:

ZnO nanoparticles have been synthesized by ultrasonic irradiation at 50 °C from simple linear alcohols and water/ethanol mixtures. By changing the composition of the solvent, the particle shape could be altered. While no product was obtained from methanolic solutions, sheet-like lamellar structures prevailed in ethanol, and n-propanol and n-butanol resulted in needle-like structures. The morphology of ZnO can thus be tailored in a simple and relatively less time-consuming way by varying the solvent under ultrasonic irradiation. Varying the morphology and size of ZnO also provides a means for modulating the band gap. Although the chemical effects of ultrasound do not arise from direct interaction with molecular species, the high energy derived from acoustic cavitation creates a unique interaction of energy and matter with great potential for synthesis.

Keywords: ultrasound, ZnO, linear alcohols, morphology

Procedia PDF Downloads 242
16506 Effects of Charge Fluctuating Positive Dust on Linear Dust-Acoustic Waves

Authors: Sanjit Kumar Paul, A. A. Mamun, M. R. Amin

Abstract:

The linear propagation of the dust-acoustic wave in a dusty plasma consisting of Boltzmann-distributed electrons and ions and mobile, charge-fluctuating positive dust grains has been investigated by employing the reductive perturbation method. It has been shown that dust charge fluctuation is a source of dissipation and is responsible for the formation of dust-acoustic waves in such a dusty plasma. The basic features of these dust-acoustic waves have been identified. A new laboratory experiment is proposed that should be able to identify the basic features of the dust-acoustic waves predicted in this theoretical investigation.

Keywords: dust acoustic waves, dusty plasma, Boltzmann distributed electrons, charge fluctuation

Procedia PDF Downloads 639
16505 Training Program for Kindergarten Teachers on Learning through Project Approach

Authors: Dian Hartiningsih, Miranda Diponegoro, Evita Eddie Singgih

Abstract:

In facing the 21st century, children need to be prepared to reach their optimum developmental level, which encompasses all aspects of growth, and to achieve learning goals that include not only knowledge and skill but also disposition and feeling. Teachers, as the forefront of education, need to be equipped with the understanding and skill of a learning method that can prepare children to face this 21st-century challenge. The project approach is an approach that utilizes active learning, which is beneficial for children. The subjects of this research are kindergarten teachers at Dwi Matra Kindergarten and Kirana Preschool. This is a quantitative study using a before-and-after design. The results suggest that, through a preliminary training program on learning with the project approach, the kindergarten teachers' ability to explain the project approach, including its meaning, benefits and stages, increased significantly, and the teachers' ability to design learning with the project approach also improved significantly. The learning designs that the teachers produced show remarkable results for the first stage of the project approach; however, the designs for the second and third stages were not as optimal. Challenges faced in the research are elaborated further in the discussion.

Keywords: project approach, teacher training, learning method, kindergarten

Procedia PDF Downloads 331
16504 Derivatives Balance Method for Linear and Nonlinear Control Systems

Authors: Musaab Mohammed Ahmed Ali, Vladimir Vodichev

Abstract:

This work deals with a universal control technique, a single controller for linear and nonlinear stabilization and tracking control systems. These systems may be structured as SISO or MIMO, and the parameters of the controlled plants can vary over a wide range. A novel control system design method is introduced: the construction of stable platform orbits using derivative balance, which solves the problem of preserving the stability of a linear system's transfer function under partial substitution by a rational function. The universal controller is proposed as a polar system with multiple orbits to simplify the design procedure, where each orbit represents a single order of the controller transfer function. The designed controller consists of proportional, integral and derivative terms and multiple feedback and feedforward loops. A method for synthesizing the controller parameters is presented. In general, the controller parameters depend on a new polynomial equation in which all parameters are related to one another and take fixed values, with no need for retuning. The simulation results show that the proposed universal controller can stabilize an unlimited number of linear and nonlinear plants and shape the desired, previously specified performance. It is shown that sensor errors and poor performance are completely compensated and cannot affect system performance, and that the effect of disturbances and noise on the controller loop is fully rejected. The technical and economic effects of using the proposed controller have been investigated and compared to adaptive, predictive, and robust controllers. The economic analysis shows the advantage of a single controller with fixed parameters driving an unlimited number of plants compared to the above-mentioned control techniques.

Keywords: derivative balance, fixed parameters, stable platform, universal control

Procedia PDF Downloads 136
16503 Influence of P-Y Curves on Buckling Capacity of Pile Foundation

Authors: Praveen Huded, Suresh Dash

Abstract:

Pile foundations are one of the most preferred deep foundation systems for high-rise or heavily loaded structures. In many instances, failure of pile-founded structures in liquefiable soils has been observed, even in many recent earthquakes. Recent centrifuge and shake-table experiments on two-layered soil systems have credibly shown that failure of a pile foundation can occur because of buckling, as the pile behaves as an unsupported slender structural element once the surrounding soil liquefies. However, the buckling capacity depends largely on the depth of liquefied soil and its residual strength, so it is essential to check the pile against possible buckling failure. The beam on nonlinear Winkler foundation approach is one of the most efficient methods to model pile-soil behavior in liquefiable soil. The pile-soil interaction is modelled through p-y springs, and different authors have proposed different types of p-y curves for liquefiable soil. In the present paper, the influence of two such p-y curves on the buckling capacity of a pile foundation is studied, considering initial geometric imperfections and the nonlinear behavior of the pile foundation. The proposed method is validated against experimental results. A significant difference in the buckling capacity is observed for the two p-y curves used in the analysis. A parametric study is conducted to understand the influence of pile diameter, pile flexural rigidity, different initial geometric imperfections, and different soil relative densities on the buckling capacity of the pile foundation.
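
As a rough point of reference for the buckling check described above, the sketch below treats the pile over the liquefied depth as an unsupported Euler column; the study itself uses a beam-on-nonlinear-Winkler-foundation model with p-y springs (e.g. in OpenSees), and the material and geometry values here are illustrative assumptions, not the paper's data.

```python
import math

def euler_buckling_load(E, I, unsupported_length, k=1.0):
    """Elastic critical buckling load of a pile treated as an unsupported
    column over the liquefied depth (Euler formula, effective length k*L)."""
    return (math.pi ** 2) * E * I / (k * unsupported_length) ** 2

# Illustrative numbers (not from the paper): 0.5 m diameter concrete pile,
# 8 m of liquefied soil, pinned-pinned end conditions.
E = 30e9                       # Young's modulus of concrete, Pa
d = 0.5                        # pile diameter, m
I = math.pi * d ** 4 / 64      # second moment of area, m^4
P_cr = euler_buckling_load(E, I, unsupported_length=8.0, k=1.0)
print(f"Critical buckling load ~ {P_cr / 1e3:.0f} kN")
```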

Keywords: pile foundation, liquefaction, buckling load, non-linear p-y curve, OpenSees

Procedia PDF Downloads 164
16502 Chemometric Analysis of Raw Milk Quality Originating from Conventional and Organic Dairy Farming in AP Vojvodina, Serbia

Authors: Sanja Podunavac-Kuzmanović, Denis Kučević, Strahinja Kovačević, Milica Karadžić, Lidija Jevrić

Abstract:

The present study describes the application of chemometric methods to the analysis of milk samples collected from a conventional dairy farm and an organic dairy farm in AP Vojvodina, Republic of Serbia. The chemometric analysis included univariate regression modeling and the Analysis of Variance (ANOVA) method. ANOVA was used to determine the differences in fatty acid content between the milk samples from the conventional and organic farms. The results of the ANOVA testing indicate a highly statistically significant difference between the content of saturated and unsaturated fatty acids in the two dairy farming systems. In addition, linear univariate models were obtained describing the relationship between milk fat content and saturated fatty acid content, and between milk fat content and unsaturated fatty acid content. The models obtained from the milk samples originating from organic farming are statistically better than the models based on the milk samples from conventional farming.
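
A minimal sketch of the two chemometric steps described above (a one-way ANOVA comparing fatty acid content between the two farming systems, and a univariate linear model of saturated fatty acid content versus milk fat), using hypothetical sample values rather than the study's data:

```python
import numpy as np
from scipy import stats

# Hypothetical saturated fatty acid contents (g/100 g milk) for illustration
conventional = np.array([2.61, 2.74, 2.55, 2.80, 2.69])
organic      = np.array([2.35, 2.28, 2.41, 2.30, 2.44])

# One-way ANOVA between the two farming systems
f_stat, p_value = stats.f_oneway(conventional, organic)
print(f"ANOVA: F = {f_stat:.2f}, p = {p_value:.4f}")

# Univariate linear model: saturated fatty acid content vs. total milk fat
milk_fat = np.array([3.6, 3.9, 3.5, 4.0, 3.8])
sfa      = np.array([2.4, 2.7, 2.3, 2.8, 2.6])
result = stats.linregress(milk_fat, sfa)
print(f"slope = {result.slope:.2f}, R^2 = {result.rvalue**2:.3f}")
```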

Keywords: chemometrics, milk, organic farming, quality control

Procedia PDF Downloads 236
16501 Convergence of Generalized Jacobi, Gauss-Seidel and Successive Overrelaxation Methods for Various Classes of Matrices

Authors: Manideepa Saha, Jahnavi Chakrabarty

Abstract:

Generalized Jacobi (GJ) and Generalized Gauss-Seidel (GGS) methods are more effective than the conventional Jacobi and Gauss-Seidel methods for solving linear systems of equations. It is known that the GJ and GGS methods converge for strictly diagonally dominant (SDD) matrices and for M-matrices. In this paper, we study the convergence of the GJ and GGS methods for symmetric positive definite (SPD) matrices, L-matrices and H-matrices. We introduce a generalization of the successive overrelaxation (SOR) method for solving linear systems and discuss its convergence for the classes of SDD matrices, SPD matrices, M-matrices, L-matrices and H-matrices. The advantages of the generalized SOR method over the GJ, GGS, and SOR methods are established through numerical experiments.
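
For reference, a minimal sketch of the classical (non-generalized) Jacobi, Gauss-Seidel and SOR iterations for Ax = b on a strictly diagonally dominant matrix; the generalized variants studied in the paper split off a banded part of A rather than only the diagonal:

```python
import numpy as np

def jacobi(A, b, x0, iters=50):
    """Classical Jacobi iteration: split A = D + R and update x = D^-1 (b - R x)."""
    D = np.diag(np.diag(A))
    R = A - D
    x = x0.copy()
    for _ in range(iters):
        x = np.linalg.solve(D, b - R @ x)
    return x

def sor(A, b, x0, omega=1.2, iters=50):
    """SOR sweep; omega = 1 recovers Gauss-Seidel."""
    n = len(b)
    x = x0.copy()
    for _ in range(iters):
        for i in range(n):
            sigma = A[i, :i] @ x[:i] + A[i, i + 1:] @ x[i + 1:]
            x[i] = (1 - omega) * x[i] + omega * (b[i] - sigma) / A[i, i]
    return x

A = np.array([[4.0, -1.0, 0.0], [-1.0, 4.0, -1.0], [0.0, -1.0, 4.0]])  # SDD
b = np.array([15.0, 10.0, 10.0])
x0 = np.zeros(3)
print(jacobi(A, b, x0), sor(A, b, x0))
```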

Keywords: convergence, Gauss-Seidel, iterative method, Jacobi, SOR

Procedia PDF Downloads 189
16500 Inventory Management System of Seasonal Raw Materials of Feeds at San Jose Batangas through Integer Linear Programming and VBA

Authors: Glenda Marie D. Balitaan

Abstract:

The branch of business management that deals with inventory planning and control is known as inventory management. It comprises keeping track of supply levels and forecasting demand, as well as scheduling when and how to order. Keeping excess inventory results in a loss of money, takes up physical space, and raises the risk of damage, spoilage, and loss. On the other hand, too little inventory frequently disrupts operations and raises the possibility of low customer satisfaction, both of which can be detrimental to a company's reputation. The United Victorious Feed Mill Corporation's present inventory management practices were assessed in terms of inventory level, warehouse allocation, ordering frequency, shelf life, and production requirements. To help the company achieve its optimal level of inventory, a mathematical model was created using Integer Linear Programming. The objective function was to minimize the cost of purchasing US soya and yellow corn, which are seasonal raw materials. Warehouse space, annual production requirements, and shelf life were all considered as constraints. To ensure that the user only needs one application to record all relevant information, such as production output and deliveries, the researcher built a Visual Basic system. Additionally, the system allows management to change the model's parameters.
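
A minimal sketch of the kind of purchasing model described above, with hypothetical prices, warehouse capacity and requirements; the LP relaxation is solved here with scipy.optimize.linprog, whereas the study enforces integer order quantities and implements the user front end in Visual Basic:

```python
from scipy.optimize import linprog

# Decision variables: tons of US soya (x1) and yellow corn (x2) to purchase.
# All data below are hypothetical, for illustration only.
cost = [520.0, 310.0]            # purchase cost per ton
A_ub = [[1.0, 1.0],              # warehouse capacity: x1 + x2 <= 900 tons
        [-1.0, 0.0],             # soya requirement:   x1 >= 350 tons
        [0.0, -1.0]]             # corn requirement:   x2 >= 500 tons
b_ub = [900.0, -350.0, -500.0]

res = linprog(c=cost, A_ub=A_ub, b_ub=b_ub,
              bounds=[(0, None), (0, None)], method="highs")
print(res.x, res.fun)            # optimal order quantities and total cost
```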

Keywords: inventory management, integer linear programming, inventory management system, feed mill

Procedia PDF Downloads 83
16499 Antioxidant Activity of the Methanolic Extract and Antimicrobial Activity of the Essential Oil of Rosmarinus officinalis L. Grown in Algeria

Authors: Nassim Belkacem, Amina Azzam, Dalila Haouchine, Kahina Bennacer, Samira Soufit

Abstract:

Objective: To evaluate the antioxidant activity of the methanolic extract and the antimicrobial activity of the essential oil of the aerial parts of Rosmarinus officinalis L. collected in the region of Bejaia (north-central Algeria). Materials and methods: The polyphenol and flavonoid contents of the methanolic extract were measured. The antioxidant activity was evaluated using two methods: the ABTS method and the DPPH assay. The antimicrobial activity was studied by the agar diffusion method against five bacterial strains (three Gram-positive and two Gram-negative) and one fungus. Results: The total polyphenol and flavonoid contents were about 43.8 mg gallic acid equivalent per gram (GA Eq/g) and 7.04 mg quercetin equivalent per gram (Q Eq/g), respectively. In the ABTS assay, the rosemary extract showed an inhibition of 98.02% at a concentration of 500 µg/ml, with a half-maximal inhibitory concentration (IC50) of 194.92 µg/ml. The DPPH assay showed that the rosemary extract has an inhibition of 94.67%, with an IC50 value of 17.87 µg/ml, which is higher than that of butylhydroxyanisole (BHA), about 6.03 µg/ml, and that of ascorbic acid, about 1.24 µg/ml. The yield of rosemary essential oil obtained by hydrodistillation was 1.42%. Based on the measured inhibition diameters, the essential oil showed different antimicrobial activities against the six microbes tested. Escherichia coli from the University Hospital (UH), Streptococcus aureus (UH) and Pseudomonas aeruginosa ATCC had a minimum inhibitory concentration (MIC) of 62.5 µl/ml, whereas Bacillus sp. (UH) and Staphylococcus aureus ATCC had an MIC of 125 µl/ml. The inhibition zone against Candida sp. was about 24 mm. The aromatograms showed that the essential oil of rosemary exerts an antifungal activity greater than its antibacterial activity.

Keywords: Rosmarinus officinalis L., maceration, essential oil, antioxidant, antimicrobial activity

Procedia PDF Downloads 522
16498 Proposed Algorithms to Assess Concussion Potential in Rear-End Motor Vehicle Collisions: A Meta-Analysis

Authors: Rami Hashish, Manon Limousis-Gayda, Caitlin McCleery

Abstract:

Introduction: Mild traumatic brain injuries, also referred to as concussions, represent an increasing burden to society. Due to limited objective diagnostic measures, concussions are diagnosed by assessing subjective symptoms, often leading to disputes over their presence. Common biomechanical measures associated with concussion are high linear and/or angular acceleration of the head. With regard to linear acceleration, approximately 80 g has previously been shown to equate to a 50% probability of concussion. Motor vehicle collisions (MVCs) are a leading cause of concussion, due to the high head accelerations experienced. The change in velocity (delta-V) of a vehicle in an MVC is an established metric for impact severity. As acceleration is the rate of change of velocity (delta-V) with respect to time, the purpose of this paper is to determine the relation between delta-V (and occupant parameters) and linear head acceleration. Methods: A meta-analysis was conducted on manuscripts collected using the following keywords: head acceleration, concussion, brain injury, head kinematics, delta-V, change in velocity, motor vehicle collision, and rear-end. Ultimately, 280 studies were surveyed, 14 of which fulfilled the inclusion criteria as studies investigating the human response to impacts and reporting head acceleration and the delta-V of the occupant's vehicle. Statistical analysis was conducted with SPSS and R. A best-fit line analysis allowed for an initial understanding of the relation between head acceleration and delta-V. To further investigate the effect of occupant parameters on head acceleration, a quadratic model and a full linear mixed model were developed. Results: From the 14 selected studies, 139 crashes were analyzed, with head accelerations and delta-V values ranging from 0.6 to 17.2 g and 1.3 to 11.1 km/h, respectively. Initial analysis indicated that the best line of fit (Model 1) was defined as Head Acceleration = 0.465
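
A minimal sketch of fitting a best-fit line and a quadratic model relating head acceleration to delta-V, using hypothetical pairs within the reported ranges; the study itself fits these models, plus a full linear mixed model, to the 139 real crash records:

```python
import numpy as np

# Hypothetical (delta-V [km/h], head acceleration [g]) pairs, illustration only
delta_v  = np.array([1.5, 3.0, 4.5, 6.0, 7.5, 9.0, 10.5])
head_acc = np.array([0.8, 2.1, 3.4, 5.0, 7.1, 9.6, 12.8])

# Best-fit line and quadratic model
lin  = np.polyfit(delta_v, head_acc, deg=1)
quad = np.polyfit(delta_v, head_acc, deg=2)
print("linear coefficients:   ", lin)
print("quadratic coefficients:", quad)

# Predicted head acceleration at a given delta-V
print("prediction at 8 km/h:", np.polyval(quad, 8.0), "g")
```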

Keywords: acceleration, brain injury, change in velocity, Delta-V, TBI

Procedia PDF Downloads 233
16497 Lateral Torsional Buckling Resistance of Trapezoidally Corrugated Web Girders

Authors: Annamária Käferné Rácz, Bence Jáger, Balázs Kövesdi, László Dunai

Abstract:

Due to the numerous advantages of steel corrugated web girders, their field of application is growing for bridges as well as for buildings. The global stability resistance of such girders is significantly larger than that of conventional I-girders with flat webs; thus, the amount of structural steel material can be significantly reduced. Design codes and specifications do not provide clear and complete rules or recommendations for the determination of the lateral torsional buckling (LTB) resistance of corrugated web girders. Therefore, the authors made a thorough investigation of the LTB resistance of corrugated web girders. Finite element (FE) simulations have been performed to develop new design formulas for the determination of the LTB resistance of trapezoidally corrugated web girders. The FE model is developed using geometrically and materially nonlinear analysis with equivalent geometric imperfections (GMNI analysis). The equivalent geometric imperfections account for the initial geometric imperfections and the residual stresses coming from rolling, welding and flame cutting. An imperfection sensitivity analysis was performed to determine the necessary magnitudes, considering only first-eigenmode shape imperfections. With the help of the validated FE model, an extended parametric study was carried out to investigate the LTB resistance for different trapezoidal corrugation profiles. First, the critical moment of a specific girder was calculated by the FE model. The critical moments from the FE calculations are compared to previous analytical calculation proposals. Then, nonlinear analysis was carried out to determine the ultimate resistance. Based on the numerical investigations, new proposals are developed for the determination of the LTB resistance of trapezoidally corrugated web girders, through a modification factor applied to the design method for conventional flat web girders.

Keywords: corrugated web, lateral torsional buckling, critical moment, FE modeling

Procedia PDF Downloads 283
16496 Neural Network in Fixed Time for Collision Detection between Two Convex Polyhedra

Authors: M. Khouil, N. Saber, M. Mestari

Abstract:

In this paper, a different architecture of a collision detection neural network (DCNN) is developed. This network, which has been carefully reviewed, has enabled us to solve, with a new approach, the problem of collision detection between two convex polyhedra in fixed time (O(1) time). We used two types of neurons, linear and threshold logic, which simplified the actual implementation of all the networks proposed. The study of collision detection is divided into two parts: the collision between a point and a polyhedron, and then the collision between two convex polyhedra. The aim of this research is to determine, through the AMAXNET network, a minimax point in fixed time, which allows us to detect the presence of a potential collision.

Keywords: collision identification, fixed time, convex polyhedra, neural network, AMAXNET

Procedia PDF Downloads 422
16495 Detecting Earnings Management via Statistical and Neural Networks Techniques

Authors: Mohammad Namazi, Mohammad Sadeghzadeh Maharluie

Abstract:

Predicting earnings management is vital for capital market participants, financial analysts and managers. The aim of this research is to answer the following question: Is there a significant difference between the regression model and neural network models in predicting earnings management, and which one leads to a superior prediction? To approach this question, a linear regression (LR) model was compared with two neural networks: a multi-layer perceptron (MLP) and a generalized regression neural network (GRNN). The population of this study includes 94 companies listed on the Tehran Stock Exchange (TSE) from 2003 to 2011. After the results of all models were acquired, ANOVA was applied to test the hypotheses. In general, the summary of statistical results showed that the precision of the GRNN did not differ significantly from that of the MLP. In addition, the mean square errors of the MLP and GRNN showed a significant difference with respect to the multivariable LR model. These findings support the notion of nonlinear behavior of earnings management. Therefore, it is more appropriate for capital market participants to analyze earnings management based on neural network techniques rather than linear regression models.
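
A minimal sketch of the model comparison described above using scikit-learn, with synthetic features standing in for the TSE firm data; GRNN has no standard scikit-learn implementation, so only the linear regression and MLP are shown:

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 5))                      # synthetic firm-level predictors
y = np.sin(X[:, 0]) + 0.5 * X[:, 1] ** 2 + rng.normal(scale=0.1, size=300)  # nonlinear target

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

lr = LinearRegression().fit(X_tr, y_tr)
mlp = MLPRegressor(hidden_layer_sizes=(20,), max_iter=2000, random_state=0).fit(X_tr, y_tr)

print("LR  MSE:", mean_squared_error(y_te, lr.predict(X_te)))
print("MLP MSE:", mean_squared_error(y_te, mlp.predict(X_te)))
```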

Keywords: earnings management, generalized linear regression, neural networks, multi-layer perceptron, Tehran Stock Exchange

Procedia PDF Downloads 421
16494 Evaluating the Dosimetric Performance for 3D Treatment Planning System for Wedged and Off-Axis Fields

Authors: Nashaat A. Deiab, Aida Radwan, Mohamed S. Yahiya, Mohamed Elnagdy, Rasha Moustafa

Abstract:

This study evaluates the dosimetric performance of our institution's 3D treatment planning system for wedged and off-axis 6 MV photon beams, guided by the recommended QA tests documented in AAPM TG-53, the NCS Report 15 test packages, IAEA TRS-430 and ESTRO Booklet No. 7. The study was performed on an Elekta Precise linear accelerator designed for a clinical range of 4, 6 and 15 MV photon beams, with asymmetric jaws and a fully integrated multileaf collimator that enables high conformance to the target with sharp field edges. Ten tests were applied on a solid water-equivalent phantom along with a 2D array dose detection system. Doses calculated with the 3D treatment planning system PrecisePLAN were compared with measured doses to verify that the dose calculations are accurate for simple situations such as square and elongated fields, different SSDs, beam modifiers (e.g. wedges, blocks, MLC-shaped fields) and asymmetric collimator settings. The QA results showed dosimetric accuracy of the TPS within the specified tolerance limits, with the following exceptions: for the large elongated wedged field, the errors on and outside the central axis were 0.2% and 0.5%, respectively, and for the off-planned and off-axis elongated fields, the errors in the region outside the central axis of the beam were 0.2% and 1.1%, respectively. The investigated dosimetric results yielded differences within the accepted tolerance levels as recommended. Differences between dose values predicted by the TPS and values measured at the same point result from limitations of the dose calculation, uncertainties in the measurement procedure, or fluctuations in the output of the accelerator.

Keywords: quality assurance, dose calculation, wedged fields, off-axis fields, 3D treatment planning system, photon beam

Procedia PDF Downloads 445
16493 Application of a Universal Distortion Correction Method in Stereo-Based Digital Image Correlation Measurement

Authors: Hu Zhenxing, Gao Jianxin

Abstract:

Stereo-based digital image correlation (also referred to as three-dimensional (3D) digital image correlation (DIC)) is a technique for both 3D shape and surface deformation measurement of a component, which has found increasing application in academia and industry. The accuracy of the reconstructed coordinates depends on many factors, such as the configuration of the setup, stereo-matching, distortion, etc. Most of these factors have been investigated in the literature. For instance, the configuration of a binocular vision system determines the systematic errors, while the stereo-matching errors depend on the speckle quality and the matching algorithm, which can only be controlled within a limited range. The distortion, in particular, is non-linear in a complex image acquisition system, so distortion correction should be carefully considered. Moreover, the distortion function is difficult to formulate with conventional models in a complex image acquisition system, such as when microscopes and other complex lenses are involved, and the errors of the distortion correction propagate to the reconstructed 3D coordinates. To address this problem, an accurate mapping method based on 2D B-spline functions is proposed in this study. The mapping functions are used to convert the distorted coordinates onto an ideal plane without distortions. This approach is suitable for any image acquisition distortion model. It is used as a prior process to convert the distorted coordinates to ideal positions, which enables the camera to conform to the pinhole model. A procedure of this approach is presented for stereo-based DIC. Using 3D speckle image generation, numerical simulations were carried out to compare the accuracy of the conventional method and the proposed approach.
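
A minimal sketch of the 2D B-spline mapping idea, assuming calibration points for which both distorted and ideal coordinates are known: two smoothing bivariate splines map distorted image coordinates back to the ideal (pinhole) plane before reconstruction. The distortion model and values below are illustrative only.

```python
import numpy as np
from scipy.interpolate import SmoothBivariateSpline

# Hypothetical calibration grid: ideal points and their distorted observations
xi, yi = np.meshgrid(np.linspace(-1, 1, 15), np.linspace(-1, 1, 15))
xi, yi = xi.ravel(), yi.ravel()
r2 = xi**2 + yi**2
xd = xi * (1 + 0.1 * r2)      # simple radial distortion, illustration only
yd = yi * (1 + 0.1 * r2)

# Fit one spline per coordinate: (distorted x, y) -> ideal x and ideal y
fx = SmoothBivariateSpline(xd, yd, xi, kx=3, ky=3)
fy = SmoothBivariateSpline(xd, yd, yi, kx=3, ky=3)

# Undistort a measured point before stereo matching / triangulation
x_corr = fx.ev(0.55, -0.30)
y_corr = fy.ev(0.55, -0.30)
print(x_corr, y_corr)
```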

Keywords: distortion, stereo-based digital image correlation, b-spline, 3D, 2D

Procedia PDF Downloads 498
16492 Globally Convergent Sequential Linear Programming for Multi-Material Topology Optimization Using Ordered Solid Isotropic Material with Penalization Interpolation

Authors: Darwin Castillo Huamaní, Francisco A. M. Gomes

Abstract:

The aim of multi-material topology optimization (MTO) is to obtain the optimal topology of structures composed of several materials, according to a given set of constraints and cost criteria. In this work, we seek the optimal distribution of materials in a domain such that the flexibility of the structure is minimized, under certain boundary conditions and the action of external forces. In the single-material case, each element of the discretized domain is represented by one of two values of a function: the value is 1 if the element belongs to the structure and 0 if the element is empty. A common way to avoid the high computational cost of solving integer-variable optimization problems is to adopt the Solid Isotropic Material with Penalization (SIMP) method. This method relies on a continuous interpolation function, a power function whose base variable represents a pseudo-density at each point of the domain. For proper exponent values, the SIMP method penalizes intermediate densities, since values other than 0 or 1 usually do not have a physical meaning for the problem. Several extensions of the SIMP method have been proposed for the multi-material case. The one we explore here is the ordered SIMP method, which has the advantage of not requiring additional variables to represent material selection, so the computational cost is independent of the number of materials considered. Although the number of variables is not increased by this algorithm, the optimization subproblems generated at each iteration cannot be solved by methods that rely on second derivatives, due to the cost of calculating them. To overcome this, we apply a globally convergent version of the sequential linear programming method, which solves a sequence of linear approximations of the optimization problem.
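
A minimal sketch of the SIMP-style stiffness interpolation underlying the method, together with a simplified interval-wise variant standing in for the ordered SIMP scheme (the actual ordered SIMP scaling and translation coefficients, and the SLP subproblems, are not reproduced; all numbers are illustrative):

```python
import numpy as np

def simp_young_modulus(rho, E0, Emin=1e-9, p=3.0):
    """Classical single-material SIMP: stiffness as a power of pseudo-density."""
    return Emin + rho ** p * (E0 - Emin)

def ordered_simp_young_modulus(rho, densities, moduli, p=3.0):
    """Ordered-SIMP-style idea: the normalized density axis is partitioned into
    intervals, one per candidate material, and a power law is used in each
    interval. Simplified, illustrative implementation only."""
    densities = np.asarray(densities, dtype=float)   # sorted normalized densities
    moduli = np.asarray(moduli, dtype=float)
    i = np.clip(np.searchsorted(densities, rho, side="right") - 1,
                0, len(densities) - 2)
    d0, d1 = densities[i], densities[i + 1]
    E0, E1 = moduli[i], moduli[i + 1]
    t = (rho - d0) / (d1 - d0)                        # position inside the interval
    return E0 + (t ** p) * (E1 - E0)

print(simp_young_modulus(0.5, E0=210e9))
print(ordered_simp_young_modulus(0.6, [0.0, 0.4, 1.0], [1e-3, 70e9, 210e9]))
```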

Keywords: global convergence, multi-material design, ordered SIMP, sequential linear programming, topology optimization

Procedia PDF Downloads 315
16491 Investigation of the Material Behaviour of Polymeric Interlayers in Broken Laminated Glass

Authors: Martin Botz, Michael Kraus, Geralt Siebert

Abstract:

The use of laminated glass is gaining increasing importance in structural engineering. For safety reasons, at least two glass panes are laminated together with a polymeric interlayer. In case of breakage of one or all of the glass panes, the glass fragments remain connected to the interlayer due to adhesion forces, and a certain residual load-bearing capacity is left in the system. The polymer interlayers used in laminated glass show viscoelastic material behavior, i.e. stresses and strains in the interlayer depend on load duration and temperature. In the intact stage, only small strains appear in the interlayer, so the material can be described in a linear way. In the broken stage, large strains can appear, and a nonlinear viscoelastic material theory is necessary. Relaxation tests on two different types of polymeric interlayers are performed at different temperatures and strain amplitudes to determine the boundary of the non-linear material regime. Based on the small-scale specimen results, further tests on broken laminated glass panes are conducted. So-called 'through-crack-bending' (TCB) tests are performed, in which the laminated glass has a defined crack pattern. The test set-up is realized in such a way that one glass layer is still able to transfer compressive stresses, but tensile stresses have to be transferred solely by the interlayer. The TCB tests are also conducted at different temperatures but under constant force (creep tests). The aims of these experiments are to establish whether the results of small-scale tests on the interlayer are transferable to a laminated glass system in the broken stage. In this study, the limits of the applicability of linear viscoelasticity are established for two commercially available polymer interlayers. Furthermore, it is shown that the results of the small-scale tests agree to a certain degree with the results of the large-scale TCB experiments. In a future step, the results can be used to develop material models for the post-breakage performance of laminated glass.
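
A minimal sketch of the kind of linear-viscoelastic relaxation model commonly fitted to such interlayer relaxation tests, a generalized Maxwell (Prony series) modulus; the coefficients are placeholders, not values identified in the study, and the model only covers the small-strain linear regime discussed above:

```python
import numpy as np

def relaxation_modulus(t, E_inf, prony_moduli, prony_times):
    """Prony-series relaxation modulus E(t) = E_inf + sum_i E_i * exp(-t / tau_i)."""
    t = np.asarray(t, dtype=float)
    E = np.full_like(t, E_inf)
    for E_i, tau_i in zip(prony_moduli, prony_times):
        E += E_i * np.exp(-t / tau_i)
    return E

# Placeholder parameters (MPa, s) for a PVB-like interlayer at one temperature
t = np.logspace(-2, 4, 7)                  # 0.01 s to ~3 h
E_t = relaxation_modulus(t, E_inf=0.5,
                         prony_moduli=[400.0, 80.0, 10.0],
                         prony_times=[0.1, 10.0, 1000.0])
print(np.round(E_t, 2))
```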

Keywords: glass breakage, laminated glass, relaxation test, viscoelasticity

Procedia PDF Downloads 121
16490 Digital Phase Shifting Holography in a Non-Linear Interferometer using Undetected Photons

Authors: Sebastian Töpfer, Marta Gilaberte Basset, Jorge Fuenzalida, Fabian Steinlechner, Juan P. Torres, Markus Gräfe

Abstract:

This work introduces a combination of digital phase-shifting holography with a non-linear interferometer using undetected photons. Non-linear interferometers can be used in combination with a measurement scheme called quantum imaging with undetected photons, which allows the wavelength used for sampling an object to be separated from the wavelength detected by the imaging sensor. This method has recently attracted increasing attention, as it allows the use of exotic wavelengths (e.g., mid-infrared, ultraviolet) for object interaction while keeping the detection in spectral regions with highly developed, comparably low-cost imaging sensors. The object information, including its transmission and phase influence, is recorded in the form of an interferometric pattern. To collect this information, this work combines quantum imaging with undetected photons with digital phase-shifting holography using a minimal sampling of the interference. With this, the quantum imaging scheme is extended in its measurement capabilities, bringing it one step closer to application. Quantum imaging with undetected photons uses correlated photons generated by spontaneous parametric down-conversion in a non-linear interferometer to create indistinguishable photon pairs, which leads to an effect called induced coherence without induced emission. Placing an object inside the interferometer changes the interferometric pattern depending on the object's properties. Digital phase-shifting holography records multiple images of the interference with predetermined phase shifts to reconstruct the complete interference shape, which can afterwards be used to analyze the changes introduced by the object and infer its properties. An extensive characterization of this method was performed using a proof-of-principle setup. The measured spatial resolution, phase accuracy, and transmission accuracy are compared for different combinations of camera exposure times and numbers of interference sampling steps. The current limits of this method are identified, indicating room for further improvement. To summarize, this work presents an alternative holographic measurement method using non-linear interferometers in combination with quantum imaging to enable new ways of measuring and to motivate continuing research.
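
For reference, a minimal sketch of the standard four-step phase-shifting reconstruction that digital phase-shifting holography builds on: four interferograms recorded with phase shifts of 0, π/2, π and 3π/2 yield the phase and modulation directly. The paper's minimal-sampling variant is not reproduced here.

```python
import numpy as np

def four_step_phase(I0, I1, I2, I3):
    """Recover phase and modulation from interferograms with shifts 0, pi/2, pi, 3pi/2:
    phi = atan2(I3 - I1, I0 - I2), modulation = 0.5 * sqrt((I3-I1)^2 + (I0-I2)^2)."""
    phi = np.arctan2(I3 - I1, I0 - I2)
    modulation = 0.5 * np.sqrt((I3 - I1) ** 2 + (I0 - I2) ** 2)
    return phi, modulation

# Synthetic example: a tilted phase object sampled on a 64x64 grid
x, y = np.meshgrid(np.linspace(0, 1, 64), np.linspace(0, 1, 64))
true_phi = 4 * np.pi * x + 2 * np.pi * y
frames = [1 + 0.8 * np.cos(true_phi + k * np.pi / 2) for k in range(4)]

phi, mod = four_step_phase(*frames)
print(phi.shape, float(mod.mean()))
```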

Keywords: digital holography, quantum imaging, quantum holography, quantum metrology

Procedia PDF Downloads 92
16489 Protective Effect of Rosemary Extract against Toxicity Induced by Egyptian Naja haje Venom

Authors: Walaa H. Salama, Azza M. Abdel-Aty, Afaf S. Fahmy

Abstract:

Background: The Egyptian cobra, Naja haje (Elapidae), is one of the most common snakes in Egypt, where it is widely distributed, and its envenomation causes multi-organ failure leading to rapid death. Different medicinal plants have shown a protective effect against venom toxicity and may complement conventional antivenom therapy. Aim: The present study was designed to assess the antioxidant capacity of a methanolic extract of rosemary leaves and to evaluate its ability to neutralize the hepatotoxicity induced by Naja haje venom. Methods: The total phenolic and flavonoid contents of the methanolic rosemary extract were estimated, and its antioxidant capacity was determined by the DPPH and ABTS scavenging methods. In addition, the rosemary extract was assessed for anti-venom properties using standard in vitro and in vivo assays. Results: The rosemary extract had high total phenolic and flavonoid contents of 12 ± 2 g gallic acid equivalent per 100 g dry weight (g GAE/100 g dw) and 5.5 ± 0.8 g catechin equivalent per 100 g dry weight (g CE/100 g dw), respectively, and showed high antioxidant capacity. Furthermore, the rosemary extract inhibited in vitro the enzymatic activities of phospholipase A₂, L-amino acid oxidase, and hyaluronidase of the venom in a dose-dependent manner. Moreover, the indirect hemolytic activity and the hepatotoxicity induced by the venom were completely neutralized, as shown by histological studies. Conclusion: The phenolic compounds of rosemary extract, with their potential antioxidant activity, may be considered a promising candidate for future therapeutics in snakebite therapy.

Keywords: antioxidant activity, neutralization, phospholipase A₂ enzyme, snake venom

Procedia PDF Downloads 182
16488 The Limits of the Effectiveness of Digital Advertising: Demonstration by the Economic Approach of Measuring Advertising Effectiveness

Authors: Barkaoui Asma

Abstract:

In this article, we use the economic approach to measuring advertising effectiveness to show the margin of advertising diffusion gained through digital communication. For economists, profit maximization depends on determining the optimal advertising budget. To do this, they use theories of the marginalist school to determine when the maximum level of benefit is reached. Using the economic approach, we show the significant return on investment for advertisers. We then discuss the risks of consumers perceiving advertising pressure.

Keywords: digital advertising, economic approach, effectiveness, pressure

Procedia PDF Downloads 304
16487 A New Reliability Allocation Method Based on Fuzzy Numbers

Authors: Peng Li, Chuanri Li, Tao Li

Abstract:

Reliability allocation is quite important during the early design and development stages of a system, in order to apportion its specified reliability goal among subsystems. This paper improves the fuzzy reliability allocation method and gives concrete procedures for determining the factor set, the factor weight set, the judgment set, and the multi-grade fuzzy comprehensive evaluation. To determine the weights of the factor set, modified trapezoidal fuzzy numbers are proposed to reduce errors caused by subjective factors. To decrease the fuzziness in the fuzzy division, an approximation method based on linear programming is employed. To compute the explicit values of the fuzzy numbers, the centroid method of defuzzification is used. An example is provided to illustrate the application of the proposed reliability allocation method based on fuzzy arithmetic.
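
A minimal sketch of the trapezoidal fuzzy numbers and centroid defuzzification mentioned above, using hypothetical importance judgments; the full procedure (factor sets, multi-grade fuzzy comprehensive evaluation, LP-based approximation) is not reproduced:

```python
import numpy as np

def trapezoid_centroid(a, b, c, d):
    """Centroid (x-coordinate) of a trapezoidal fuzzy number (a, b, c, d)
    with support [a, d] and core [b, c]."""
    return (d**2 + c**2 + c*d - a**2 - b**2 - a*b) / (3.0 * (d + c - a - b))

# Hypothetical fuzzy importance judgments for three allocation factors
fuzzy_weights = [(0.1, 0.2, 0.3, 0.4),
                 (0.3, 0.4, 0.5, 0.6),
                 (0.5, 0.6, 0.7, 0.8)]

crisp = np.array([trapezoid_centroid(*w) for w in fuzzy_weights])
weights = crisp / crisp.sum()          # normalized allocation weights
print(np.round(weights, 3))
```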

Keywords: reliability allocation, fuzzy arithmetic, allocation weight, linear programming

Procedia PDF Downloads 342
16486 High Resolution Satellite Imagery and Lidar Data for Object-Based Tree Species Classification in Quebec, Canada

Authors: Bilel Chalghaf, Mathieu Varin

Abstract:

Forest characterization in Quebec, Canada, is usually assessed based on photo-interpretation at the stand level. For species identification, this often results in a lack of precision. Very high spatial resolution imagery, such as DigitalGlobe, and Light Detection and Ranging (LiDAR) have the potential to overcome the limitations of aerial imagery. To date, few studies have used such data to map a large number of species at the tree level using machine learning techniques. The main objective of this study is to map 11 individual tall tree species (> 17 m) at the tree level using an object-based approach in the broadleaf forest of Kenauk Nature, Quebec. For individual tree crown segmentation, three canopy-height models (CHMs) from LiDAR data were assessed: 1) the original, 2) a filtered, and 3) a corrected model. The corrected CHM gave the best accuracy and was then coupled with imagery to refine tree species crown identification. When compared with photo-interpretation, 90% of the objects represented a single species. For modeling, 313 variables were derived from 16-band WorldView-3 imagery and LiDAR data, using radiance, reflectance, pixel, and object-based calculation techniques. Variable selection procedures were employed to reduce their number from 313 to 16, using only 11 bands to aid reproducibility. For classification, a global approach using all 11 species was compared to a semi-hierarchical hybrid classification approach at two levels: (1) tree type (broadleaf/conifer) and (2) individual broadleaf (five) and conifer (six) species. Five different modeling techniques were used: (1) support vector machine (SVM), (2) classification and regression tree (CART), (3) random forest (RF), (4) k-nearest neighbors (k-NN), and (5) linear discriminant analysis (LDA). Each model was tuned separately for all approaches and levels. For the global approach, the best model was the SVM using eight variables (overall accuracy (OA): 80%, Kappa: 0.77). With the semi-hierarchical hybrid approach, at the tree type level, the best model was the k-NN using six variables (OA: 100%, Kappa: 1.00). At the level of identifying broadleaf and conifer species, the best model was the SVM, with OA of 80% and 97% and Kappa values of 0.74 and 0.97, respectively, using seven variables for both models. This paper demonstrates that a hybrid classification approach gives better results and that using 16-band WorldView-3 with LiDAR data leads to more precise predictions for tree segmentation and classification, especially when the number of tree species is large.
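
A minimal sketch of the model-comparison stage using scikit-learn, with synthetic features standing in for the 16 WorldView-3/LiDAR-derived variables; the same pattern extends to the semi-hierarchical approach by training one model per level:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import RandomForestClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

# Synthetic stand-in: 500 tree crowns, 16 selected variables, 11 species
X, y = make_classification(n_samples=500, n_features=16, n_informative=10,
                           n_classes=11, n_clusters_per_class=1, random_state=0)

models = {
    "SVM":  SVC(kernel="rbf", C=10.0),
    "CART": DecisionTreeClassifier(random_state=0),
    "RF":   RandomForestClassifier(n_estimators=200, random_state=0),
    "k-NN": KNeighborsClassifier(n_neighbors=5),
    "LDA":  LinearDiscriminantAnalysis(),
}
for name, model in models.items():
    acc = cross_val_score(model, X, y, cv=5).mean()
    print(f"{name}: overall accuracy = {acc:.2f}")
```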

Keywords: tree species, object-based, classification, multispectral, machine learning, WorldView-3, LiDAR

Procedia PDF Downloads 134
16485 Effect of Withania somnifera in Alloxan-Induced Diabetic Rabbits

Authors: Farah Ali, Tehreem Fayyaz, Musadiq Idris

Abstract:

The present work was undertaken to investigate the anti-diabetic activity of various extracts of W. somnifera in alloxan-induced diabetic rabbits. Rabbits were acclimatized for a week to standard laboratory temperature. Animals were fed according to a strict schedule (8 am, 3 pm and 10 pm) with green fodder (Medicago sativa) and tap water ad libitum, and were divided into nine groups of six rabbits each in a random manner. Body weights and physical activity of all rabbits were recorded before the start of the experiments. The animals of groups 1 and 2 were given lactose (250 mg/kg, p.o.) and Withania somnifera root powder (100 mg/kg, p.o.), respectively, daily from day 1 to 20. Animals of group 3 were given alloxan (100 mg/kg, i.v.) as a single dose on day 1. Powdered root of Withania somnifera in doses of 100, 150 and 200 mg/kg and its aqueous and ethanol extracts (equivalent to 200 mg/kg of crude drug) were given to the treated animals (groups 4-8), respectively, by oral route for three weeks (days 1-20, o.d.), along with alloxan (100 mg/kg, i.v.) as a single dose on day 1. Group 9 was treated with metformin (200 mg/kg, p.o.) daily from day 1 to 20, along with a single dose of alloxan (100 mg/kg, i.v.) on day 1. The fasting serum glucose concentration in groups 3-9 increased significantly (p<0.05) on day 3, with a maximum increase (215.3 mg/dl) in the animals of the toxic control (TC) group (3) on day 21 of the experiment, as compared to the normal control (NC) group (1). The different doses (100, 150, 200 mg/kg, p.o.) of W. somnifera root powder (WS) decreased the fasting serum glucose concentration as compared to the toxic control group, with a maximum decrease (88.3 mg/dl) in group 2 (treated control) on day 21 of the experiment. Metformin (200 mg/kg, p.o.) (reference control) and the aqueous extract (AWS) and ethanol extract (EWS) of W. somnifera (equivalent to 100 mg/kg W. somnifera root, p.o.) antagonized the effects of alloxan as compared to the toxic control group. These results indicate that W. somnifera possesses significant anti-diabetic activity.

Keywords: diabetes, serum, glucose, blood, sugar, rabbits

Procedia PDF Downloads 561
16484 Stability of Hybrid Systems

Authors: Kreangkri Ratchagit

Abstract:

This paper is concerned with exponential stability of switched linear systems with interval time-varying delays. The time delay is any continuous function belonging to a given interval, in which the lower bound of delay is not restricted to zero. By constructing a suitable augmented Lyapunov-Krasovskii functional combined with Leibniz-Newton’s formula, a switching rule for the exponential stability of switched linear systems with interval time-varying delays and new delay-dependent sufficient conditions for the exponential stability of the systems are first established in terms of LMIs. Finally, some examples are exploited to illustrate the effectiveness of the proposed schemes.
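
For reference, a minimal sketch of the delay-free Lyapunov check that such exponential-stability conditions generalize: for a decay rate alpha, find P ≻ 0 satisfying (A + αI)ᵀP + P(A + αI) = −Q with Q ≻ 0. The paper's actual conditions use an augmented Lyapunov-Krasovskii functional and interval delay bounds expressed as LMIs, which are not reproduced here; the subsystem matrix below is illustrative only.

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov

A = np.array([[-2.0, 1.0],
              [0.0, -1.5]])          # one subsystem matrix, for illustration
alpha = 0.3                          # desired exponential decay rate
n = A.shape[0]

# Solve (A + alpha*I)^T P + P (A + alpha*I) = -Q with Q = I
A_shift = A + alpha * np.eye(n)
P = solve_continuous_lyapunov(A_shift.T, -np.eye(n))

eigs = np.linalg.eigvalsh(P)
print("P =", P)
print("exponentially stable with rate alpha:", bool(np.all(eigs > 0)))
```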

Keywords: exponential stability, hybrid systems, time-varying delays, Lyapunov-Krasovskii functional, Leibniz-Newton's formula

Procedia PDF Downloads 458
16483 Deep Learning Based Object-Classes Semantic Classification of Arabic Texts

Authors: Imen Elleuch, Wael Ouarda, Gargouri Bilel

Abstract:

We propose in this paper a deep learning based approach to classify text in order to enrich an Arabic ontology based on the object classes of Gaston Gross. Those object classes are defined by taking into account the syntactic and semantic features of the treated language. Our proposed approach is thus a hybrid one: on the one hand, it is based on the object classes, which represent a knowledge-based approach to text classification; on the other hand, it uses a deep learning, word-embedding-based approach to classify text. We have applied our proposed approach to a corpus constructed from an Arabic dictionary. The obtained semantic classification of text will enrich the Arabic object-classes ontology: new classes can be added to the ontology, or the features that characterize each object class can be expanded and updated. The obtained results are compared to a similar work that treats the same object with a classical linguistic approach to the semantic classification of text. This comparison highlights our proposed hybrid approach, which can be improved by broadening the dataset used in the deep learning process.

Keywords: deep-learning approach, object-classes, semantic classification, Arabic

Procedia PDF Downloads 88
16482 Oxidosqualene Cyclase: A Novel Inhibitor

Authors: Devadrita Dey Sarkar

Abstract:

Oxidosqualene cyclase is a membrane-bound enzyme that helps form the steroid scaffold in higher organisms. In a highly selective cyclization reaction, oxidosqualene cyclase forms lanosterol, with seven chiral centres, starting from the linear substrate 2,3-oxidosqualene. In human cholesterol biosynthesis, OSC represents a target for the discovery of novel anticholesteraemic drugs that could complement the widely used statins. The enzyme oxidosqualene:lanosterol cyclase (OSC) represents a novel target for the treatment of hypercholesterolemia. OSC catalyzes the cyclization of the linear 2,3-monoepoxysqualene to lanosterol, the initial four-ringed sterol intermediate in the cholesterol biosynthetic pathway. OSC also catalyzes the formation of 24(S),25-epoxycholesterol, a ligand activator of the liver X receptor. Inhibition of OSC reduces cholesterol biosynthesis and selectively enhances 24(S),25-epoxycholesterol synthesis. Through this dual mechanism, OSC inhibition decreases plasma levels of low-density lipoprotein (LDL) cholesterol and prevents cholesterol deposition within macrophages. The recent crystallization of OSC identifies the mechanism of action of this complex enzyme, setting the stage for the design of OSC inhibitors with improved pharmacological properties for cholesterol lowering and the treatment of atherosclerosis. While studying and designing inhibitors of oxidosqualene cyclase, I worked on the PDB entry 1W6K, the most extensively studied structure, and used several methods, techniques and software tools to identify and validate the top molecules that could act as inhibitors of oxidosqualene cyclase. By partial blockage of this enzyme, both an inhibition of lanosterol, and subsequently cholesterol, formation and a concomitant effect on HMG-CoA reductase can be achieved. These two effects complement each other and lead to effective control of cholesterol biosynthesis. It is therefore concluded that 2,3-oxidosqualene cyclase plays a crucial role in the regulation of intracellular cholesterol homeostasis, and that 2,3-oxidosqualene cyclase inhibitors offer an attractive approach for novel lipid-lowering agents.

Keywords: anticholesteraemic, crystallization, statins, homeostasis

Procedia PDF Downloads 351
16481 A Continuous Boundary Value Method of Order 8 for Solving the General Second Order Multipoint Boundary Value Problems

Authors: T. A. Biala

Abstract:

This paper deals with the numerical integration of general second-order multipoint boundary value problems. This has been achieved through the development of a continuous linear multistep method (LMM). The continuous LMM is used to construct a main discrete method together with some initial and final methods (also obtained from the continuous LMM), so that they form a discrete analogue of the continuous second-order boundary value problem. These methods are used as boundary value methods and adapted to cope with the integration of general second-order multipoint boundary value problems. The convergence, the use and the region of absolute stability of the methods are discussed. Several numerical examples are implemented to elucidate the solution process.
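
For comparison with the discrete analogue mentioned above, a minimal sketch of solving a linear second-order two-point boundary value problem with a standard second-order finite-difference scheme (this is not the continuous-LMM boundary value method developed in the paper; multipoint conditions would add extra coupling rows):

```python
import numpy as np

# Solve y'' + y = x on [0, 1] with y(0) = 0, y(1) = 1 (exact solution: y = x)
n = 50                        # interior nodes
h = 1.0 / (n + 1)
x = np.linspace(h, 1.0 - h, n)

# Central difference: (y_{i-1} - 2 y_i + y_{i+1}) / h^2 + y_i = x_i
main = np.full(n, -2.0 / h**2 + 1.0)
off = np.full(n - 1, 1.0 / h**2)
A = np.diag(main) + np.diag(off, 1) + np.diag(off, -1)
rhs = x.copy()
rhs[0]  -= 0.0 / h**2         # boundary contribution from y(0) = 0
rhs[-1] -= 1.0 / h**2         # boundary contribution from y(1) = 1

y = np.linalg.solve(A, rhs)
print(y[:3], y[-3:])          # should closely follow y = x
```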

Keywords: linear multistep methods, boundary value methods, second order multipoint boundary value problems, convergence

Procedia PDF Downloads 377
16480 Innovative Screening Tool Based on Physical Properties of Blood

Authors: Basant Singh Sikarwar, Mukesh Roy, Ayush Goyal, Priya Ranjan

Abstract:

This work combines two bodies of knowledge: the biomedical basis of blood stain formation and the fluid mechanics community's understanding that such stain formation depends heavily on physical properties. Moreover, biomedical research indicates that different patterns in blood stains are robust indicators of the blood donor's health or lack thereof. Based on these valuable insights, an innovative screening tool is proposed that can act as an aid in the diagnosis of diseases such as anemia, hyperlipidaemia, tuberculosis, blood cancer, leukemia, malaria, etc., with enhanced confidence in the proposed analysis. To realize this technique, simple, robust and low-cost micro-fluidic devices, a micro-capillary viscometer and a pendant drop tensiometer are designed, and their fabrication is proposed, to measure the viscosity, surface tension and wettability of various blood samples. Once prognosis and diagnosis data have been generated, automated linear and nonlinear classifiers are applied for the automated reasoning and presentation of results. A support vector machine (SVM) classifies data in a linear fashion, while discriminant analysis and nonlinear embeddings are coupled with nonlinear manifold detection in the data, and decisions are made accordingly. In this way, physical properties can be used, through linear and non-linear classification techniques, for the screening of various diseases in humans and cattle. Experiments are carried out to validate the physical property measurement devices. This framework can be further developed towards a real-life portable disease screening and diagnostics tool. Small-scale production of screening and diagnostic devices is proposed to carry out independent tests.

Keywords: blood, physical properties, diagnostic, nonlinear, classifier, device, surface tension, viscosity, wettability

Procedia PDF Downloads 376
16479 Effect of Masonry Infill in R.C. Framed Buildings

Authors: Pallab Das, Nabam Zomleen

Abstract:

Effective dissipation of the lateral loads arising from seismic forces determines the strength, durability and safety of a structure. Masonry infill has high stiffness and strength, which can be utilized effectively for lateral load dissipation by incorporating it into building construction; however, masonry behaves in a highly nonlinear manner, so it is important to find a generalized yet rational approach to determine its nonlinear behavior, its failure mode, and its response when it is incorporated into a building. Most countries, however, do not specify a procedure for the design of masonry infill walls, although many analytical modeling methods are available in the literature, e.g. the equivalent diagonal strut method and finite element modeling. In this paper, the masonry infill is modeled, and a 6-storey bare frame building and the same building with masonry infill are analyzed using SAP2000 (v14) in order to obtain the inter-storey drift by time-history analysis and the capacity curve by pushover analysis. The analysis shows that, while the structure is well within the CP performance level in both cases, there is a considerable reduction of about 28% in the inter-storey drift when the building is analyzed with masonry infill walls.
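
A minimal sketch of the equivalent diagonal strut idealization mentioned above, using the Mainstone-type width expression adopted in FEMA 356, a = 0.175 (λ₁ h_col)^(-0.4) r_inf; the input values are placeholders, and the study itself models the infilled frame directly in SAP2000:

```python
import math

def strut_width_fema356(E_me, t_inf, h_inf, L_inf, E_fe, I_col, h_col):
    """Equivalent diagonal strut width a = 0.175 (lambda1 * h_col)^-0.4 * r_inf
    (Mainstone-type expression as given in FEMA 356)."""
    theta = math.atan(h_inf / L_inf)                       # strut inclination
    lam1 = (E_me * t_inf * math.sin(2 * theta) /
            (4 * E_fe * I_col * h_inf)) ** 0.25            # relative stiffness, 1/m
    r_inf = math.hypot(h_inf, L_inf)                       # diagonal length
    return 0.175 * (lam1 * h_col) ** -0.4 * r_inf

# Placeholder values: 3 m x 4 m panel, 230 mm brick infill, RC columns
a = strut_width_fema356(E_me=4.0e9, t_inf=0.23, h_inf=3.0, L_inf=4.0,
                        E_fe=25.0e9, I_col=1.0e-3, h_col=3.3)
print(f"equivalent strut width ~ {a:.2f} m")
```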

Keywords: capacity curve, masonry infill, nonlinear analysis, time history analysis

Procedia PDF Downloads 383
16478 Review on Quaternion Gradient Operator with Marginal and Vector Approaches for Colour Edge Detection

Authors: Nadia Ben Youssef, Aicha Bouzid

Abstract:

Gradient estimation is one of the most fundamental tasks in the field of image processing in general, and particularly for color images, since research on color image gradients remains limited. The most widely used method is Di Zenzo's gradient operator, which is based on a measure of the squared local contrast of color images. The proposed gradient mechanism, presented in this paper, is based on the principle of Di Zenzo's approach using a quaternion representation. This edge detector is compared to a marginal approach based on the multiscale product of wavelet transforms and to another vector approach based on quaternion convolution and the vector gradient. The experimental results indicate that the proposed color gradient operator outperforms the marginal approach; however, it is less efficient than the second vector approach.
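
For reference, a minimal sketch of the baseline Di Zenzo colour gradient that the proposed quaternion operator builds on: per-channel derivatives are combined into a 2x2 structure tensor whose largest eigenvalue gives the squared local contrast (the quaternion and wavelet-based variants compared in the paper are not reproduced here):

```python
import numpy as np

def di_zenzo_gradient(img):
    """Di Zenzo gradient magnitude of a float RGB image of shape (H, W, 3)."""
    gxx = np.zeros(img.shape[:2])
    gyy = np.zeros(img.shape[:2])
    gxy = np.zeros(img.shape[:2])
    for c in range(img.shape[2]):
        dy, dx = np.gradient(img[..., c])
        gxx += dx * dx
        gyy += dy * dy
        gxy += dx * dy
    # Largest eigenvalue of the structure tensor [[gxx, gxy], [gxy, gyy]]
    lam_max = 0.5 * (gxx + gyy + np.sqrt((gxx - gyy) ** 2 + 4.0 * gxy ** 2))
    return np.sqrt(lam_max)

rng = np.random.default_rng(0)
rgb = rng.random((64, 64, 3))
print(di_zenzo_gradient(rgb).shape)
```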

Keywords: gradient, edge detection, color image, quaternion

Procedia PDF Downloads 234