Search results for: exponential time differencing method
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 32335

30955 Mentor and Mentee Based Learning

Authors: Erhan Eroğlu

Abstract:

This paper presents a new method called Mentor and Mentee Based Learning, which is becoming increasingly common, especially at workplaces. The study is significant because it clearly describes how the method works. Education has always aimed at equipping people with the necessary knowledge and information, and for many decades it relied on teachers' talk-and-chalk methods. In the second half of the twentieth century, educators felt the need for changes in delivery systems, and terms such as self-discovery, learner engagement, student-centered learning, and hands-on learning have been popular for a long time. However, some educators believe there is still much room for better learning methods in many fields, as learners still cannot fulfill their potential. Thus, new systems and methods continue to be developed and applied at education centers and workplaces. One of the latest is assigning mentors to newly recruited employees and training them within a mentor-and-mentee program, which allows both parties to see their strengths, their weaknesses, and the areas that can be improved. This paper aims at finding out the perceptions of mentors and mentees of the programs they are offered at their workplaces and suggests some improvements. The study was conducted via a qualitative method, whereby interviews were held with mentors and mentees both separately and together. Results show that the program is a great way to train inexperienced staff and also to refresh more experienced ones. Some points to be improved have also been identified. The paper shows that education is not a one-way path.

Keywords: learning, mentor, mentee, training

Procedia PDF Downloads 225
30954 A Simple Autonomous Hovering and Operating Control of Multicopter Using Only Web Camera

Authors: Kazuya Sato, Toru Kasahara, Junji Kuroda

Abstract:

In this paper, an autonomous hovering control method for a multicopter using only a Web camera is proposed. Recently, various autonomous flight control methods for multicopters have been proposed, but they often rely on a motion capture system (e.g., OptiTrack) or a laser range finder to measure the position and posture of the multicopter. To achieve autonomous flight control with simple equipment, we propose a method using an AR marker and a Web camera. The AR marker allows the position of the multicopter to be measured in three-dimensional Cartesian coordinates, and this position is then mapped to aileron, elevator, and throttle commands. A simple PID controller is applied to each channel, and the controller gains are tuned. Experimental results are given to show the effectiveness of the proposed method. Moreover, another simple operation method for autonomous multicopter flight control is also proposed.
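The scheme described above, one PID loop per axis driving the aileron, elevator, and throttle from the marker-derived position error, can be sketched as follows. This is an illustrative sketch, not the authors' implementation; the gains and sample time are hypothetical placeholders.

```python
# One PID controller per axis; position errors from the AR marker are mapped
# to aileron (x), elevator (y), and throttle (z) corrections.
class PID:
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, error):
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative


def hover_commands(target, measured, controllers):
    """Map x/y/z position errors to per-channel control corrections."""
    return {axis: controllers[axis].update(target[axis] - measured[axis])
            for axis in ("x", "y", "z")}


# Hypothetical gains and a single control step at 20 Hz:
dt = 0.05
controllers = {axis: PID(kp=0.8, ki=0.1, kd=0.3, dt=dt) for axis in ("x", "y", "z")}
cmd = hover_commands({"x": 0.0, "y": 0.0, "z": 1.0},
                     {"x": 0.1, "y": -0.2, "z": 0.9}, controllers)
```

In a real loop the gains would be tuned per channel, as the abstract describes.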

Keywords: autonomous hovering control, multicopter, Web camera, operation

Procedia PDF Downloads 553
30953 Development of Latent Fingerprints on Non-Porous Surfaces Recovered from Fresh and Sea Water

Authors: A. Somaya Madkour, B. Abeer sheta, C. Fatma Badr El Dine, D. Yasser Elwakeel, E. Nermine AbdAllah

Abstract:

Criminal offenders have a fundamental goal of not leaving any traces at the crime scene. Some may suppose that items recovered underwater have no forensic value and therefore try to destroy traces by throwing items into water, where they are subjected to destructive environmental effects. This can represent a challenge for forensic experts investigating finger marks. Accordingly, the present study was conducted to determine the optimal method for developing latent fingerprints on non-porous surfaces submerged in aquatic environments for different time intervals. The two factors analyzed were the nature of the aquatic environment and the length of submersion; in addition, the quality of the developed finger marks was assessed as a function of the method used. Latent fingerprints were deposited on metallic, plastic, and glass objects and submerged in fresh or sea water for one, two, and ten days. After recovery, the items were subjected to cyanoacrylate fuming, black powder, and small-particle-reagent processing, and the prints were examined. Each print was evaluated according to a fingerprint quality assessment scale. The study demonstrated that the duration of submersion affects the quality of finger marks: the longer the duration, the worse the quality. The best visualization results in both fresh and sea water were achieved using cyanoacrylate. The study also revealed that exposure to sea water had a more destructive influence on the quality of the detected finger marks.

Keywords: fingerprints, fresh water, sea, non-porous

Procedia PDF Downloads 449
30952 Retraction Free Motion Approach and Its Application in Automated Robotic Edge Finishing and Inspection Processes

Authors: M. Nemer, E. I. Konukseven

Abstract:

In this paper, a motion generation algorithm for a six-degrees-of-freedom (DoF) robotic arm in a static environment is presented. The method is developed to generate end-effector paths for edge finishing and inspection processes from the CAD model of the considered workpiece; moreover, the proposed algorithm may be extended to other similar manufacturing processes. A software package programmed in the application programming interface (API) of SolidWorks generates the tool path data for the robot. The proposed method significantly simplifies the problem, reducing the CPU time needed to generate the path and offering an efficient overall solution. The ABB IRB2000 robot is chosen to execute the generated tool path.

Keywords: CAD-based tools, edge deburring, edge scanning, offline programming, path generation

Procedia PDF Downloads 282
30951 Synthesis of 3,4-Dihydro-1H-Quinoxalin-2-Ones and 1H‑Quinolin-2-Ones and Evaluation of Their Anti-Bacterial Activity

Authors: Ali Amiri, Arash Esfandiari, Elham Zarenezhad

Abstract:

We report an efficient and rapid method for the preparation of 3,4-dihydro-1H-quinoxalin-2-ones and 1H‑quinolin-2-ones that involves grinding o-, m-, or p‑phenylenediamine with three dialkyl acetylenedicarboxylates using a pestle and mortar. This solvent-free approach requires only a few minutes of reaction time and, since neither catalyst nor solvent is used, is expected to be highly economical. All synthesised compounds were screened for antimicrobial activity against two Gram-negative bacteria (Pseudomonas aeruginosa PTCC 1077, Escherichia coli PTCC 1330) and two Gram-positive bacteria (Staphylococcus aureus PTCC 1133, Bacillus cereus PTCC 1015), and their activity was compared with gentamicin and ampicillin as reference drugs for Gram-negative and Gram-positive bacteria, respectively. The minimum inhibitory concentrations (MICs) of the synthesised compounds and reference drugs were determined by the microdilution method. Good antibacterial activity was observed for the 3,4-dihydro-1H-quinoxalin-2-ones against all the Gram-positive and Gram-negative species, and the 1H‑quinolin-2-ones showed good antibacterial activity against the two Gram-positive bacteria.

Keywords: quinolin, quinoxalin, anti-bacterial activity, minimum inhibitory concentration (MIC)

Procedia PDF Downloads 328
30950 Passenger Flow Characteristics of Seoul Metropolitan Subway Network

Authors: Kang Won Lee, Jung Won Lee

Abstract:

Characterizing network flow is of fundamental importance for understanding the complex dynamics of networks, and the passenger flow characteristics of a subway network are highly relevant to effective transportation management in large cities. In this study, the passenger flow of the Seoul metropolitan subway network is investigated and characterized through statistical analysis. The traditional betweenness centrality measure considers only the topological structure of the network and ignores transportation factors. This paper proposes a weighted betweenness centrality measure that incorporates monthly passenger flow volume. We apply the proposed measure to the Seoul metropolitan subway network of 493 stations and 16 lines and derive several interesting insights from the new measure. Using the Kolmogorov-Smirnov test, we also find that the monthly passenger flow between any two stations follows a power-law distribution, while other traffic characteristics, such as congestion level and through-flow traffic, follow an exponential distribution.
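The idea of a flow-weighted betweenness measure can be illustrated on a toy network: each origin-destination pair contributes its monthly passenger volume, split evenly over all shortest paths between the pair, to every intermediate station. The station layout and volumes below are invented for illustration and are not the authors' exact formulation.

```python
# Flow-weighted betweenness on a tiny unweighted graph (stdlib only).
from collections import deque


def all_shortest_paths(adj, s, t):
    """Enumerate all shortest s-t paths via BFS with predecessor tracking."""
    dist, preds = {s: 0}, {s: []}
    q = deque([s])
    while q:
        u = q.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                preds[v] = [u]
                q.append(v)
            elif dist[v] == dist[u] + 1:
                preds[v].append(u)

    def unwind(v):
        if v == s:
            return [[s]]
        return [p + [v] for u in preds[v] for p in unwind(u)]

    return unwind(t) if t in dist else []


def weighted_betweenness(adj, flows):
    """Each pair contributes its passenger volume, split over its shortest paths."""
    score = {v: 0.0 for v in adj}
    for (s, t), volume in flows.items():
        paths = all_shortest_paths(adj, s, t)
        for path in paths:
            for v in path[1:-1]:          # interior stations only
                score[v] += volume / len(paths)
    return score


adj = {"A": ["B"], "B": ["A", "C", "D"], "C": ["B", "E"],
       "D": ["B", "E"], "E": ["C", "D"]}
flows = {("A", "E"): 1000, ("A", "C"): 400}   # hypothetical monthly volumes
score = weighted_betweenness(adj, flows)
```

Station B lies on every listed journey and so accumulates the largest weighted score, even though its purely topological centrality would ignore the volumes.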

Keywords: betweenness centrality, correlation coefficient, power-law distribution, Korea traffic DB

Procedia PDF Downloads 285
30949 Aliasing Free and Additive Error in Spectra for Alpha Stable Signals

Authors: R. Sabre

Abstract:

This work focuses on continuous-time symmetric alpha-stable processes, frequently used to model signals with indefinitely growing variance that are often observed with an unknown additive error. The objective of this paper is to estimate this error from discrete observations of the signal. To that end, we propose a method based on smoothing the observations via a Jackson polynomial kernel, taking into account the width of the interval where the spectral density is non-zero. This technique avoids the aliasing phenomenon encountered when the estimation is made from discrete observations of a continuous-time process. We study the convergence rate of the estimator and show that it improves when the spectral density is zero at the origin. We thus obtain an estimator of the additive error that can be subtracted to approach the original signal without error.

Keywords: spectral density, stable processes, aliasing, non parametric

Procedia PDF Downloads 124
30948 Application of Double Side Approach Method on Super Elliptical Winkler Plate

Authors: Hsiang-Wen Tang, Cheng-Ying Lo

Abstract:

In this study, the static behavior of a super elliptical Winkler plate is analyzed by applying the double side approach method. The lack of information about super elliptical Winkler plates is the motivation for this study, and we use the double side approach method because of its superior ability to treat problems with complex boundary shapes efficiently. The method has the advantages of high accuracy, an easy calculation procedure, and a light computational load; most importantly, it gives an error bound on the approximate solution. The numerical results not only show that the double side approach method works well on this problem but also provide knowledge of the static behavior of super elliptical Winkler plates in practical use.

Keywords: super elliptical Winkler plate, double side approach method, error bound, mechanics

Procedia PDF Downloads 347
30947 Comparison Between a Droplet Digital PCR and Real Time PCR Method in Quantification of HBV DNA

Authors: Surangrat Srisurapanon, Chatchawal Wongjitrat, Navin Horthongkham, Ruengpung Sutthent

Abstract:

HBV infection poses a potentially serious public health problem, and the ability to measure HBV DNA concentration is important and continuously being improved. In quantitative polymerase chain reaction (qPCR), several factors that should be standardized, such as the source material, the calibration standard curve, and PCR efficiency, are inconsistent. Digital PCR (dPCR) is an alternative PCR-based technique for absolute quantification that uses Poisson statistics and does not require a standard curve. The aim of this study is therefore to compare the HBV DNA measurements generated by the dPCR and qPCR methods. All samples were quantified by Abbott's real-time PCR, and 54 samples with 2-6 log10 HBV DNA were selected for comparison with dPCR. Of these 54 samples, two outliers were defined as negative by dPCR, whereas 52 samples were positive by both tests. The difference between the two assays was less than 0.25 log IU/mL in 24/52 (46%) of paired samples, less than 0.5 log IU/mL in 46/52 (88%), and less than 1 log in 50/52 (96%). The correlation coefficient was r = 0.788 (P < 0.0001). Compared with qPCR, dPCR tended to overestimate samples with low HBV DNA concentrations and underestimate samples with high viral load. This variation might be due to pre-amplification bias of the template. A minor drawback of dPCR is the larger quantity of DNA required compared with qPCR. Since the technology is relatively new, the limitations of this assay are expected to be addressed over time.
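The absolute quantification that dPCR relies on can be sketched in a few lines: the fraction of positive partitions p gives the mean copies per partition lambda = -ln(1 - p) by Poisson statistics, with no standard curve. The partition count and partition volume below are hypothetical values, chosen only to illustrate the calculation.

```python
# Poisson-based absolute quantification behind digital PCR.
import math


def dpcr_copies_per_ul(positive, total, partition_volume_nl=0.85):
    p = positive / total           # fraction of positive partitions
    lam = -math.log(1.0 - p)       # mean copies per partition (Poisson)
    copies_per_nl = lam / partition_volume_nl
    return copies_per_nl * 1000.0  # copies per microliter


# Hypothetical run: 4,000 positive partitions out of 20,000.
conc = dpcr_copies_per_ul(positive=4000, total=20000)
```

Because only a yes/no call per partition is needed, the result is independent of PCR efficiency and calibration material, which is exactly the advantage the abstract contrasts with qPCR.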

Keywords: hepatitis B virus, real time PCR, digital PCR, DNA quantification

Procedia PDF Downloads 477
30946 Study of Hot Press Molding Method of Biodegradable Composite, Polypropylene Reinforced Coconut Coir

Authors: Herman Ruswan Suwarman, Ahmad Rivai, Mochamad Saidiman, Kuncoro Diharjo, Dody Ariawan

Abstract:

The use of biodegradable composites to solve ecological and environmental problems has recently become a trend. With their increasing use comes an increasing need to fabricate them properly, yet this remains a challenge for the design engineer. Therefore, this study explores how to combine coconut coir as a reinforcing material with polypropylene (PP) as a biodegradable polymer matrix. Using hot press molding, two methods were developed and compared. The difference between them lies not only in the fabrication steps but also in the raw material: the first method uses a PP sheet, while the second uses PP pellets directly. Based on the results, it can be concluded that PP pellets yield better results: the composite was produced in a shorter time, with evenly distributed coconut coir and fewer voids.

Keywords: biodegradable, coconut coir, hot press molding, polypropylene

Procedia PDF Downloads 139
30945 Metrology-Inspired Methods to Assess the Biases of Artificial Intelligence Systems

Authors: Belkacem Laimouche

Abstract:

With the field of artificial intelligence (AI) experiencing exponential growth, fueled by technological advancements that pave the way for increasingly innovative and promising applications, there is an escalating need to develop rigorous methods for assessing their performance in pursuit of transparency and equity. This article proposes a metrology-inspired statistical framework for evaluating bias and explainability in AI systems. Drawing from the principles of metrology, we propose a pioneering approach, using a concrete example, to evaluate the accuracy and precision of AI models, as well as to quantify the sources of measurement uncertainty that can lead to bias in their predictions. Furthermore, we explore a statistical approach for evaluating the explainability of AI systems based on their ability to provide interpretable and transparent explanations of their predictions.

Keywords: artificial intelligence, metrology, measurement uncertainty, prediction error, bias, machine learning algorithms, probabilistic models, interlaboratory comparison, data analysis, data reliability, measurement of bias impact on predictions, improvement of model accuracy and reliability

Procedia PDF Downloads 101
30944 Identifying Unknown Dynamic Forces Applied on Two Dimensional Frames

Authors: H. Katkhuda

Abstract:

A time-domain approach is used in this paper to identify unknown dynamic forces applied to two-dimensional frames, using measured dynamic structural responses of a sub-structure within the frame. The method requires only a sub-structure finite element model and short-duration measurements from three or four accelerometers, and an iterative least-squares algorithm is used to identify the unknown dynamic force applied to the structure. The validity of the method is demonstrated with numerical examples using noise-free and noise-contaminated structural responses, for both harmonic and impulsive forces. The results show that the proposed approach can identify unknown dynamic forces within very few iterations with high accuracy, and it remains robust even when noise-polluted dynamic response measurements are used.
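The core of time-domain force identification can be sketched on a toy linear system: the measured response is modeled as a convolution of the unknown force with the structure's impulse response, and the force is recovered by least squares. This hypothetical sketch solves the linear problem in one shot rather than with the paper's iterative sub-structure algorithm, and the impulse response is invented.

```python
# Recover an impulsive force from a noise-contaminated response by least squares.
import numpy as np

rng = np.random.default_rng(0)
n = 50
t = np.arange(n)
h = np.exp(-0.2 * t) * np.cos(0.8 * t)   # assumed-known impulse response

# Convolution (Toeplitz) matrix: response = H @ force
H = np.zeros((n, n))
for i in range(n):
    H[i, : i + 1] = h[i::-1]

true_force = np.zeros(n)
true_force[5] = 1.0                                         # impulse at step 5
response = H @ true_force + 1e-4 * rng.standard_normal(n)   # noisy measurement

identified, *_ = np.linalg.lstsq(H, response, rcond=None)
```

The identified force history peaks at the correct time step, with the noise appearing only as a small residual on the other samples.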

Keywords: dynamic force identification, dynamic responses, sub-structure, time domain

Procedia PDF Downloads 351
30943 IoT Device Cost-Effective Storage Architecture and Real-Time Data Analysis/Data Privacy Framework

Authors: Femi Elegbeleye, Omobayo Esan, Muienge Mbodila, Patrick Bowe

Abstract:

This paper focuses on a cost-effective storage architecture using a fog and cloud data storage gateway and presents the design of a data privacy model and a framework for real-time data analytics using machine learning. The paper begins with the system analysis, the system architecture and its component design, and the overall system operations. The results on the data privacy model show that when two or more privacy models are combined, the data enjoys stronger privacy. The fog storage gateway also has several advantages over traditional cloud storage: our results show reduced latency, lower bandwidth consumption, and lower energy usage compared with cloud storage, so fog storage helps to reduce excessive cost. The paper describes the system in detail, focusing on the research design and the frameworks for the data privacy model, data storage, and real-time analytics. It also presents the major system components and their framework specifications. Lastly, the overall system architecture, its structure, and its interrelationships are shown.

Keywords: IoT, fog, cloud, data analysis, data privacy

Procedia PDF Downloads 91
30942 Investigation on Performance of Change Point Algorithm in Time Series Dynamical Regimes and Effect of Data Characteristics

Authors: Farhad Asadi, Mohammad Javad Mollakazemi

Abstract:

In this paper, Bayesian online inference for time-series models is constructed with a change-point algorithm, which separates the observed time series into independent segments and studies changes of regime through the related statistical characteristics. Variation in the statistical characteristics of time-series data often reflects distinct phenomena in a dynamical system, such as a change of brain state reflected in EEG measurements, or a change in an important regime of the data in many other dynamical systems. A prediction algorithm for locating change points in time-series data is simulated. It is verified that the distribution pattern of the data is an important factor both for a simpler and smoother fluctuation of the hazard-rate parameter and for better identification of change-point locations. Finally, the ways in which the time-series distribution affects the factors in this approach are explained and validated on different time-series databases from several dynamical systems.

Keywords: time series, fluctuation in statistical characteristics, optimal learning, change-point algorithm

Procedia PDF Downloads 418
30941 Barriers towards Effective Participation in Physically Oriented Leisure Time Activities: A Case Study of Federal College of Education, Pankshin Plateau State, Nigeria

Authors: Mulak Moses Yokdi

Abstract:

The correct use of leisure time has suffered neglect in our society, and people often think that this trend does not matter. The researcher felt concerned about the issue and investigated it using the workers of FCE, Pankshin as a case study. Four hypotheses were used, considering variables such as leadership, traditional activities, stress due to work pressure, and time constraint. The participants were one hundred and ten members of FCE, Pankshin staff, and a self-developed questionnaire was the instrument used. Chi-square (χ²) was employed to test the hypotheses at P = 0.05, df = 3, and percentages were used to describe the situation indicated by the data. The results showed that all four hypotheses were significant (P = 0.05). It was concluded that the four variables were impediments to effective participation in physically oriented leisure-time activities among the FCE staff. Based on the findings, it was recommended that the FCE should provide good leadership and create awareness so that people understand why they should participate effectively in physically oriented leisure-time activities.
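A chi-square goodness-of-fit test at df = 3 like the one described compares the computed statistic with the critical value 7.815 (alpha = 0.05). A minimal sketch with hypothetical observed counts, not the study's actual data:

```python
# Chi-square statistic against a uniform expectation, df = 3 (four categories).
def chi_square_statistic(observed, expected):
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected))


observed = [45, 30, 20, 15]        # hypothetical responses over four categories
total = sum(observed)
expected = [total / 4.0] * 4       # uniform expectation under the null hypothesis

chi2 = chi_square_statistic(observed, expected)
CRITICAL_0_05_DF3 = 7.815          # chi-square critical value, alpha = 0.05, df = 3
significant = chi2 > CRITICAL_0_05_DF3
```

When the computed statistic exceeds the critical value, the null hypothesis is rejected at the 0.05 level, which is the decision rule the study applies to each of its four hypotheses.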

Keywords: barriers, effective participation, leisure time, physically oriented, work pressure, time constraint

Procedia PDF Downloads 359
30940 Study the Dynamic Behavior of Irregular Buildings by the Analysis Method Accelerogram

Authors: Beciri Mohamed Walid

Abstract:

Architectural requirements often impose shapes that lead to an irregular distribution of masses, rigidities, and resistances. The main object of the present study is to estimate the influence of irregularity, both in plan and in elevation, on the dynamic characteristics and behavior of such structures. To do this, the two dynamic methods proposed by RPA99 (the modal spectral method and the accelerogram analysis method) are applied to similar prototypes, the parameters measuring the response of these structures are analyzed, and the results are compared.

Keywords: structure, irregular, code, seismic, method, force, period

Procedia PDF Downloads 304
30939 Automatic and High Precise Modeling for System Optimization

Authors: Stephanie Chen, Mitja Echim, Christof Büskens

Abstract:

To describe and predict the behavior of a system, mathematical models are formulated, and parameter identification is used to adapt the coefficients of the underlying laws of science. For complex systems this approach can be incomplete, and hence imprecise, and moreover too slow to compute efficiently. Such models may therefore be inapplicable to the numerical optimization of real systems, since optimization techniques require numerous evaluations of the model. Moreover, not all quantities necessary for the identification may be available, so the model must be adapted manually. This paper therefore describes an approach that generates models which overcome these limitations by focusing not on physical laws but on measured (sensor) data of real systems. The approach is more general, since it generates models for any system, detached from the scientific background, and it can automatically identify correlations in the data. The method can be classified as a multivariate data regression analysis. In contrast to many other data regression methods, this variant is able to identify correlations of products of variables, not only of single variables, which enables a far more precise representation of causal correlations. The basis and justification of this method come from an analytical background: the series expansion. Another advantage of this technique is the possibility of adapting the generated models in real time during operation. In this way, system changes due to aging, wear, or perturbations from the environment can be taken into account, which is indispensable for realistic scenarios. Since these data-driven models can be evaluated very efficiently and with high precision, they can be used in mathematical optimization algorithms that minimize a cost function, e.g. time, energy consumption, operational costs, or a mixture of them, subject to additional constraints. The proposed method has been tested successfully in several complex applications with strong industrial requirements; the generated models simulated the given systems with a precision error of less than one percent, and the automatic identification of correlations discovered previously unknown relationships. In summary, the approach efficiently computes highly precise, real-time-adaptive, data-based models in different fields of industry. Combined with an effective mathematical optimization algorithm such as WORHP (We Optimize Really Huge Problems), complex systems can now be represented by a high-precision model and optimized according to the user's wishes. The proposed methods will be illustrated with different examples.
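The distinctive feature described above, regression over products of variables as well as single variables, can be sketched in a few lines. The synthetic data and the underlying law (y = 2*x1*x2 + 0.5*x1) are invented for illustration; the authors' method is more general.

```python
# Multivariate regression whose feature set includes a product term x1*x2,
# i.e. a truncated multivariate series expansion fitted by least squares.
import numpy as np

rng = np.random.default_rng(1)
x1 = rng.uniform(-1, 1, 200)
x2 = rng.uniform(-1, 1, 200)
y = 2.0 * x1 * x2 + 0.5 * x1 + 0.01 * rng.standard_normal(200)  # sensor data

# Design matrix: constant, single variables, and their product.
X = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)   # [const, x1, x2, x1*x2]
```

A regression restricted to single variables would miss the dominant x1*x2 term entirely; including product features lets the fit recover it, which is the point the abstract makes about causal correlations.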

Keywords: adaptive modeling, automatic identification of correlations, data based modeling, optimization

Procedia PDF Downloads 400
30938 Solvent Extraction, Spectrophotometric Determination of Antimony(III) from Real Samples and Synthetic Mixtures Using O-Methylphenyl Thiourea as a Sensitive Reagent

Authors: Shashikant R. Kuchekar, Shivaji D. Pulate, Vishwas B. Gaikwad

Abstract:

A simple and selective method is developed for the solvent-extraction spectrophotometric determination of antimony(III) using O-methylphenyl thiourea (OMPT) as a sensitive chromogenic chelating agent. The proposed method is based on the formation of an antimony(III)-OMPT complex, which was extracted with 0.0025 M OMPT in chloroform from an aqueous solution of antimony(III) in 1.0 M perchloric acid. The absorbance of this complex was measured at 297 nm against a reagent blank. Beer's law was obeyed up to 15 µg mL-1 of antimony(III). The molar absorptivity and Sandell's sensitivity of the antimony(III)-OMPT complex in chloroform are 16.673 × 10³ L mol-1 cm-1 and 0.00730282 µg cm-2, respectively. The stoichiometry of the complex, established by the slope-ratio, mole-ratio, and Job's continuous-variation methods, was 1:2, and the complex was stable for more than 48 h. The interfering effect of various foreign ions was studied, and suitable masking agents were used where necessary to enhance the selectivity of the method. The proposed method was successfully applied to the determination of antimony(III) in real alloy samples and synthetic mixtures. The repeatability of the method was checked via the relative standard deviation (RSD) of 10 determinations, which was 0.42%.
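The quoted Sandell's sensitivity can be checked directly from the molar absorptivity: for an absorbance of 0.001 over a 1 cm path, Beer's law gives an analyte mass per unit area equal to the atomic weight divided by the molar absorptivity, in µg cm⁻².

```python
# Sandell's sensitivity from Beer's law: A = epsilon * c * l, with A = 0.001.
ATOMIC_WEIGHT_SB = 121.76            # g/mol, antimony
epsilon = 16.673e3                   # L mol^-1 cm^-1, from the abstract

sandell = ATOMIC_WEIGHT_SB / epsilon  # µg cm^-2
```

The result reproduces the 0.00730282 µg cm⁻² figure quoted in the abstract, confirming the two reported constants are mutually consistent.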

Keywords: solvent extraction, antimony, spectrophotometry, real sample analysis

Procedia PDF Downloads 329
30937 Determination of MDA by HPLC in Blood of Levofloxacin Treated Rats

Authors: D. S. Mohale, A. P. Dewani, A. S. Tripathi, A. V. Chandewar

Abstract:

The present work demonstrates the applicability of high-performance liquid chromatography (HPLC) with UV-Vis detection for the in-vivo quantification of malondialdehyde, as the malondialdehyde-thiobarbituric acid (MDA-TBA) complex, in rats. The HPLC separation of MDA-TBA was achieved in isocratic mode on a reverse-phase C18 column (250 mm × 4.6 mm) at a flow rate of 1.0 mL min−1, followed by detection at 532 nm. The chromatographic conditions were optimized by varying the concentration and pH of the aqueous phase, followed by changes in the percentage of organic phase; the optimal mobile phase consisted of water (0.2% triethylamine, pH adjusted to 2.3 with ortho-phosphoric acid) and acetonitrile in the ratio 80:20 (v/v). The retention time of the MDA-TBA complex was 3.7 min. The developed method was sensitive: the limits of detection and quantification (LOD and LOQ), calculated from the standard deviation of the response and the slope of the calibration curve, were 110 ng/mL and 363 ng/mL, respectively. Calibration studies were done by spiking MDA into rat plasma at concentrations ranging from 500 to 1000 ng/mL. The precision of the method, measured as relative standard deviation in intra-day and inter-day studies, was 1.6-5.0% and 1.9-3.6%, respectively. The HPLC method was applied for monitoring MDA levels in rats subjected to chronic treatment with levofloxacin (LEV) (5 mg/kg/day) for 21 days, and the results were compared with findings in control-group rats. The mean peak areas of both study groups were compared with an unpaired Student's t-test. The p-value was < 0.001, indicating a significant result and suggesting increased MDA levels in rats subjected to 21 days of chronic LEV treatment.
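LOD and LOQ values derived from a calibration curve conventionally follow LOD = 3.3σ/S and LOQ = 10σ/S, where σ is the standard deviation of the response and S the slope. The σ and S values below are hypothetical, chosen only to illustrate the calculation, not the study's actual figures.

```python
# Standard calibration-curve-based detection and quantification limits.
def lod_loq(sigma, slope):
    lod = 3.3 * sigma / slope   # limit of detection
    loq = 10.0 * sigma / slope  # limit of quantification
    return lod, loq


lod, loq = lod_loq(sigma=0.5, slope=0.015)   # hypothetical values, arbitrary units
```

Note that these formulas fix the LOQ/LOD ratio at about 3.0, so reported pairs that deviate from this ratio imply the two limits were estimated by different procedures.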

Keywords: malondialdehyde-thiobarbituric acid complex, levofloxacin, HPLC, oxidative stress

Procedia PDF Downloads 328
30936 Cognitive Behaviour Drama: Playful Method to Address Fears in Children on the Higher-End of the Autism Spectrum

Authors: H. Karnezi, K. Tierney

Abstract:

Childhood fears that persist over time and interfere with children's normal functioning may have detrimental effects on their social and emotional development. Cognitive behavior therapy (CBT) is considered highly effective in treating fears and anxieties. However, given that many childhood fears are based on fantasy, the applicability of CBT may be hindered by cognitive immaturity; a lack of motivation to engage in therapy is another commonly encountered obstacle. The purpose of this study was to introduce and evaluate a more developmentally appropriate intervention model, specifically designed to give phobic children the motivation to overcome their fears. To this end, principles and techniques from cognitive and behavior therapies are incorporated into the 'Drama in Education' model. The Cognitive Behaviour Drama (CBD) method uses the phobic children's creativity to involve them in the therapeutic process: the children are invited to engage in exciting fictional scenarios tailored around their strengths and special interests. Once their commitment to the drama is established, a problem that they feel motivated to solve is introduced; to resolve it, the children must overcome a number of obstacles, culminating in an in-vivo confrontation with the fear stimulus. The study examined the application of the CBD model in three single cases, and the results in all three showed complete elimination of all fear-related symptoms. These preliminary results justify further evaluation of the Cognitive Behaviour Drama model, which is time- and cost-effective and ensures the client's immediate engagement in the therapeutic process.

Keywords: phobias, autism, intervention, drama

Procedia PDF Downloads 121
30935 A Fuzzy Satisfactory Optimization Method Based on Stress Analysis for a Hybrid Composite Flywheel

Authors: Liping Yang, Curran Crawford, Jr. Ren, Zhengyi Ren

Abstract:

Considering cost evaluation and stress analysis, a fuzzy satisfactory optimization (FSO) method has been developed for a hybrid composite flywheel. To evaluate the cost, the cost coefficients of the flywheel components are obtained by calculating the weighted sum of scores for material manufacturability, structural character, and material price. To express the satisfactory degree of the energy, the cost, and the mass, satisfactory functions are proposed by using a decline function and introducing a satisfactory coefficient. To reflect the different significance of the objectives, object weight coefficients are defined. Based on the stress analysis of the composite material, the circumferential and radial stresses are incorporated into the optimization formulation. Simulations of the FSO method with different weight coefficients are contrasted with the storage-energy-density optimization (SEDO) method for a flywheel. The analysis results show that the FSO method can satisfy different designer requirements, and with suitable weight coefficients it can replace the SEDO method.
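The weighted aggregation of per-objective satisfactory degrees can be sketched as follows. The linear decline function, the bounds, and the weights here are hypothetical choices for illustration, not the authors' exact formulation.

```python
# Weighted fuzzy-satisfaction aggregation over energy, cost, and mass.
def satisfaction(value, best, worst):
    """Linear decline function: 1 at the best value, 0 at the worst."""
    s = (value - worst) / (best - worst)
    return max(0.0, min(1.0, s))


def overall_satisfaction(values, bounds, weights):
    """Normalized weighted sum of per-objective satisfactory degrees."""
    total = sum(weights.values())
    return sum(weights[k] * satisfaction(values[k], *bounds[k])
               for k in values) / total


# Energy is maximized (best = high); cost and mass are minimized (best = low):
bounds = {"energy": (50.0, 20.0), "cost": (100.0, 400.0), "mass": (5.0, 15.0)}
weights = {"energy": 0.5, "cost": 0.3, "mass": 0.2}   # objective significance
design = {"energy": 40.0, "cost": 220.0, "mass": 8.0}

score = overall_satisfaction(design, bounds, weights)
```

Changing the weight coefficients shifts the trade-off between stored energy, cost, and mass, which is exactly the knob the abstract contrasts against the single-objective SEDO formulation.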

Keywords: flywheel energy storage, fuzzy, optimization, stress analysis

Procedia PDF Downloads 338
30934 The Integrated Methodological Development of Reliability, Risk and Condition-Based Maintenance in the Improvement of the Thermal Power Plant Availability

Authors: Henry Pariaman, Iwa Garniwa, Isti Surjandari, Bambang Sugiarto

Abstract:

Availability of a complex system such as a thermal power plant is strongly influenced by the reliability of spare parts and by maintenance management policies. The reliability-centered maintenance (RCM) technique is an established method of analysis and the main reference for maintenance planning. This method considers the consequences of failure in its implementation, but does not address the further risk of downtime associated with failures, loss of production, or high maintenance costs. The risk-based maintenance (RBM) technique provides support strategies to minimize the risks posed by failure, selecting maintenance tasks for cost effectiveness. Meanwhile, condition-based maintenance (CBM) focuses on condition monitoring, which allows maintenance or other action to be planned and scheduled so that the risk of failure is avoided before time-based maintenance would intervene. RCM, RBM, and CBM are applied in thermal power plants individually or in combinations such as RCM with RBM or RCM with CBM. Implementing the three techniques in an integrated maintenance approach will increase the availability of thermal power plants compared with using the techniques individually or in pairs. This study therefore uses reliability-, risk-, and condition-based maintenance in an integrated manner to increase the availability of thermal power plants. The method generates an MPI (Priority Maintenance Index), obtained by multiplying the RPN (Risk Priority Number) by an RI (Risk Index), and an FDT (Failure Defense Task), which can generate condition monitoring and assessment tasks in addition to maintenance tasks. Both MPI and FDT, obtained from the development of a functional tree, failure mode and effects analysis, fault-tree analysis, and risk analysis (risk assessment and risk evaluation), were then used to develop and implement a maintenance, monitoring, and condition-assessment plan and schedule, and ultimately to perform the availability analysis.
The results of this study indicate that reliability-, risk-, and condition-based maintenance methods, applied in an integrated manner, can increase the availability of thermal power plants.
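The MPI ranking described above can be sketched as follows. The abstract defines MPI as RPN multiplied by RI; the sketch additionally assumes the conventional FMEA definition RPN = severity × occurrence × detection, and all component names and scores are hypothetical placeholders.

```python
# Minimal sketch of the maintenance-priority ranking. Assumes the
# conventional FMEA form RPN = severity * occurrence * detection;
# component names and all scores are hypothetical.

def rpn(severity, occurrence, detection):
    """Risk Priority Number from FMEA scores (each typically 1-10)."""
    return severity * occurrence * detection

def mpi(rpn_value, risk_index):
    """Priority Maintenance Index: RPN weighted by the Risk Index."""
    return rpn_value * risk_index

# severity, occurrence, detection, RI (all made up for illustration)
components = {
    "boiler feed pump":  (8, 4, 3, 1.5),
    "condenser":         (6, 3, 5, 1.2),
    "cooling tower fan": (4, 5, 4, 0.8),
}

# Rank components by MPI, highest maintenance priority first.
ranking = sorted(
    ((name, mpi(rpn(s, o, d), ri)) for name, (s, o, d, ri) in components.items()),
    key=lambda item: item[1],
    reverse=True,
)
```

In the integrated scheme, the top-ranked items would receive maintenance tasks, while the FDT analysis adds monitoring and condition-assessment tasks for failure modes that maintenance alone cannot defend against.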

Keywords: integrated maintenance techniques, availability, thermal power plant, MPI, FDT

Procedia PDF Downloads 787
30933 A Dynamical Study of Fractional Order Obesity Model by a Combined Legendre Wavelet Method

Authors: Hakiki Kheira, Belhamiti Omar

Abstract:

In this paper, we propose a new compartmental fractional-order model for the simulation of epidemic obesity dynamics. Using the Legendre wavelet method combined with a decoupling and quasi-linearization technique, we demonstrate the validity and applicability of the model, and we present several illustrative fractional differential examples that show the applicability and efficiency of the method. The fractional derivative is described in the Caputo sense.
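For readers unfamiliar with the Caputo derivative mentioned above, the sketch below approximates it numerically with the standard L1 finite-difference scheme; this is a generic illustration of the operator, not the paper's Legendre-wavelet discretization.

```python
import math

# Sketch: Caputo fractional derivative of order 0 < alpha < 1 via the
# standard L1 scheme on a uniform grid (not the paper's wavelet method).

def caputo_l1(f_values, dt, alpha):
    """Approximate the Caputo derivative of a sampled function at the
    final grid point t_n = n*dt using the L1 scheme."""
    n = len(f_values) - 1
    coeff = dt ** (-alpha) / math.gamma(2.0 - alpha)
    total = 0.0
    for k in range(n):
        # L1 weights b_k = (k+1)^(1-alpha) - k^(1-alpha)
        b_k = (k + 1) ** (1 - alpha) - k ** (1 - alpha)
        total += b_k * (f_values[n - k] - f_values[n - k - 1])
    return coeff * total

# Sanity check: for f(t) = t, the Caputo derivative of order alpha is
# t^(1-alpha) / Gamma(2 - alpha), and the L1 scheme is exact.
alpha, dt, n = 0.5, 0.01, 100
approx = caputo_l1([k * dt for k in range(n + 1)], dt, alpha)
exact = (n * dt) ** (1 - alpha) / math.gamma(2 - alpha)
```

The L1 weights make the scheme exact for linear functions, which is why the sanity check matches the closed form.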

Keywords: Caputo derivative, epidemiology, Legendre wavelet method, obesity

Procedia PDF Downloads 413
30932 Microwave Assisted Growth of Varied Phases and Morphologies of Vanadium Oxides Nanostructures: Structural and Optoelectronic Properties

Authors: Issam Derkaoui, Mohammed Khenfouch, Bakang M. Mothudi, Malik Maaza, Izeddine Zorkani, Anouar Jorio

Abstract:

Transition metal oxide nanoparticles with different morphologies have recently attracted much attention owing to their distinctive geometries and promising electrical properties for various applications. In this paper, we discuss the effects of reaction time and annealing on the structural and electrical properties of vanadium oxide nanoparticles (VO-NPs) prepared by a microwave method. To this end, the samples were characterized by transmission electron microscopy (TEM), X-ray diffraction (XRD), Raman spectroscopy, and ultraviolet-visible absorbance spectroscopy (UV-Vis), and their electrical conductivity was investigated. The annealing state and the reaction time prove to be two crucial parameters for improving the optoelectronic properties. These nanostructures offer a promising route toward technological applications, especially energy storage devices.

Keywords: vanadium oxide, microwave, electrical conductivity, optoelectronic properties

Procedia PDF Downloads 187
30931 Exploiting the Potential of Fabric Phase Sorptive Extraction for Forensic Food Safety: Analysis of Food Samples in Cases of Drug Facilitated Crimes

Authors: Bharti Jain, Rajeev Jain, Abuzar Kabir, Torki Zughaibi, Shweta Sharma

Abstract:

Drug-facilitated crimes (DFCs) entail the use of a single drug or a mixture of drugs to incapacitate a victim. Traditionally, biological samples are gathered from victims and analyzed to establish evidence of drug administration. Nevertheless, the rapid metabolism of various drugs and delays in analysis can impede the identification of such substances. To address this, the present article describes a rapid, sustainable, highly efficient, and miniaturized protocol for the identification and quantification of three sedative-hypnotic drugs, namely diazepam (DZ), chlordiazepoxide (CDP), and ketamine (KET), in alcoholic beverages and complex food samples (cream biscuits, flavored milk, juice, cake, tea, sweets, and chocolate). The methodology uses fabric phase sorptive extraction (FPSE) to extract the three drugs, and the extracts are then analyzed by gas chromatography-mass spectrometry (GC-MS). Several parameters, including the type of membrane, pH, agitation time and speed, ionic strength, sample volume, elution volume and time, and type of elution solvent, were screened and thoroughly optimized. Among all evaluated membranes, sol-gel Carbowax 20M (CW-20M) demonstrated the most effective extraction of the target analytes. Under optimal conditions, the method was linear within the range of 0.3–10 µg mL⁻¹ (or µg g⁻¹), with coefficients of determination (R²) ranging from 0.996 to 0.999. The limits of detection (LODs) and limits of quantification (LOQs) for liquid samples were 0.020–0.069 µg mL⁻¹ and 0.066–0.22 µg mL⁻¹, respectively. Correspondingly, the LODs for solid samples ranged from 0.056 to 0.090 µg g⁻¹ and the LOQs from 0.18 to 0.29 µg g⁻¹. Notably, the method showed good precision, with repeatability below 5% and reproducibility below 10%.
Furthermore, the FPSE-GC-MS method proved effective in determining DZ in forensic food samples connected to drug-facilitated crimes. The whiteness of the proposed method was also evaluated using the RGB12 algorithm.
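As a hedged aside on how LOD/LOQ figures like those above are commonly obtained, the sketch below applies the widely used ICH convention LOD = 3.3σ/S and LOQ = 10σ/S, where σ is the residual standard deviation of the calibration regression and S its slope. The calibration data are hypothetical; the abstract does not state which estimation convention the authors used.

```python
# Sketch of LOD/LOQ estimation from a calibration curve using the
# common ICH convention; the data points are hypothetical.

def fit_line(x, y):
    """Ordinary least-squares slope and intercept."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
             / sum((xi - mx) ** 2 for xi in x))
    return slope, my - slope * mx

def lod_loq(conc, response):
    slope, intercept = fit_line(conc, response)
    residuals = [yi - (slope * xi + intercept) for xi, yi in zip(conc, response)]
    # Residual standard deviation of the regression (n - 2 dof).
    sigma = (sum(r * r for r in residuals) / (len(conc) - 2)) ** 0.5
    return 3.3 * sigma / slope, 10.0 * sigma / slope

# Hypothetical calibration: concentration (ug/mL) vs. GC-MS peak area.
conc = [0.3, 1.0, 2.5, 5.0, 10.0]
area = [310, 1020, 2480, 5050, 9980]
lod, loq = lod_loq(conc, area)
```

By construction LOQ/LOD is fixed at 10/3.3, which matches the roughly threefold gap between the LOD and LOQ ranges reported in the abstract.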

Keywords: drug facilitated crime, fabric phase sorptive extraction, food forensics, white analytical chemistry

Procedia PDF Downloads 60
30930 Singular Perturbed Vector Field Method Applied to the Problem of Thermal Explosion of Polydisperse Fuel Spray

Authors: Ophir Nave

Abstract:

In our research, we present the concept of the singularly perturbed vector field (SPVF) method and its application to the thermal explosion of diesel spray combustion. Given a system of governing equations containing hidden multi-scale variables, the SPVF method transforms and decomposes the system into fast and slow singularly perturbed subsystems (SPS). This enables us to understand the complex system and simplify the calculations; powerful analytical, numerical, and asymptotic methods (e.g., the method of integral (invariant) manifolds (MIM), the homotopy analysis method (HAM), etc.) can then be applied to each subsystem. The method is based on a numerical algorithm for global quasi-linearization applied to a given physical model. We compare the results obtained by the method of integral invariant manifolds and by SPVF applied to a spray droplet combustion model, and our results were compared with experimental data. The SPVF method has been applied successfully to combustion processes; it is a general numerical and asymptotic method that reveals the hierarchy (multi-scale structure) of a given system.
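The fast/slow hierarchy that SPVF exposes can be illustrated on a toy linear system: a large gap between the Jacobian eigenvalues signals timescale separation and justifies decomposing the system into fast and slow subsystems. The 2×2 matrix below is an invented example, not the spray-combustion model.

```python
import math

# Toy illustration of timescale separation: for x' = A x, a large gap
# between the eigenvalues of A indicates a singularly perturbed
# (multi-scale) structure. The matrix entries are made up.

def eigenvalues_2x2(a, b, c, d):
    """Eigenvalues of [[a, b], [c, d]] via the characteristic polynomial,
    returned as (larger, smaller)."""
    tr, det = a + d, a * d - b * c
    disc = math.sqrt(tr * tr - 4.0 * det)
    return (tr + disc) / 2.0, (tr - disc) / 2.0

# Weakly coupled slow and fast decay rates (illustrative only).
lam_slow, lam_fast = eigenvalues_2x2(-1.0, 0.5, 0.5, -100.0)
# A ratio ~100 here: the fast mode relaxes two orders of magnitude
# sooner, so it can be treated as a separate singularly perturbed subsystem.
timescale_ratio = lam_fast / lam_slow
```

In the nonlinear setting, SPVF performs the analogous split numerically via global quasi-linearization rather than from an explicit Jacobian.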

Keywords: polydisperse spray, model reduction, asymptotic analysis, multi-scale systems

Procedia PDF Downloads 213
30929 An Optimization Model for Maximum Clique Problem Based on Semidefinite Programming

Authors: Derkaoui Orkia, Lehireche Ahmed

Abstract:

This article explores the potential of a powerful optimization technique, namely semidefinite programming, for solving NP-hard problems. This approach provides tight relaxations of combinatorial and quadratic problems. In this work, we address the maximum clique problem using such a relaxation. The clique problem is the computational problem of finding cliques in a graph and is widely acknowledged for its many applications to real-world problems. We implement a primal-dual interior-point algorithm to solve the semidefinite relaxation of the problem, which, unlike the original NP-hard problem, can be solved in polynomial time. The numerical results show that this relaxation yields good approximations of the maximum clique in polynomial time.
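For reference, the sketch below states the underlying problem that the SDP relaxation bounds, via a brute-force maximum-clique search on a toy graph. This is deliberately the exponential-time exact formulation, not the polynomial-time semidefinite relaxation or the interior-point solver the abstract describes.

```python
from itertools import combinations

# Exact (exponential-time) maximum-clique search, shown only to define
# the problem that the SDP relaxation approximates. Toy graph only.

def is_clique(adj, nodes):
    """True if every pair of nodes is connected in the adjacency map."""
    return all(v in adj[u] for u, v in combinations(nodes, 2))

def max_clique(adj):
    """Return one maximum clique by decreasing-size enumeration."""
    vertices = list(adj)
    for size in range(len(vertices), 0, -1):
        for subset in combinations(vertices, size):
            if is_clique(adj, subset):
                return set(subset)
    return set()

# Toy graph: triangle {0, 1, 2} plus a pendant edge 2-3.
adj = {0: {1, 2}, 1: {0, 2}, 2: {0, 1, 3}, 3: {2}}
clique = max_clique(adj)   # the triangle {0, 1, 2}
```

An SDP relaxation (e.g., of Lovász theta type) replaces this combinatorial search with a matrix optimization whose optimum upper-bounds the clique number and is solvable in polynomial time by interior-point methods.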

Keywords: semidefinite programming, maximum clique problem, primal-dual interior point method, relaxation

Procedia PDF Downloads 215
30928 Compartmental Model Approach for Dosimetric Calculations of ¹⁷⁷Lu-DOTATOC in Adenocarcinoma Breast Cancer Based on Animal Data

Authors: M. S. Mousavi-Daramoroudi, H. Yousefnia, S. Zolghadri, F. Abbasi-Davani

Abstract:

Dosimetry is an indispensable and precious factor in patient treatment planning, used to minimize the absorbed dose in vital tissues. In this study, in accordance with the favorable characteristics of DOTATOC and ¹⁷⁷Lu, ¹⁷⁷Lu-DOTATOC was prepared under optimal conditions for the first time in Iran, and the radionuclidic and radiochemical purity of the solution was investigated using an HPGe spectrometer and the ITLC method, respectively. The biodistribution of the compound was assayed for the treatment of adenocarcinoma breast cancer in tumor-bearing BALB/c mice. The results demonstrate that ¹⁷⁷Lu-DOTATOC is a promising candidate for therapy of these tumors. Because of the vital role of internal dosimetry before and during therapy, efforts to improve the accuracy and speed of dosimetric calculations are necessary. For this reason, a new approach was developed to calculate the absorbed dose by combining a compartmental model, animal dosimetry, and animal-to-human extrapolation with the MIRD method. Because the compartmental model is based on experimental data, this approach may increase the accuracy of the dosimetric results.
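As a hedged sketch of the MIRD-style calculation referenced above: the absorbed dose to a target is D = Ã × S, where Ã is the time-integrated activity (here from a biexponential clearance fit, a common compartmental-model output) and S is a radionuclide- and geometry-specific S-value. The clearance parameters and S-value below are hypothetical, not the study's data; only the ¹⁷⁷Lu physical half-life is a real constant.

```python
import math

# MIRD-style sketch: D = A_tilde * S. Biexponential parameters and the
# S-value are hypothetical placeholders; Lu-177 half-life is physical.

def time_integrated_activity(a1, lam1, a2, lam2):
    """Integral over [0, inf) of a1*exp(-lam1*t) + a2*exp(-lam2*t).
    The rate constants must include biological clearance AND physical decay."""
    return a1 / lam1 + a2 / lam2

LU177_HALF_LIFE_H = 6.647 * 24.0               # physical half-life, hours
lam_phys = math.log(2.0) / LU177_HALF_LIFE_H   # physical decay constant, 1/h

# Hypothetical tumor uptake (MBq) with fast and slow biological clearance.
lam_fast = lam_phys + math.log(2.0) / 8.0      # 8 h biological half-life
lam_slow = lam_phys + math.log(2.0) / 60.0     # 60 h biological half-life
a_tilde = time_integrated_activity(3.0, lam_fast, 1.0, lam_slow)  # MBq*h

S_VALUE = 0.05                                 # hypothetical mGy/(MBq*h)
dose_mgy = a_tilde * S_VALUE
```

In the study's approach, the compartmental model fitted to the animal biodistribution would supply the effective rate constants, which are then extrapolated to human organ masses before the MIRD dose computation.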

Keywords: ¹⁷⁷Lu-DOTATOC, biodistribution modeling, compartmental model, internal dosimetry

Procedia PDF Downloads 214
30927 The Usage of Bridge Estimator for HEGY Seasonal Unit Root Tests

Authors: Huseyin Guler, Cigdem Kosar

Abstract:

The aim of this study is to propose the Bridge estimator for seasonal unit root tests. Seasonality is an important factor in many economic time series: some variables contain seasonal patterns, and forecasts that ignore important seasonal patterns have a high variance. Therefore, it is very important to treat seasonality properly in seasonal macroeconomic data. There are several methods to eliminate the impacts of seasonality in time series. One of them is filtering the data; however, this method leads to undesired consequences in unit root tests, especially if the data are generated by a stochastic seasonal process. Another method is using seasonal dummy variables. Some seasonal patterns result from stationary seasonal processes, which can be modelled using seasonal dummies, but if the seasonal pattern varies and changes over time, so that the seasonal process is non-stationary, deterministic seasonal dummies are inadequate to capture it, and it is not suitable to use them for modelling such seasonally non-stationary series. Instead, it is necessary to take seasonal differences if there are seasonal unit roots in the series. Alternative methods have been proposed in the literature to test for seasonal unit roots, such as the Dickey, Hasza, Fuller (DHF) and Hylleberg, Engle, Granger, Yoo (HEGY) tests. The HEGY test can also be used to test for seasonal unit roots at different frequencies (monthly, quarterly, and semiannual). Another issue in unit root tests is lag selection. Lagged dependent variables are added to the model in seasonal unit root tests, as in ordinary unit root tests, to overcome the autocorrelation problem. In this case, it is necessary first to choose the lag length and determine any deterministic components (i.e., a constant and trend), and then use the proper model to test for seasonal unit roots. However, this two-step procedure might lead to size distortions and a lack of power in seasonal unit root tests.
Recent studies show that Bridge estimators are good at selecting the optimal lag length while differentiating nonstationary from stationary models for nonseasonal data. The advantage of this estimator is the elimination of the two-step nature of conventional unit root tests, which leads to a gain in size and power. In this paper, the Bridge estimator is proposed to test seasonal unit roots in a HEGY model. A Monte Carlo experiment is conducted to determine the efficiency of this approach and to compare its size and power with those of the HEGY test. Since the Bridge estimator performs well in model selection, our approach may lead to some gain in terms of size and power over the HEGY test.
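To make the HEGY framework concrete, the sketch below builds the standard HEGY auxiliary regressors for a quarterly series: the test regresses the seasonal (fourth) difference on lagged transformations that isolate the zero-frequency, semiannual, and annual unit roots. The toy series is illustrative only; the Bridge-penalized estimation itself is not reproduced here.

```python
# Sketch of the HEGY auxiliary regressors for quarterly data. In a real
# test these series are trimmed to a common sample and the seasonal
# difference is regressed on their lags plus lagged dependent variables.

def hegy_regressors(y):
    """Return (y1, y2, y3, d4) for a quarterly series y.
    y1 = (1+L)(1+L^2)y carries the zero-frequency root,
    y2 = -(1-L)(1+L^2)y the semiannual root,
    y3 = -(1-L^2)y the annual (complex) roots,
    d4 = (1-L^4)y is the seasonal difference."""
    y1 = [y[t] + y[t - 1] + y[t - 2] + y[t - 3] for t in range(3, len(y))]
    y2 = [-(y[t] - y[t - 1] + y[t - 2] - y[t - 3]) for t in range(3, len(y))]
    y3 = [-(y[t] - y[t - 2]) for t in range(2, len(y))]
    d4 = [y[t] - y[t - 4] for t in range(4, len(y))]
    return y1, y2, y3, d4

# Toy quarterly series with an exactly repeating seasonal pattern, so
# the seasonal difference is identically zero.
y = [1.0, 2.0, 0.5, 3.0] * 5
y1, y2, y3, d4 = hegy_regressors(y)
```

The Bridge approach would then estimate this regression with an L_q penalty (0 < q < 1) on the lag coefficients, shrinking irrelevant lags exactly to zero and thereby folding lag selection into the test itself.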

Keywords: bridge estimators, HEGY test, model selection, seasonal unit root

Procedia PDF Downloads 332
30926 Phenols and Manganese Removal from Landfill Leachate and Municipal Waste Water Using the Constructed Wetland

Authors: Amin Mojiri, Lou Ziyang

Abstract:

A constructed wetland (CW) is a reasonable method to treat waste water. The current study was carried out to co-treat landfill leachate and domestic waste water using a CW system. Typha domingensis was transplanted into the CW, which contains two substrate layers of the adsorbents ZELIAC and zeolite. Response surface methodology and a central composite design were employed to evaluate the experimental data. Contact time (h) and leachate-to-waste-water mixing ratio (%; v/v) were selected as independent factors, and phenols and manganese removal were selected as dependent responses. At the optimum contact time (48.7 h) and leachate-to-waste-water mixing ratio (20.0%), the removal efficiencies of phenols and manganese were 90.5% and 89.4%, respectively.
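The response-surface step described above can be sketched as a second-order polynomial fit over the central composite design points. The design coordinates and response values below are synthetic placeholders, not the study's measurements, and the fit is shown in coded units.

```python
# Sketch of fitting a second-order response surface, as used with a
# central composite design (CCD); data are synthetic, in coded units.

def solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for col in range(n):
        pivot = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[pivot] = M[pivot], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def fit_surface(x1, x2, y):
    """Least squares for y = b0 + b1*x1 + b2*x2 + b3*x1^2 + b4*x2^2 + b5*x1*x2,
    via the normal equations X'X b = X'y."""
    rows = [[1.0, u, v, u * u, v * v, u * v] for u, v in zip(x1, x2)]
    XtX = [[sum(r[i] * r[j] for r in rows) for j in range(6)] for i in range(6)]
    Xty = [sum(r[i] * yi for r, yi in zip(rows, y)) for i in range(6)]
    return solve(XtX, Xty)

# Hypothetical two-factor CCD: x1 = contact time, x2 = mixing ratio
# (4 factorial, 3 center, 4 axial points).
x1 = [-1, -1, 1, 1, 0, 0, 0, -1.414, 1.414, 0, 0]
x2 = [-1, 1, -1, 1, 0, 0, 0, 0, 0, -1.414, 1.414]
# Noiseless synthetic response from a known quadratic surface.
y = [90 + 3 * u + 2 * v - 4 * u * u - 3 * v * v + 1.5 * u * v
     for u, v in zip(x1, x2)]
beta = fit_surface(x1, x2, y)
```

The fitted surface is then maximized over the factor region to locate optima like the 48.7 h contact time and 20.0% mixing ratio reported above.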

Keywords: constructed wetland, manganese, phenols, Typha domingensis

Procedia PDF Downloads 317