Search results for: stochastic averaging method
17119 Measurement of Convective Heat Transfer from a Vertical Flat Plate Using Mach-Zehnder Interferometer with Wedge Fringe Setting
Authors: Divya Haridas, C. B. Sobhan
Abstract:
Laser interferometric methods have been utilized for the measurement of natural convection heat transfer from a heated vertical flat plate in the investigation presented here. The study mainly aims at comparing two different fringe orientations in the wedge fringe setting of the Mach-Zehnder interferometer (MZI) used for the measurements. The interference fringes are set in horizontal and vertical orientations with respect to the heated surface, and two different fringe analysis methods, namely the stepping method and the method proposed by Naylor and Duarte, are used to obtain the heat transfer coefficients. The experimental system is benchmarked against theoretical results, thus validating its reliability in heat transfer measurements. The interference fringe patterns are analyzed digitally using the MATLAB 7 and MOTIC Plus software packages, which ensure improved efficiency in fringe analysis, hence reducing the errors associated with conventional fringe tracing. The work also discusses the relative merits and limitations of the two methods used. Keywords: Mach-Zehnder interferometer (MZI), natural convection, Naylor method, vertical flat plate
Procedia PDF Downloads 364
17118 Explicit Iterative Scheme for Approximating a Common Solution of Generalized Mixed Equilibrium Problem and Fixed Point Problem for a Nonexpansive Semigroup in Hilbert Space
Authors: Mohammad Farid
Abstract:
In this paper, we introduce and study an explicit iterative method based on the hybrid extragradient method to approximate a common solution of a generalized mixed equilibrium problem and a fixed point problem for a nonexpansive semigroup in Hilbert space. Further, we prove that the sequence generated by the proposed iterative scheme converges strongly to the common solution of the generalized mixed equilibrium problem and the fixed point problem for a nonexpansive semigroup. This common solution is the unique solution of a variational inequality problem and is the optimality condition for a minimization problem. The results presented in this paper supplement, extend, and generalize the previously known results in this area. Keywords: generalized mixed equilibrium problem, fixed-point problem, nonexpansive semigroup, variational inequality problem, iterative algorithms, hybrid extragradient method
Procedia PDF Downloads 475
17117 Strabismus Detection Using Eye Alignment Stability
Authors: Anoop T. R., Otman Basir, Robert F. Hess, Ben Thompson
Abstract:
Strabismus refers to a misalignment of the eyes. Early detection and treatment of strabismus in childhood can prevent the development of permanent vision loss due to abnormal development of visual brain areas. Currently, many children with strabismus remain undiagnosed until school entry because current automated screening methods have limited success in the preschool age range. A method for strabismus detection using eye alignment stability (EAS) is proposed. This method starts with face detection, followed by facial landmark detection, eye region segmentation, eye gaze extraction, and eye alignment stability estimation. Binarization and morphological operations are performed for segmenting the pupil region from the eye. After finding the EAS, its absolute value is used to differentiate the strabismic eye from the non-strabismic eye. If the value of the eye alignment stability is greater than a particular threshold, then the eyes are misaligned, and if its value is less than the threshold, the eyes are aligned. The method was tested on 175 strabismic and non-strabismic images obtained from Kaggle and Google Photos. The strabismic eye is taken as a positive class, and the non-strabismic eye is taken as a negative class. The test produced a true positive rate of 100% and a false positive rate of 7.69%. Keywords: strabismus, face detection, facial landmarks, eye segmentation, eye gaze, binarization
Procedia PDF Downloads 76
17116 Orbit Determination from Two Position Vectors Using Finite Difference Method
Authors: Akhilesh Kumar, Sathyanarayan G., Nirmala S.
Abstract:
An unusual approach is developed to determine the orbit of satellites/space objects. The determination of orbits is treated as a boundary value problem and has been solved using the finite difference method (FDM). Only the positions of the satellites/space objects are known at two end times, taken as boundary conditions. The finite difference technique has been used to calculate the orbit between the end times. In this approach, the governing equation is defined as the satellite's equation of motion with a perturbed acceleration. Using the finite difference method, the governing equations and boundary conditions are discretized. The resulting system of algebraic equations is solved using the Tri-Diagonal Matrix Algorithm (TDMA) until convergence is achieved. This methodology has been tested and evaluated using all GPS satellite orbits from the National Geospatial-Intelligence Agency (NGA) precise product for DOY 125, 2023. Towards this, twelve two-hour sets have been taken into consideration, and only the positions at the end times of each of the twelve sets are used as boundary conditions. The algorithm is applied to all GPS satellites, and the results achieved using FDM are compared with the NGA precise orbits. The maximum RSS error is 0.48 m for position and 0.43 mm/s for velocity. The algorithm is also applied to the IRNSS satellites for DOY 220, 2023; the maximum RSS error is 0.49 m for position and 0.28 mm/s for velocity. Next, a simulation has been done for a highly elliptical orbit for DOY 63, 2023, for a duration of 6 hours. The RSS of the difference is 0.92 m in position and 1.58 mm/s in velocity for orbital speeds above 5 km/s, whereas it is 0.13 m in position and 0.12 mm/s in velocity for orbital speeds below 5 km/s. Results show that the newly created method is reliable and accurate.
Further applications of the developed methodology include missile and spacecraft targeting, orbit design (mission planning), space rendezvous and interception, space debris correlation, and navigation solutions. Keywords: finite difference method, grid generation, NavIC system, orbit perturbation
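The TDMA step mentioned above is the standard Thomas algorithm for tridiagonal systems; a minimal sketch with illustrative inputs (the textbook algorithm, not the authors' actual discretization of the equations of motion):

```python
def tdma(a, b, c, d):
    """Thomas algorithm for a tridiagonal system.
    a: sub-diagonal (a[0] unused), b: main diagonal,
    c: super-diagonal (c[-1] unused), d: right-hand side."""
    n = len(d)
    cp = [0.0] * n  # modified super-diagonal
    dp = [0.0] * n  # modified right-hand side
    cp[0] = c[0] / b[0]
    dp[0] = d[0] / b[0]
    # forward elimination
    for i in range(1, n):
        m = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / m if i < n - 1 else 0.0
        dp[i] = (d[i] - a[i] * dp[i - 1]) / m
    # back substitution
    x = [0.0] * n
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x
```

The algorithm runs in O(n), which is what makes the iterative FDM sweep over many grid points practical.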
Procedia PDF Downloads 84
17115 Arithmetic Operations Based on Double Base Number Systems
Authors: K. Sanjayani, C. Saraswathy, S. Sreenivasan, S. Sudhahar, D. Suganya, K. S. Neelukumari, N. Vijayarangan
Abstract:
The Double Base Number System (DBNS) is an emerging system for representing a number using two bases, namely 2 and 3, with applications in Elliptic Curve Cryptography (ECC) and the Digital Signature Algorithm (DSA). The previous binary representation method used only base 2. DBNS uses an approximation algorithm, namely the greedy algorithm. By using this algorithm, the number of digits required to represent a large number is smaller than with the standard binary method based on base 2; hence the computational speed is increased and the computation time reduced. The standard binary method uses the binary digits 0 and 1 to represent a number, whereas the DBNS method uses the digit 1 alone to represent any number (canonical form). The greedy algorithm can represent the number in two ways: using only positive summands, or using both positive and negative summands. In this paper, these arithmetic operations are applied to elliptic curve cryptography. The elliptic curve discrete logarithm problem is the foundation of most day-to-day elliptic curve cryptography, and it appears to be considerably harder than the ordinary discrete logarithm problem. In the elliptic curve digital signature algorithm, key generation requires 160 bits of data using the standard binary representation, whereas the number of bits required to generate the key can be reduced with the double base number representation. In this paper, a new technique is proposed to generate the key during encryption and extract the key during decryption. Keywords: cryptography, double base number system, elliptic curve cryptography, elliptic curve digital signature algorithm
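The greedy double-base decomposition described above (positive summands only) can be sketched as follows; this illustrates the general algorithm, not the authors' implementation:

```python
def greedy_dbns(n):
    """Greedy double-base decomposition: express n as a sum of
    terms of the form 2^a * 3^b, largest term first."""
    terms = []
    while n > 0:
        # find the largest 2^a * 3^b not exceeding n
        best = 1
        b_pow = 1
        while b_pow <= n:
            a_pow = b_pow
            while a_pow <= n:
                if a_pow > best:
                    best = a_pow
                a_pow *= 2
            b_pow *= 3
        terms.append(best)
        n -= best
    return terms
```

For example, 127 needs seven digits in binary but only three double-base terms (108 + 18 + 1), which is the source of the speed-up claimed above.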
Procedia PDF Downloads 396
17114 Determination and Evaluation of the Need of Land Consolidation for Nationalization Purpose with the Survey Results
Authors: Turgut Ayten, Tayfun Çay, Demet Ayten
Abstract:
In this research, the nationalization method for obtaining land along the route of the Ankara-Konya high-speed train in Turkey is investigated, together with land consolidation for nationalization purposes as an alternative solution for obtaining land. A survey prepared for landowners whose lands were nationalized and for officials of the institutions that carry out nationalization and land consolidation was applied, and the need for land consolidation for nationalization purposes is put forth. The study area is located in the Kolukısa and Sarikaya neighbourhoods of the Kadınhanı district of Konya, Turkey, and the land consolidation results of the selected field on the high-speed train route were obtained. The data obtained were shared with the landowners in the research area, who were asked to choose between the nationalization method and land consolidation for nationalization. In addition, the officials of the organizations primarily used by the state for obtaining the land needed for state investments, and the officials who carry out land consolidation, were questioned on the efficiency of the methods they use and on whether they have tried different methods. Keywords: nationalization, land consolidation, land consolidation for nationalization
Procedia PDF Downloads 324
17113 Analysis of Histogram Asymmetry for Waste Recognition
Authors: Janusz Bobulski, Kamila Pasternak
Abstract:
Despite many years of effort and research, the problem of waste management is still current. So far, no fully effective waste management system has been developed. Many programs and projects improve the statistics on the percentage of waste recycled every year. In these efforts, it is worth using modern computer vision techniques supported by artificial intelligence. In this article, we present a method of identifying plastic waste based on the asymmetry analysis of the histogram of the image containing the waste. The method is simple but effective (94% accuracy), which allows it to be implemented on devices with low computing power, in particular on microcomputers. Such devices can be used both at home and in waste sorting plants. Keywords: waste management, environmental protection, image processing, computer vision
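One simple way to quantify histogram asymmetry is the skewness (third standardized moment) of the intensity distribution; the sketch below is illustrative only, as the article does not specify its exact asymmetry measure:

```python
def histogram_skewness(pixels):
    """Skewness of a pixel-intensity sample: positive when the
    histogram has a longer right tail, negative for a left tail."""
    n = len(pixels)
    mean = sum(pixels) / n
    var = sum((p - mean) ** 2 for p in pixels) / n
    if var == 0:
        return 0.0  # constant image: no asymmetry
    m3 = sum((p - mean) ** 3 for p in pixels) / n
    return m3 / var ** 1.5
```

A classifier of the kind described would then compare such an asymmetry score against a threshold learned from labeled waste images.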
Procedia PDF Downloads 119
17112 The Transport of Radical Species to Single and Double Strand Breaks in the Liver’s DNA Molecule by a Hybrid Method of Type Monte Carlo - Diffusion Equation
Abstract:
The therapeutic utility of certain Auger emitters such as iodine-125 depends on their position within the cell nucleus. For diagnostic use, however, and to keep cell damage as low as possible, it is preferable to have the radionuclide localized outside the cell, or at least outside the nucleus. One solution to this problem is to consider markers capable of conveying anticancer drugs to the tumor site regardless of its location within the human body. The objective of this study is to simulate the impact of a complex such as bleomycin on single and double strand breaks in the DNA molecule. The simulation consists of the following operations: construction of the BLM-Fe-DNA complex; simulation of the transport of electrons from the excitation of the metastable state of Fe-57 by the Monte Carlo method; and treatment of the chemical reactions in the considered environment by the diffusion equation. For the physical, physico-chemical, and finally chemical steps, the geometry of the complex is modeled as a sphere of 50 nm centered on the binding site, and the mathematical method used is a step-by-step approach based on Monte Carlo codes. Keywords: concentration, yield, radical species, bleomycin, excitation, DNA
Procedia PDF Downloads 457
17111 Variational Evolutionary Splines for Solving a Model of Temporomandibular Disorders
Authors: Alberto Hananel
Abstract:
The aim of this work is to model the occlusion of a person with temporomandibular disorders as an evolutionary equation and to approximate its solution by the construction and characterization of discrete variational splines. To formulate the problem, certain boundary conditions have been considered. After showing the existence and uniqueness of the solution of such a problem, a convergence result for a discrete variational evolutionary spline is shown. A stress analysis of the occlusion of a human jaw with temporomandibular disorders by finite elements is carried out in FreeFem++ in order to prove the validity of the presented method. Keywords: approximation, evolutionary PDE, Finite Element Method, temporomandibular disorders, variational spline
Procedia PDF Downloads 378
17110 Direct Strength Method Approach for Indian Cold Formed Steel Sections with and Without Perforation for Compression Member
Authors: K. Raghu, Altafhusen P. Pinjar
Abstract:
Cold-formed steel sections are extensively used in industrial and many other non-industrial constructions worldwide; they are a relatively new concept in India. Cold-formed steel sections have been developed as more economical building solutions, as alternatives to the heavier hot-rolled sections, in the commercial and residential markets. Cold-formed steel (CFS) structural members are commonly manufactured with perforations to accommodate plumbing, electrical, and heating conduits in the walls and ceilings of buildings. Current design methods available to engineers for predicting the strength of CFS members with perforations are prescriptive and limited to specific perforation locations, spacings, and sizes. The Direct Strength Method (DSM), a relatively new design method for CFS members validated for members with and without perforations, predicts the ultimate strength of general CFS members from the elastic buckling properties of the member cross section. The design compression strength and flexural strength of Indian standard sections (IS 811-1987) are calculated as per the North American Specification (AISI-S100 2007) and the software CUFSM 4.05. Keywords: direct strength, cold formed, perforations, CUFSM
Procedia PDF Downloads 379
17109 Comparison of Analytical Method and Software for Analysis of Flat Slab Subjected to Various Parametric Loadings
Authors: Hema V. Vanar, R. K. Soni, N. D. Shah
Abstract:
Slabs supported directly on columns without beams are known as flat slabs. Flat slabs are highly versatile elements widely used in construction, providing minimum depth and fast construction and allowing flexible column grids. The main objective of this thesis is the comparison of an analytical method and software for the analysis of flat slabs subjected to various parametric loadings. The study presents the analysis of a flat slab performed under different types of gravity loading. Keywords: flat slab, parametric load, analysis, software
Procedia PDF Downloads 493
17108 Enhancing the Network Security with Gray Code
Authors: Thomas Adi Purnomo Sidhi
Abstract:
Nowadays, the network is an essential need in almost every part of human daily activity. People can now seamlessly connect to others through the Internet. With advanced technology, our personal data can be accessed more easily than ever, so one of the main concerns in delivering the best network is security. This paper proposes a method that provides more options for security. The research aims to improve network security by focusing on the physical layer, the first layer of the OSI model, which consists of the basic networking hardware transmission technologies of a network. Using an observational method, the research produces a schematic design for enhancing network security through a gray code converter. Keywords: network, network security, grey code, physical layer
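The standard reflected binary Gray code conversion, which such a converter would implement in hardware, can be sketched as follows (an illustration of the textbook conversion, not necessarily the authors' schematic design):

```python
def binary_to_gray(n):
    """Reflected binary Gray code: adjacent integers differ in one bit."""
    return n ^ (n >> 1)

def gray_to_binary(g):
    """Inverse conversion: XOR-fold the shifted value back down."""
    mask = g >> 1
    while mask:
        g ^= mask
        mask >>= 1
    return g
```

The single-bit-change property between consecutive codewords is what makes Gray coding attractive at the physical layer, since it limits the effect of a transition error to one bit.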
Procedia PDF Downloads 504
17107 Parametric Approach for Reserve Liability Estimate in Mortgage Insurance
Authors: Rajinder Singh, Ram Valluru
Abstract:
The Chain Ladder (CL) method, the Expected Loss Ratio (ELR) method, and the Bornhuetter-Ferguson (BF) method, in addition to more complex transition-rate modeling, are commonly used actuarial reserving methods in general insurance. There is limited published research about their relative performance in the context of Mortgage Insurance (MI). In our experience, these traditional techniques pose unique challenges and do not provide stable claim estimates for medium to longer term liabilities. The relative strengths and weaknesses among the alternative approaches revolve around stability in the recent loss development pattern, sufficiency and reliability of loss development data, and agreement or disagreement between reported losses to date and the ultimate loss estimate. The CL method results in volatile reserve estimates, especially for accident periods with little development experience. The ELR method breaks down especially when ultimate loss ratios are not stable and predictable. While the BF method provides a good tradeoff between the loss development approach (CL) and ELR, it generates claim development and ultimate reserves that are disconnected from the ever-to-date (ETD) development experience for some accident years that have more development experience; further, BF is based on a subjective a priori assumption. The fundamental shortcoming of these methods is their inability to model exogenous factors, like the economy, which impact various cohorts at the same chronological time but at staggered points along their lifetime development. This paper proposes an alternative approach of parametrizing the loss development curve and using logistic regression to generate the ultimate loss estimate for each homogeneous group (accident year or delinquency period).
The methodology was tested on an actual MI claim development dataset where various cohorts followed a sigmoidal trend, but levels varied substantially depending upon the economic and operational conditions during the development period spanning over many years. The proposed approach provides the ability to indirectly incorporate such exogenous factors and produce more stable loss forecasts for reserving purposes as compared to the traditional CL and BF methods. Keywords: actuarial loss reserving techniques, logistic regression, parametric function, volatility
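A sigmoidal development curve of the kind described can be illustrated with a logistic form fitted by logit-transform least squares; this sketch assumes development fractions of a known ultimate level and is not the paper's actual parametrization:

```python
import math

def fit_logistic(times, cum_fractions):
    """Fit p(t) = 1 / (1 + exp(-(a + b*t))) by ordinary least squares
    on the logit transform log(p / (1 - p)), which is linear in t."""
    ys = [math.log(p / (1 - p)) for p in cum_fractions]
    n = len(times)
    mx = sum(times) / n
    my = sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(times, ys)) \
        / sum((x - mx) ** 2 for x in times)
    a = my - b * mx
    return a, b

def logistic(t, a, b):
    """Cumulative development fraction at time t under the fitted curve."""
    return 1.0 / (1.0 + math.exp(-(a + b * t)))
```

Given the fitted curve, the ultimate loss estimate for a cohort follows from dividing its ever-to-date losses by the fitted development fraction at the current maturity.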
Procedia PDF Downloads 130
17106 STC Parameters versus Real Time Measured Parameters to Determine Cost Effectiveness of PV Panels
Authors: V. E. Selaule, R. M. Schoeman, H. C. Z. Pienaar
Abstract:
Research has shown that solar energy is the renewable energy resource with the most potential when compared to other renewable energy resources in South Africa. There are many makes of photovoltaic (PV) panels on the market, and it is difficult to assess which to use. PV panel manufacturers use Standard Test Conditions (STC) to rate their PV panels, but STC conditions differ from the actual operating environmental conditions where the PV panels are used. This paper describes a practical method to determine the most cost-effective available PV panel. The method shows that PV panel manufacturer STC ratings cannot be used to select a cost-effective PV panel. Keywords: PV orientation, PV panel, PV STC, solar energy
Procedia PDF Downloads 472
17105 Maintaining User-Level Security in Short Message Service
Authors: T. Arudchelvam, W. W. E. N. Fernando
Abstract:
The mobile phone has become an essential item in our lives, and security is therefore the most important consideration in mobile communication. The short message service is the cheapest way of communicating via mobile phones, so security is very important in the short message service as well. This paper presents a method to maintain security at the user level. Different types of encryption methods are used to implement user-level security in mobile phones: the Caesar cipher, Rail Fence, the Vigenere cipher, and RSA are used as encryption methods in this work. The Caesar cipher and Rail Fence methods are enhanced and implemented. The strength of this work is that the user can select the encryption method and the key; by changing the encryption method and the key from time to time, the user can ensure the security of messages. With this work, users can safely send and receive messages, and can also protect the information in their own mobile phones from unauthorised and unwanted people. Keywords: SMS, user level security, encryption, decryption, short message service, mobile communication
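The textbook (un-enhanced) forms of two of the ciphers named above can be sketched as follows; the paper's enhanced variants are not specified, so this is illustrative only:

```python
def caesar_encrypt(text, key):
    """Classic Caesar cipher: shift each letter by `key` positions."""
    out = []
    for ch in text:
        if ch.isalpha():
            base = ord("A") if ch.isupper() else ord("a")
            out.append(chr((ord(ch) - base + key) % 26 + base))
        else:
            out.append(ch)  # non-letters pass through unchanged
    return "".join(out)

def rail_fence_encrypt(text, rails):
    """Classic Rail Fence: write the text in a zigzag over `rails`
    rows, then read the rows off left to right."""
    rows = [[] for _ in range(rails)]
    row, step = 0, 1
    for ch in text:
        rows[row].append(ch)
        if row == 0:
            step = 1
        elif row == rails - 1:
            step = -1
        row += step
    return "".join("".join(r) for r in rows)
```

Decryption in the Caesar case is simply encryption with the negated key, which is one reason such ciphers are only suitable as lightweight user-level obfuscation rather than strong cryptography.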
Procedia PDF Downloads 396
17104 Factors Associated with Weight Loss Maintenance after an Intervention Program
Authors: Filipa Cortez, Vanessa Pereira
Abstract:
Introduction: The main challenge of obesity treatment is long-term weight loss maintenance. The 3 phases method is a weight loss program that combines a low-carb, moderately high-protein diet, food supplements, and a weekly one-to-one consultation with a certified nutritionist. Sustained weight control is the ultimate goal of phase 3. The success criterion was a minimum loss of 10% of initial weight and its maintenance after 12 months. Objective: The aim of this study was to identify factors associated with successful weight loss maintenance 12 months after the end of the 3 phases method. Methods: The study included 199 subjects who achieved their weight loss goal (phase 3). Weight and body mass index (BMI) were obtained at baseline and every week until the end of the program. Therapeutic adherence was measured weekly on a Likert scale from 1 to 5. Subjects were considered in compliance with the nutritional recommendations and supplementation when their classification was ≥ 4. Twelve months after the end of the method, the current weight and the number of previous weight-loss attempts were collected by telephone interview. Statistical significance was assumed at p-values < 0.05, and statistical analyses were performed using SPSS software v.21. Results: 65.3% of subjects met the success criterion. The factors that significantly predicted weight loss maintenance were a greater initial percentage weight loss during the weight loss intervention (OR=1.44) and a higher number of consultations in phase 3 (OR=1.10). Conclusion: These findings suggest that the percentage weight loss during the weight loss intervention and the number of consultations in phase 3 may facilitate maintenance of weight loss after the 3 phases method. Keywords: obesity, weight maintenance, low-carbohydrate diet, dietary supplements
Procedia PDF Downloads 150
17103 Use of Quasi-3D Inversion of VES Data Based on Lateral Constraints to Characterize the Aquifer and Mining Sites of an Area Located in the North-East of Figuil, North Cameroon
Authors: Fofie Kokea Ariane Darolle, Gouet Daniel Hervé, Koumetio Fidèle, Yemele David
Abstract:
The electrical resistivity method is successfully used in this paper in order to obtain a clearer picture of the subsurface of the area north-east of Figuil in northern Cameroon. It is worth noting that this method is most often used when the objective of the study is to image shallow subsoils by considering them as a set of stratified ground layers. The problem to be solved is very often environmental, and in this case it is necessary to perform an inversion of the data in order to obtain a complete and accurate picture of the parameters of the said layers. In this work, thirty-three (33) Schlumberger VES were carried out on an irregular grid to investigate the subsurface of the study area. The 1D inversion, applied as a preliminary modeling tool and in correlation with the results of mechanical drilling, indicates a complex subsurface lithology distribution mainly consisting of marbles and schists. Moreover, the quasi-3D inversion with lateral constraints shows that the misfit between the observed field data and the model response is acceptable, with a value lower than 10%. The method also reveals the existence of two water-bearing formations in the considered area: the first is the schist or weathering aquifer (unsuitable), and the other is the marble or fracturing aquifer (suitable). The final quasi-3D inversion results and geological models indicate proper sites for groundwater prospecting and for mining exploitation, thus allowing the economic development of the study area. Keywords: electrical resistivity method, 1D inversion, quasi 3D inversion, groundwaters, mining
Procedia PDF Downloads 155
17102 Laban Movement Analysis Using Kinect
Authors: Bernstein Ran, Shafir Tal, Tsachor Rachelle, Studd Karen, Schuster Assaf
Abstract:
Laban Movement Analysis (LMA), developed in the dance community over the past seventy years, is an effective method for observing, describing, notating, and interpreting human movement to enhance communication and expression in everyday and professional life. Many applications that use motion capture data could be significantly enhanced if the Laban qualities were recognized automatically. This paper presents an automated recognition method for Laban qualities from motion capture skeletal recordings, demonstrated on the output of Microsoft’s Kinect V2 sensor. Keywords: Laban movement analysis, multitask learning, Kinect sensor, machine learning
Procedia PDF Downloads 341
17101 Potential of Mineral Composition Reconstruction for Monitoring the Performance of an Iron Ore Concentration Plant
Authors: Maryam Sadeghi, Claude Bazin, Daniel Hodouin, Laura Perez Barnuevo
Abstract:
The performance of a separation process is usually evaluated using performance indices calculated from elemental assays readily available from the chemical analysis laboratory. However, the separation process performance is essentially related to the properties of the minerals that carry the elements, not to those of the elements themselves. Since elements or metals can be carried by both valuable and gangue minerals in the ore, and since each mineral responds differently to a mineral processing method, the use of elemental assays alone could lead to erroneous or uncertain conclusions about process performance. This paper discusses the advantages of using performance indices calculated from mineral contents, such as mineral recoveries, for process performance assessment. A method is presented that uses elemental assays to estimate the mineral content of the solids in the various process streams. The method combines the stoichiometric composition of the minerals and mass conservation constraints for the minerals through the concentration process to estimate the mineral contents from elemental assays. The advantage of assessing a concentration process using mineral-based performance indices is illustrated for an iron ore concentration circuit. Keywords: data reconciliation, iron ore concentration, mineral composition, process performance assessment
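The element-to-mineral conversion amounts to solving a small linear system relating assays to mineral fractions; below is a minimal least-squares sketch in which the two minerals, their compositions, and the assay values are assumptions chosen for the example, not the paper's data:

```python
import numpy as np

# Stoichiometric matrix: rows = elements (Fe, Si),
# columns = minerals (hematite Fe2O3, quartz SiO2),
# entries = wt% of the element in each mineral.
A = np.array([[69.9, 0.0],
              [0.0, 46.7]])

# Measured elemental assays of one stream (wt% Fe, wt% Si).
assays = np.array([34.95, 23.35])

# Least-squares estimate of the mineral mass fractions of the stream.
fractions, *_ = np.linalg.lstsq(A, assays, rcond=None)
```

With more elements than minerals the system is overdetermined and least squares acts as a simple data reconciliation; the paper's method additionally imposes mineral mass conservation across the circuit.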
Procedia PDF Downloads 218
17100 Three-Dimensional Computer Graphical Demonstration of Calcified Tissue and Its Clinical Significance
Authors: Itsuo Yokoyama, Rikako Kikuti, Miti Sekikawa, Tosinori Asai, Sarai Tsuyoshi
Abstract:
Introduction: Vascular access for hemodialysis therapy is often difficult, even for experienced medical personnel. Ultrasound-guided needle placement has been performed occasionally but is not always helpful in cases with complicated vascular anatomy. Obtaining precise anatomical knowledge of the vascular structure is important to prevent access-related complications. With an augmented reality (AR) device such as AR glasses, the virtual vascular structure is shown superimposed on the actual patient vessels, enabling the operator to maneuver catheter placement easily with both hands free. We herein report our method of AR-guided vascular access in dialysis treatment. Methods: A three-dimensional (3D) object of the arm with the arteriovenous fistula is created by computer graphics with 3D software from data obtained by computed tomography, ultrasound echography, and an image scanner. The 3D vascular object thus created is viewed on the screen of the AR digital display device (such as AR glasses or an iPad). The picture of the vascular anatomical structure becomes visible, superimposed over the real patient’s arm, so that the needle insertion can be performed under the guidance of AR visualization with ease. By this method, the technical difficulty of catheter placement for dialysis can be lessened and the procedure performed safely. Considerations: Virtual reality technology has been applied in various fields, and medical use is no exception, yet AR devices have not been widely used among medical professionals. Visualization of the virtual vascular object can be achieved by the creation of an accurate three-dimensional object with the help of computer graphical techniques. Although our experience is limited, this method is applicable with relative ease, and our accumulating evidence suggests that vascular access with the use of AR can be promising. Keywords: abdominal-aorta, calcification, extraskeletal, dialysis, computer graphics, 3DCG, CT, calcium, phosphorus
Procedia PDF Downloads 163
17099 Intrusion Detection System Using Linear Discriminant Analysis
Authors: Zyad Elkhadir, Khalid Chougdali, Mohammed Benattou
Abstract:
Most existing intrusion detection systems work on quantitative network traffic data with many irrelevant and redundant features, which makes the detection process more time-consuming and inaccurate. Several feature extraction methods, such as linear discriminant analysis (LDA), have been proposed. However, LDA suffers from the small sample size (SSS) problem, which occurs when the number of training samples is small compared with the sample dimension; hence, classical LDA cannot be applied directly to high-dimensional data such as network traffic data. In this paper, we propose two solutions to the SSS problem for LDA and apply them to a network IDS. The first method reduces the dimension of the original data using principal component analysis (PCA) and then applies LDA. In the second solution, we propose to use the pseudo-inverse to avoid the singularity of the within-class scatter matrix due to the SSS problem. After that, the KNN algorithm is used for the classification process. We have chosen two well-known datasets, KDDcup99 and NSL-KDD, for testing the proposed approaches. Results showed that the classification accuracy of the (PCA+LDA) method clearly outperforms the pseudo-inverse LDA method when we have large training data. Keywords: LDA, pseudo-inverse, PCA, IDS, NSL-KDD, KDDcup99
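The second solution can be sketched by substituting the Moore-Penrose pseudo-inverse for the ordinary inverse of the within-class scatter matrix; this is a generic illustration of the idea, not the authors' exact pipeline (which also includes KNN classification):

```python
import numpy as np

def lda_pinv(X, y, n_components=1):
    """LDA projection using the pseudo-inverse of the within-class
    scatter matrix S_w, so it still works when S_w is singular
    (the small sample size problem)."""
    classes = np.unique(y)
    mean_all = X.mean(axis=0)
    d = X.shape[1]
    Sw = np.zeros((d, d))  # within-class scatter
    Sb = np.zeros((d, d))  # between-class scatter
    for c in classes:
        Xc = X[y == c]
        mc = Xc.mean(axis=0)
        Sw += (Xc - mc).T @ (Xc - mc)
        diff = (mc - mean_all).reshape(-1, 1)
        Sb += len(Xc) * diff @ diff.T
    # pseudo-inverse in place of inv(Sw) avoids the singularity
    eigvals, eigvecs = np.linalg.eig(np.linalg.pinv(Sw) @ Sb)
    order = np.argsort(-eigvals.real)
    return eigvecs.real[:, order[:n_components]]
```

Projecting the traffic features onto the returned directions and then applying a KNN classifier, as in the paper, completes the detection pipeline.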
Procedia PDF Downloads 226
17098 Dynamic Test for Sway-Mode Buckling of Columns
Authors: Boris Blostotsky, Elia Efraim
Abstract:
Testing of columns in sway mode is performed in order to determine the maximal allowable load, limited by plastic deformations of the columns or their end connections, and the critical load, limited by column stability. The motivation for determining an accurate value of the critical force is its use as follows: the critical load is the maximal allowable load for a given column configuration and can be used as a criterion of perfection; it is used in the calculations prescribed by standards for the design of structural elements under the combined action of compression and bending; and it is used for the verification of theoretical stability analyses at various end conditions of columns. In the present work, a new non-destructive method for the determination of the critical buckling load of columns in sway mode is proposed. The method allows performing measurements during tests under loads that exceed the column's critical load without loss of stability. The possibility of such loading is achieved by the structure of the loading system, which is built as a frame with a rigid girder in which one column is the tested column and the other is an additional two-hinged strut. Loading of the frame is carried out by a flexible traction element attached to the girder, and the load applied to the tested column can achieve values that exceed the critical load by a suitable choice of the parameters of the traction element and the additional strut. The lateral stiffness of the system and the critical load of the column are obtained by the dynamic method. The experiment planning and the comparison between the experimental and theoretical values were performed based on the developed dependency of the lateral stiffness of the system on the vertical load, taking into account the semi-rigid connections of the column's ends, and agreement between the results was established. The method can be used for testing real full-size columns in industrial conditions. Keywords: buckling, columns, dynamic method, semi-rigid connections, sway mode
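If the lateral stiffness of the system is assumed to fall linearly with the vertical load, K(P) = K0(1 − P/Pcr), the critical load can be extrapolated from a few stiffness measurements; the linear form here is an idealization for illustration, not the paper's developed dependency (which accounts for semi-rigid connections):

```python
def critical_load_from_stiffness(loads, stiffnesses):
    """Least-squares fit of K = a + b*P; under the assumed relation
    K = K0 * (1 - P / Pcr) we have a = K0 and b = -K0 / Pcr,
    so the critical load is Pcr = -a / b."""
    n = len(loads)
    mp = sum(loads) / n
    mk = sum(stiffnesses) / n
    b = sum((p - mp) * (k - mk) for p, k in zip(loads, stiffnesses)) \
        / sum((p - mp) ** 2 for p in loads)
    a = mk - b * mp
    return -a / b
```

In a dynamic test, each stiffness value would itself be inferred from a measured natural frequency of the loaded frame rather than from static deflection.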
Procedia PDF Downloads 313
17097 RBS Characteristic of Cd1−xZnxS Thin Film Fabricated by Vacuum Deposition Method
Authors: N. Dahbi, D. E. Arafah
Abstract:
Cd1−xZnxS thin films have been fabricated from ZnS/CdS/ZnS multilayer thin-film systems by the vacuum deposition method, and the Rutherford backscattering (RBS) technique has been applied in order to determine the structure, composition, depth profile, and stoichiometry of these films. The influence of chemical and heat treatments on the produced films has also been investigated. The RBS spectra of the films showed that homogeneous Cd1−xZnxS can be synthesized with x = 0.45.Keywords: Cd1−xZnxS, chemical treatment, depth profile, heat treatment, RBS, RUMP simulation, thin film, vacuum deposition, ZnS/CdS/ZnS
Procedia PDF Downloads 221
17096 Digital Structural Monitoring Tools @ADaPT for Cracks Initiation and Growth due to Mechanical Damage Mechanism
Authors: Faizul Azly Abd Dzubir, Muhammad F. Othman
Abstract:
Conventional structural health monitoring for mechanical equipment uses inspection data from non-destructive testing (NDT) during plant shutdown windows, together with fitness-for-service evaluation, to estimate the integrity of equipment that is prone to crack damage. This forecast is fraught with uncertainty, however, because it is often based on assumptions about future operating parameters, and the prediction is neither continuous nor online. Advanced Diagnostic and Prognostic Technology (ADaPT) uses acoustic emission (AE) technology and a stochastic prognostic model to provide real-time monitoring and prediction of mechanical defects or cracks. The forecast can help the plant authority deal with cracked equipment before it ruptures and causes an unscheduled shutdown of the facility. ADaPT employs process historical data trending, finite element analysis, fitness-for-service assessment, and probabilistic statistical analysis to develop a prediction model for crack initiation and growth due to mechanical damage. The prediction model is combined with live equipment operating data for real-time prediction of the remaining life span before fracture. ADaPT was first deployed on a hot combined feed exchanger (HCFE) that had suffered creep crack damage. The tool predicted crack initiation at the top weldment area by April 2019; during the shutdown window in April 2019, a crack was indeed discovered and repaired. In the meantime, ADaPT had allowed the plant owner to run at full capacity and improve output by up to 7%. ADaPT was also used on a coke drum with extensive fatigue cracking. The initial cracks were declared safe with ADaPT, and the remaining crack lifetimes were extended by another five months, just in time for a planned facility downtime to execute the repair.
The prediction model, when combined with plant information data, allows plant operators to continuously monitor crack propagation caused by mechanical damage, improving maintenance planning and avoiding costly immediate shutdowns for repair.Keywords: mechanical damage, cracks, continuous monitoring tool, remaining life, acoustic emission, prognostic model
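The abstract does not disclose ADaPT's prognostic model. A standard building block for remaining-life estimates of fatigue cracks, however, is Paris-law crack growth, and a generic version of that calculation can be sketched as follows. Everything here is illustrative: the function name, the material constants, and the numerical integration scheme are assumptions, none of them taken from the paper.

```python
import math

def paris_life_cycles(a0, ac, C, m, delta_sigma, Y=1.0, steps=20000):
    """Numerically integrate the Paris law da/dN = C * (dK)^m from an
    initial crack depth a0 to a critical depth ac, where the
    stress-intensity range is dK = Y * delta_sigma * sqrt(pi * a).
    Returns the estimated number of load cycles to reach ac."""
    a, cycles = a0, 0.0
    da = (ac - a0) / steps
    for _ in range(steps):
        dK = Y * delta_sigma * math.sqrt(math.pi * a)
        cycles += da / (C * dK ** m)  # dN = da / (C * dK^m)
        a += da
    return cycles
```

Coupling such an integration to live stress data (instead of an assumed constant stress range) is what turns a one-off fitness-for-service estimate into a continuously updated remaining-life figure.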
Procedia PDF Downloads 76
17095 Simultaneous Determination of Cefazolin and Cefotaxime in Urine by HPLC
Authors: Rafika Bibi, Khaled Khaladi, Hind Mokran, Mohamed Salah Boukhechem
Abstract:
A high-performance liquid chromatographic method with ultraviolet detection at 264 nm was developed and validated for the quantitative determination and separation of cefazolin and cefotaxime in urine. The mobile phase consisted of acetonitrile and phosphate buffer pH 4.2 (15:85, v/v), pumped through an ODB 250 × 4.6 mm, 5 µm column at a flow rate of 1 mL/min with a 20 µL injection loop. Under these conditions, validation showed the method to be linear over the range 0.01 to 10 µg/mL with a good correlation coefficient (R > 0.9997); the retention times of cefotaxime and cefazolin were 9.0 and 10.1 min, respectively. The statistical evaluation of the method, examined by means of within-day (n = 6) and day-to-day (n = 5) studies, was found to be satisfactory, with high accuracy and precision.Keywords: cefazolin, cefotaxime, HPLC, bioscience, biochemistry, pharmaceutical
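The linearity claim behind a validated range such as 0.01-10 µg/mL typically rests on a least-squares calibration line of peak area against standard concentration. The sketch below illustrates that generic computation only; the synthetic slope and intercept are invented, and the function names are not from the study.

```python
import numpy as np

def calibration_curve(conc, area):
    """Fit a least-squares calibration line (peak area vs concentration)
    and report its correlation coefficient, as used to check linearity
    over a validated range."""
    slope, intercept = np.polyfit(conc, area, 1)
    r = np.corrcoef(conc, area)[0, 1]
    return slope, intercept, r

def quantify(area, slope, intercept):
    """Back-calculate the concentration of an unknown from its peak area."""
    return (area - intercept) / slope

# invented calibration standards spanning the validated range
conc = np.array([0.01, 0.1, 1.0, 5.0, 10.0])   # µg/mL
area = 50.0 * conc + 2.0                        # synthetic detector response
slope, intercept, r = calibration_curve(conc, area)
```

An unknown urine sample's concentration is then read off the fitted line via `quantify`, which is where the quoted R > 0.9997 matters: a poor correlation would make the back-calculation unreliable.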
Procedia PDF Downloads 363
17094 Verification of the Necessity of Maintenance Anesthesia with Isoflurane after Induction with Tiletamine-Zolazepam in Dogs Using the Dixon's up-and-down Method
Authors: Sonia Lachowska, Agnieszka Antonczyk, Joanna Tunikowska, Pawel Kucharski, Bartlomiej Liszka
Abstract:
Isoflurane is one of the most commonly used anaesthetic gases in veterinary medicine. Due to its numerous side effects, intravenous anaesthesia is increasingly used instead. The combination of tiletamine with zolazepam has proved to be a safe and pharmacologically beneficial one: its main advantages are an analgesic effect, fast induction, effective myorelaxation, and smooth recovery. In the following study, the authors verified whether isoflurane is necessary to maintain anaesthesia in dogs after induction with tiletamine-zolazepam. Twelve dogs were selected with the inclusion criterion of ASA (American Society of Anesthesiologists) status I or II. Each dog received intramuscular premedication with medetomidine-butorphanol (10 µg/kg and 0.1 mg/kg, respectively). Fifteen minutes after premedication, preoxygenation lasting 5 minutes was started. Anaesthesia was induced with tiletamine-zolazepam at a dose of 5 mg/kg; the dogs were then intubated and anaesthesia was maintained with isoflurane. Initially, the MAC (minimum alveolar concentration) was set to 0.7 vol.%. After a 15-minute equilibration, the MAC was determined using Dixon's up-and-down method. Painful stimulation included compression of the paw pad, phalanx, and groin area, and clamping a Backhaus forceps on the skin. Haemodynamic and ventilation parameters were measured and recorded at 2-minute intervals. In this method, the positive or negative response to the noxious stimulus is assessed, only once in each patient, and then used to determine the isoflurane concentration for the next patient. The results show that isoflurane is not necessary to maintain anaesthesia after tiletamine-zolazepam induction. This is clinically important because the side effects resulting from the use of isoflurane are eliminated.Keywords: anaesthesia, dog, isoflurane, Dixon's up-and-down method, tiletamine, zolazepam
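Dixon's up-and-down method has a compact algorithmic core that can be sketched in code: each subject's single response sets the concentration tested on the next subject. The snippet below is a generic illustration of the sequencing plus a simple crossover-midpoint estimator, not the study's analysis; the starting concentration and step size are invented.

```python
def up_and_down(responses, start, step):
    """Dixon's up-and-down sequence.  Each animal is tested once, at a
    concentration set by the previous animal's response: a positive
    response to the noxious stimulus (inadequate anaesthesia) raises the
    next concentration by `step`; a negative response lowers it.
    `responses` is a list of booleans (True = positive response).
    Returns the tested concentrations and the mean of the crossover
    midpoints, a simple estimator of the 50% effective concentration."""
    concs = [start]
    for positive in responses[:-1]:
        concs.append(concs[-1] + step if positive else concs[-1] - step)
    midpoints = [0.5 * (concs[i] + concs[i + 1])
                 for i in range(len(responses) - 1)
                 if responses[i] != responses[i + 1]]  # response reversals
    estimate = sum(midpoints) / len(midpoints) if midpoints else None
    return concs, estimate

# hypothetical run starting at 0.7 vol.% with 0.1 vol.% steps
concs, ec50 = up_and_down([True, False, True, False], 0.7, 0.1)
```

The single-response-per-patient design is what makes the method efficient: the sequence itself concentrates testing around the threshold concentration.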
Procedia PDF Downloads 183
17093 Sol-Gel Derived ZnO Nanostructures: Optical Properties
Authors: Sheo K. Mishra, Rajneesh K. Srivastava, R. K. Shukla
Abstract:
In the present work, we report on the optical properties, including UV-vis absorption and photoluminescence (PL), of ZnO nanostructures synthesized by the sol-gel method. Structural and morphological investigations have been performed by the X-ray diffraction method (XRD) and scanning electron microscopy (SEM). The XRD result confirms the formation of the hexagonal wurtzite phase of the ZnO nanostructures, and the presence of various diffraction peaks suggests a polycrystalline nature. The XRD pattern exhibits no additional peaks due to by-products such as Zn(OH)2. The average crystallite size of the prepared ZnO sample, corresponding to the maximum-intensity peak, is ~38.22 nm. The SEM micrograph shows different nanostructures of pure ZnO. The photoluminescence (PL) spectrum shows several emission peaks, at around 353 nm, 382 nm, 419 nm, 441 nm, 483 nm, and 522 nm. The obtained results suggest that the prepared phosphors are quite suitable for optoelectronic applications.Keywords: ZnO, sol-gel, XRD, PL
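Average crystallite sizes like the ~38.22 nm quoted above are commonly extracted from XRD peak broadening via the Scherrer equation. The abstract does not state how the value was obtained, so the snippet below is only a generic sketch; the Cu-Kα wavelength and the example peak width and angle are assumed, not taken from the paper.

```python
import math

def scherrer_size(wavelength_nm, fwhm_deg, theta_deg, K=0.9):
    """Scherrer estimate of crystallite size D = K * lambda / (beta * cos(theta)),
    with beta the peak full width at half maximum converted to radians
    and theta the Bragg angle (half the 2-theta peak position)."""
    beta = math.radians(fwhm_deg)
    return K * wavelength_nm / (beta * math.cos(math.radians(theta_deg)))

# assumed example: Cu K-alpha radiation, an invented wurtzite ZnO peak
# near 2-theta = 36.25 deg with 0.22 deg FWHM
size_nm = scherrer_size(0.15406, 0.22, 36.25 / 2)
```

For parameters in this vicinity the formula yields a size of a few tens of nanometres, the same order as the value reported in the abstract.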
Procedia PDF Downloads 400
17092 Prediction and Analysis of Human Transmembrane Transporter Proteins Based on SCM
Authors: Hui-Ling Huang, Tamara Vasylenko, Phasit Charoenkwan, Shih-Hsiang Chiu, Shinn-Ying Ho
Abstract:
Knowledge of human transporters is still limited due to the technically demanding crystallization procedures required for the structural characterization of transporters by spectroscopic methods. It is therefore desirable to develop bioinformatics tools for the effective analysis of available sequences in order to identify human transmembrane transporter proteins (HMTPs). This study proposes a scoring card method (SCM) for predicting HMTPs. We estimated a set of propensity scores of dipeptides to be HMTPs using SCM from the training dataset (HTS732), consisting of 366 HMTPs and 366 non-HMTPs. SCM, using the estimated propensity scores of the 20 amino acids and 400 dipeptides, has a training accuracy of 87.63% and a test accuracy of 66.46%. The five top-ranked dipeptides are LD, NV, LI, KY, and MN, with scores of 996, 992, 989, 987, and 985, respectively. The five amino acids with the highest propensity scores are Ile, Phe, Met, Gly, and Leu; hydrophobic residues are thus mostly highly scored. Furthermore, the obtained propensity scores were used to analyze the physicochemical properties of human transporters.Keywords: dipeptide composition, physicochemical property, human transmembrane transporter proteins, human transmembrane transporters binding propensity, scoring card method
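The SCM prediction step amounts to weighting a sequence's dipeptide composition by the learned propensity scores. A minimal sketch of that scoring, assuming a dictionary of propensity scores and a decision threshold tuned on the training set (the function names and the toy sequence are invented; only the example scores for LD and NV come from the abstract):

```python
def dipeptide_composition(seq):
    """Fraction of each overlapping dipeptide in a protein sequence."""
    total = len(seq) - 1
    counts = {}
    for i in range(total):
        dp = seq[i:i + 2]
        counts[dp] = counts.get(dp, 0) + 1
    return {dp: n / total for dp, n in counts.items()}

def scm_score(seq, propensity):
    """Scoring card method: the sequence's score is its dipeptide
    composition weighted by the dipeptide propensity scores.  A sequence
    is predicted to be an HMTP if the score exceeds a threshold tuned on
    the training set (threshold selection not shown here)."""
    return sum(frac * propensity.get(dp, 0)
               for dp, frac in dipeptide_composition(seq).items())

# toy scorecard using the two top-ranked dipeptides from the abstract
scores = {"LD": 996, "NV": 992}
```

The appeal of SCM over black-box classifiers is exactly this transparency: the propensity scores themselves are interpretable, which is how the authors link high-scoring hydrophobic residues to transporter membership.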
Procedia PDF Downloads 369
17091 Frequency Identification of Wiener-Hammerstein Systems
Authors: Brouri Adil, Giri Fouad
Abstract:
The problem of identifying Wiener-Hammerstein systems is addressed in the case where the structures of the two linear subsystems are totally unknown. Moreover, the nonlinear element is allowed to be noninvertible. The identification problem is dealt with by developing a two-stage frequency identification method: a set of points of the nonlinearity is estimated first, and the frequency gains of the two linear subsystems are then determined at a number of frequencies. The method involves Fourier series decomposition and only requires periodic excitation signals. All the estimators involved are shown to be consistent.Keywords: Wiener-Hammerstein systems, Fourier series expansions, frequency identification, automation science
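At the frequency-gain stage, the complex gain of a linear subsystem at the excitation frequency can be read off from the first Fourier coefficients of one (or more) full periods of input and output. The sketch below shows only that generic projection, not the authors' estimator; it assumes noise-free sinusoidal steady state and an integer number of periods in the record.

```python
import numpy as np

def frequency_gain(u, y, freq, fs):
    """Estimate the complex gain G(j*2*pi*freq) of a linear system from
    an integer number of periods of input u and output y sampled at rate
    fs, by projecting both signals onto exp(-j*2*pi*freq*t) (i.e. taking
    their first Fourier coefficients at the excitation frequency)."""
    t = np.arange(len(u)) / fs
    e = np.exp(-2j * np.pi * freq * t)
    return np.sum(y * e) / np.sum(u * e)
```

For a system with gain A and phase shift phi at the test frequency, the returned complex number has magnitude A and angle phi; sweeping the excitation frequency then traces out the subsystem's frequency response point by point.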
Procedia PDF Downloads 536
17090 Non-Linear Regression Modeling for Composite Distributions
Authors: Mostafa Aminzadeh, Min Deng
Abstract:
Modeling loss data is an important part of actuarial science. Actuaries use models to predict future losses and manage financial risk, which can also be beneficial for marketing purposes. In the insurance industry, small claims happen frequently while large claims are rare. Traditional distributions such as the Normal, Exponential, and inverse-Gaussian are not suitable for describing insurance data, which often show skewness and fat tails. Several authors have studied classical and Bayesian inference for the parameters of composite distributions, such as Exponential-Pareto, Weibull-Pareto, and Inverse Gamma-Pareto. These models separate small-to-moderate losses from large losses using a threshold parameter. This research introduces a computational approach using a nonlinear regression model for loss data that relies on multiple predictors. Simulation studies were conducted to assess the accuracy of the proposed estimation method and confirmed that it provides precise estimates of the regression parameters. It is important to note that this approach can be applied to a dataset whenever goodness-of-fit tests confirm that the composite distribution under study fits the data well. To demonstrate the computations, a real data set from the insurance industry is analyzed. A Mathematica code uses Fisher scoring as the iteration method to obtain the maximum likelihood estimates (MLE) of the regression parameters.Keywords: maximum likelihood estimation, fisher scoring method, non-linear regression models, composite distributions
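A composite distribution of the kind discussed above glues a light-tailed body to a Pareto tail at a threshold. As a hedged illustration, the Python sketch below builds an Exponential-Pareto composite density in which continuity at the threshold fixes the Pareto index; this is one standard construction of such models, not necessarily the exact parametrization used in the paper, and the function name and example parameters are invented.

```python
import math

def exp_pareto_composite(lam, theta):
    """Exponential-Pareto composite density: exponential body on
    (0, theta], Pareto tail on (theta, inf).  Requiring continuity at
    the threshold theta forces alpha = lam * theta * exp(-lam * theta),
    and the constant c renormalizes the glued pieces so the density
    integrates to 1.  Returns (pdf, alpha)."""
    alpha = lam * theta * math.exp(-lam * theta)   # continuity at theta
    c = 1.0 / (2.0 - math.exp(-lam * theta))       # 1 / (F_exp(theta) + 1)

    def pdf(x):
        if x <= 0.0:
            return 0.0
        if x <= theta:
            return c * lam * math.exp(-lam * x)            # exponential body
        return c * alpha * theta ** alpha / x ** (alpha + 1)  # Pareto tail

    return pdf, alpha
```

In a regression setting of the kind the abstract describes, parameters such as lam or theta would themselves be functions of the predictors, and Fisher scoring would iterate on the regression coefficients rather than on lam and theta directly.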
Procedia PDF Downloads 33