Search results for: modified Navier method
18281 Reliability Qualification Test Plan Derivation Method for Weibull Distributed Products
Authors: Ping Jiang, Yunyan Xing, Dian Zhang, Bo Guo
Abstract:
The reliability qualification test (RQT) is widely used in product development to qualify whether a product meets predetermined reliability requirements, which are mainly described in terms of reliability indices, for example, MTBF (Mean Time Between Failures). In engineering practice, RQT plans must follow standards such as MIL-STD-781 or GJB899A-2009. However, these conventional RQT plans are not preferred, as they often require long test times or carry high risks for both producer and consumer, because the methods in the standards only use the test data of the product itself. Moreover, the standards usually assume that the product lifetime is exponentially distributed, which is not suitable for complex products other than electronics. It is therefore desirable to develop an RQT plan derivation method that safely shortens test time while keeping the two risks under control. To this end, an RQT plan derivation method is developed for products whose lifetime follows a Weibull distribution. The merit of the method is that expert judgment is taken into account. This is implemented by applying the Bayesian method, which translates the expert judgment into prior information on product reliability. The producer's and consumer's risks are then calculated accordingly. The procedure for deriving RQT plans is also proposed in this paper. As extra information and expert judgment are added to the derivation, the derived test plans have the potential to shorten the required test time and to have satisfactorily low risks for both producer and consumer compared with conventional test plans. A case study shows that, when expert judgment is used in deriving product test plans, the proposed method is capable of finding ideal test plans that both reduce the two risks and shorten the required test time.
Keywords: expert judgment, reliability qualification test, test plan derivation, producer's risk, consumer's risk
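As a rough illustration of how such a plan can be evaluated, the Python sketch below computes producer's and consumer's risks for a simple Weibull test plan (n units each tested for time T, accept if at most c failures occur) and averages the acceptance probability over a lognormal prior on the Weibull scale parameter standing in for expert judgment. The plan form, shape parameter, requirement levels, and prior are illustrative assumptions, not the paper's actual Bayesian formulation.

```python
import numpy as np
from scipy.stats import binom

rng = np.random.default_rng(0)

def acceptance_prob(eta, beta, n, T, c):
    """P(accept) for a plan: n units tested for time T, accept if <= c failures."""
    p_fail = 1.0 - np.exp(-(T / eta) ** beta)   # Weibull failure probability per unit
    return binom.cdf(c, n, p_fail)

# Illustrative plan and requirement levels (assumed, not from the paper)
beta = 2.0                # assumed known Weibull shape parameter
n, T, c = 10, 500.0, 1    # units on test, test hours per unit, acceptance number
eta_acceptable = 2000.0   # scale parameter at the acceptable reliability level
eta_rejectable = 1000.0   # scale parameter at the rejectable reliability level

# Classical risks (no prior information)
producer_risk = 1.0 - acceptance_prob(eta_acceptable, beta, n, T, c)
consumer_risk = acceptance_prob(eta_rejectable, beta, n, T, c)

# Expert judgment expressed as a lognormal prior on eta; average acceptance over the prior
eta_prior = rng.lognormal(mean=np.log(1800.0), sigma=0.2, size=20000)
prior_weighted_accept = acceptance_prob(eta_prior, beta, n, T, c).mean()

print(f"producer risk {producer_risk:.3f}, consumer risk {consumer_risk:.3f}")
print(f"prior-weighted acceptance probability {prior_weighted_accept:.3f}")
```

Scanning candidate (n, T, c) combinations with such a routine is one way to search for plans whose two risks stay below target values while the total test time n*T shrinks.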
Procedia PDF Downloads 137
18280 Stress and Strain Analysis of Notched Bodies Subject to Non-Proportional Loadings
Authors: Ayhan Ince
Abstract:
In this paper, an analytical simplified method for calculating elasto-plastic stresses and strains of notched bodies subject to non-proportional loading paths is discussed. The method is based on the Neuber notch correction, which relates the incremental elastic and elastic-plastic strain energy densities at the notch root, and on the material constitutive relationship. The validity of the method is demonstrated by comparing computed results of the proposed model against finite element numerical data for a notched shaft. The comparison showed that the model estimated notch-root elasto-plastic stresses and strains with good accuracy using linear-elastic stresses. The proposed model provides a more efficient and simpler analysis method, preferable to expensive experimental component tests and to more complex and time-consuming incremental non-linear FE analysis. The model is particularly suitable for fatigue life and fatigue damage estimates of notched components subjected to non-proportional loading paths.
Keywords: elasto-plastic, stress-strain, notch analysis, non-proportional loadings, cyclic plasticity, fatigue
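The paper's contribution is the incremental, strain-energy-density form of the correction for non-proportional loading; as background, the sketch below solves only the classic monotonic uniaxial Neuber rule with a Ramberg-Osgood stress-strain curve. The material constants, stress concentration factor, and nominal stress are assumed values for illustration.

```python
from scipy.optimize import brentq

# Illustrative material and loading constants (assumed, not from the paper)
E = 200000.0       # MPa, Young's modulus
K_prime = 1000.0   # MPa, cyclic strength coefficient
n_prime = 0.15     # cyclic strain-hardening exponent
Kt = 2.5           # elastic stress concentration factor
S = 150.0          # MPa, nominal (net-section) stress

def ramberg_osgood_strain(sigma):
    """Total strain from the Ramberg-Osgood relation."""
    return sigma / E + (sigma / K_prime) ** (1.0 / n_prime)

def neuber_residual(sigma):
    # Neuber's rule: sigma * eps = (Kt * S)^2 / E
    return sigma * ramberg_osgood_strain(sigma) - (Kt * S) ** 2 / E

sigma_notch = brentq(neuber_residual, 1e-6, Kt * S)   # root between 0 and the elastic estimate
eps_notch = ramberg_osgood_strain(sigma_notch)
print(f"notch stress {sigma_notch:.1f} MPa, notch strain {eps_notch:.5f}")
```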
Procedia PDF Downloads 466
18279 A Theoretical Study of Accelerating Neutrons in LINAC Using Magnetic Gradient Method
Authors: Chunduru Amareswara Prasad
Abstract:
The main aim of this proposal is to reveal the secrets of the universe by accelerating neutrons. The proposal, in its abridged version, speaks about the possibility of making neutrons accelerate with the help of thermal energy and magnetic energy under controlled conditions, which could help reveal hidden secrets of the universe, namely dark energy, and help determine the properties of the Higgs boson. The paper mainly deals with accelerating neutrons to near the velocity of light in a LINAC, using magnetic energy supplied by magnetic pressurizers. A center-of-mass energy of 94 GeV (~0.5c) for two colliding neutron beams can be achieved using this method. The conventional ways of accelerating neutrons have some constraints, as accelerating them electromagnetically requires them to be separated from tritium or deuterium nuclei. This magnetic gradient method provides an efficient and simple way to accelerate neutrons.
Keywords: neutron, acceleration, thermal energy, magnetic energy, Higgs boson
Procedia PDF Downloads 326
18278 A TFETI Domain Decomposition Solver for von Mises Elastoplasticity Model with Combination of Linear Isotropic-Kinematic Hardening
Authors: Martin Cermak, Stanislav Sysala
Abstract:
In this paper, we present an efficient parallel implementation for elastoplastic problems based on the TFETI (Total Finite Element Tearing and Interconnecting) domain decomposition method. This approach allows us to solve the nonlinear problem in parallel on supercomputers, decreasing the solution time and enabling problems with millions of DOFs. We consider an associated elastoplastic model with the von Mises plastic criterion and a combined linear isotropic-kinematic hardening law. This model is discretized by the implicit Euler method in time and by the finite element method in space. The resulting system of nonlinear equations has a strongly semismooth and strongly monotone operator. The semismooth Newton method is applied to solve this nonlinear system. The corresponding linearized problems arising in the Newton iterations are solved in parallel by the above-mentioned TFETI method. The implementation is realized in our in-house MatSol package developed in MATLAB.
Keywords: isotropic-kinematic hardening, TFETI, domain decomposition, parallel solution
Procedia PDF Downloads 420
18277 Dynamic Wind Effects in Tall Buildings: A Comparative Study of Synthetic Wind and Brazilian Wind Standard
Authors: Byl Farney Cunha Junior
Abstract:
In this work, a dynamic three-dimensional analysis of a 47-story building located in Goiania is carried out for wind loads generated using both the Brazilian wind code NBR6123 (ABNT, 1988) and the Synthetic-Wind method. To model the frames, three different methodologies are used: the shear building model and both two- and three-dimensional finite element models. To start the analysis, a plane frame is studied to validate the shear building model and, in order to compare natural frequencies and displacements at the top of the structure, the same plane frame is modeled using the finite element method in the SAP2000 V10 software. The same steps are applied to an idealized 20-story spatial frame, which helps present the stiffness correction process applied to the columns. Based on these models, the two methods used to generate the wind loads are presented: the discrete model proposed in the Brazilian wind code NBR6123 (ABNT, 1988) and the Synthetic-Wind method. The Synthetic-Wind method uses the Davenport spectrum, which is divided into a range of frequencies to generate the load time series. Finally, the 47-story building is analyzed using both the three-dimensional finite element method in SAP2000 V10 and the shear building model. The models are loaded with wind loads generated by the wind code NBR6123 (ABNT, 1988) and by the Synthetic-Wind method, considering different wind directions. The displacements and internal forces in columns and beams are compared, and a comparative study considering a full elevated reservoir is carried out. As can be observed, the displacements obtained with the SAP2000 V10 model are greater when loaded with the NBR6123 (ABNT, 1988) wind load associated with the permanent phase of the structure's response.
Keywords: finite element method, synthetic wind, tall buildings, shear building
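To make the Synthetic-Wind idea concrete, the sketch below generates a fluctuating wind-speed record by dividing the Davenport spectrum into frequency bands and superposing harmonics with random phases. The mean speed, drag coefficient, band limits, and record length are assumptions for illustration; the paper's implementation of how these velocities become member loads may differ.

```python
import numpy as np

rng = np.random.default_rng(1)

def davenport_spectrum(f, V10=40.0, kappa=0.0025):
    """Davenport along-wind velocity spectrum S_u(f) [m^2/s].
    V10: mean wind speed at 10 m [m/s]; kappa: surface drag coefficient (assumed)."""
    x = 1200.0 * f / V10
    return 4.0 * kappa * V10**2 * x**2 / (f * (1.0 + x**2) ** (4.0 / 3.0))

# Divide the spectrum into N frequency bands and superpose cosines with random phases
N, f_min, f_max = 200, 0.005, 2.0            # assumed band limits [Hz]
f = np.linspace(f_min, f_max, N)
df = f[1] - f[0]
phases = rng.uniform(0.0, 2.0 * np.pi, N)    # random phase angles
amps = np.sqrt(2.0 * davenport_spectrum(f) * df)

t = np.arange(0.0, 600.0, 0.1)               # 10-minute record, dt = 0.1 s
u_fluct = (amps[:, None] * np.cos(2.0 * np.pi * f[:, None] * t + phases[:, None])).sum(axis=0)
print(f"std of simulated wind-speed fluctuation: {u_fluct.std():.2f} m/s")
```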
Procedia PDF Downloads 273
18276 Improving Forecasting Demand for Maintenance Spare Parts: Case Study
Authors: Abdulaziz Afandi
Abstract:
Minimizing inventory cost, optimizing inventory quantities, and increasing system operational availability are the main motivations for enhancing the forecasting of spare parts demand in a major power utility company in Medina. This paper reports on an effort to optimize the order quantities of spare parts by improving the demand forecasting method. The study focuses on equipment that has frequent spare parts purchase orders with uncertain demand. The demand follows a lumpy pattern, which makes conventional forecasting methods less effective. A comparison was made by benchmarking various forecasting methods based on experts' criteria to select the most suitable method for the case study. Three actual data sets were used to make the forecasts in this case study. Two neural network (NN) approaches were utilized and compared, namely long short-term memory (LSTM) and multilayer perceptron (MLP). The results, as expected, showed that the NN models gave better results than the traditional (judgmental) forecasting method. In addition, the LSTM model had higher predictive accuracy than the MLP model.
Keywords: neural network, LSTM, MLP, forecasting demand, inventory management
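A minimal sketch of the LSTM/MLP comparison is given below, using a sliding window of past demand to predict the next value. The synthetic lumpy demand, network sizes, training settings, and in-sample evaluation are all assumptions for brevity; they do not reproduce the case-study data or tuning.

```python
import numpy as np
import tensorflow as tf

def make_windows(series, lag=6):
    """Turn a 1-D demand history into (X, y) supervised pairs."""
    X = np.array([series[i:i + lag] for i in range(len(series) - lag)])
    return X, series[lag:]

# Hypothetical lumpy monthly demand: mostly zeros with occasional spikes
rng = np.random.default_rng(2)
demand = rng.poisson(0.4, 120) * rng.integers(1, 8, 120).astype(float)
X, y = make_windows(demand, lag=6)

lstm = tf.keras.Sequential([
    tf.keras.layers.LSTM(16, input_shape=(6, 1)),
    tf.keras.layers.Dense(1),
])
mlp = tf.keras.Sequential([
    tf.keras.layers.Dense(16, activation="relu", input_shape=(6,)),
    tf.keras.layers.Dense(1),
])

lstm.compile(optimizer="adam", loss="mae")
mlp.compile(optimizer="adam", loss="mae")
lstm.fit(X[..., None], y, epochs=50, verbose=0)   # LSTM expects (samples, timesteps, features)
mlp.fit(X, y, epochs=50, verbose=0)
print("LSTM MAE:", float(lstm.evaluate(X[..., None], y, verbose=0)))
print("MLP  MAE:", float(mlp.evaluate(X, y, verbose=0)))
```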
Procedia PDF Downloads 127
18275 Circuit Models for Conducted Susceptibility Analyses of Multiconductor Shielded Cables
Authors: Saih Mohamed, Rouijaa Hicham, Ghammaz Abdelilah
Abstract:
This paper presents circuit models for analyzing the conducted susceptibility of multiconductor shielded cables in the frequency domain using Branin's method, which is referred to as the method of characteristics. These models, which can be used directly in the time and frequency domains, take into account the presence of both the transfer impedance and the transfer admittance. The conducted susceptibility is studied by using an injection current on the cable shield as the source. Two examples are studied: a coaxial shielded cable and shielded cables with two parallel wires (i.e., twinax cables), whose shield has an asymmetry (one slot on the side). Results obtained with these models are in good agreement with those obtained by other methods.
Keywords: circuit models, multiconductor shielded cables, Branin's method, coaxial shielded cable, twinax cables
Procedia PDF Downloads 516
18274 Electronic and Optical Properties of Li₂S Antifluorite Material
Authors: Brahim Bahloul, Khatir Babesse, Azzedine Dkhira, Yacine Bahloul, Dalila Hammoutene
Abstract:
In this paper, we investigate some structural and optoelectronic properties of the Li₂S compound with ab initio calculations. The structural and electronic properties of the Li₂S antifluorite structure have been studied by first-principles calculations within density functional theory (DFT), whereas the optical properties have been obtained using empirical relationships such as the modified Moss relation. Our calculated lattice parameters are in good agreement with the experimental data and other theoretical calculations. The electronic band structures and density of states were obtained. The antifluorite Li₂S presents an indirect band gap of 3.388 eV at equilibrium. The top of the valence bands reflects the p electronic character for both structures. The calculated energy gaps and optical constants are in good agreement with experimental measurements.
Keywords: ab initio calculations, antifluorite, electronic properties, optical properties
Procedia PDF Downloads 290
18273 Uplift Modeling Approach to Optimizing Content Quality in Social Q/A Platforms
Authors: Igor A. Podgorny
Abstract:
TurboTax AnswerXchange is a social Q/A system supporting users working on federal and state tax returns. Content quality and popularity in the AnswerXchange can be predicted with propensity models using attributes of the question and answer. Using uplift modeling, we identify features of questions and answers that can be modified during the question-asking and question-answering experience in order to optimize the AnswerXchange content quality. We demonstrate that adding details to the questions always results in increased question popularity that can be used to promote good-quality content. Responding to close-ended questions assertively improves content quality in the AnswerXchange in 90% of cases. Answering knowledge questions with web links increases the likelihood of receiving a negative vote from 60% of the askers. Our findings provide a rationale for employing the uplift modeling approach for AnswerXchange operations.
Keywords: customer relationship management, human-machine interaction, text mining, uplift modeling
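The AnswerXchange models themselves are not public, so the sketch below shows only a generic two-model (T-learner) uplift estimate on synthetic data: fit separate outcome models for treated and control items, then take the difference of their predicted probabilities as the estimated effect of applying a content feature. All data, feature names, and model choices are assumptions.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier

rng = np.random.default_rng(3)

# Hypothetical data: X = question/answer attributes, t = 1 if the treatment
# (e.g. "details were added") was applied, y = 1 if the content was upvoted.
n = 5000
X = rng.normal(size=(n, 5))
t = rng.integers(0, 2, n)
base = 1 / (1 + np.exp(-X[:, 0]))
y = (rng.random(n) < np.clip(base + 0.1 * t * (X[:, 1] > 0), 0, 1)).astype(int)

# T-learner: separate outcome models for treated and control, uplift = difference
m_treat = GradientBoostingClassifier().fit(X[t == 1], y[t == 1])
m_ctrl = GradientBoostingClassifier().fit(X[t == 0], y[t == 0])
uplift = m_treat.predict_proba(X)[:, 1] - m_ctrl.predict_proba(X)[:, 1]
print("mean estimated uplift:", round(float(uplift.mean()), 3))
```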
Procedia PDF Downloads 244
18272 The Mechanical Properties of a Small-Size Seismic Isolation Rubber Bearing for Bridges
Authors: Yi F. Wu, Ai Q. Li, Hao Wang
Abstract:
Taking a novel type of bridge bearing with a diameter of 100 mm as an example, theoretical analysis, experimental research, and numerical simulation of the bearing were conducted. Since normal compression-shear machines cannot be applied to the small-size bearing, an improved device for testing the properties of the bearing was proposed and fabricated. In addition, the simulation of the bearing was conducted with the explicit finite element software ANSYS/LS-DYNA, and some parameters of the bearing were modified in the finite element model to effectively reduce the computation cost. Results show that all the research methods are capable of revealing the fundamental properties of the small-size bearings, and a combined use of these methods can better capture both the overall properties and the detailed internal mechanical behavior of the bearing.
Keywords: ANSYS/LS-DYNA, compression shear, contact analysis, explicit algorithm, small-size
Procedia PDF Downloads 181
18271 Pandemic-Related Disruption to the Home Environment and Early Vocabulary Acquisition
Authors: Matthew McArthur, Margaret Friend
Abstract:
The COVID-19 pandemic disrupted the stability of the home environment for families across the world. Potential disruptions include parent work modality (in-person vs. remote), levels of health anxiety, family routines, and caregiving. These disruptions may have interfered with the processes of early vocabulary acquisition, carrying lasting effects over the life course. Our justification for this research is as follows: First, early, stable, caregiver-child reciprocal interactions, which may have been disrupted during the pandemic, contribute to the development of the brain architecture that supports language, cognitive, and social-emotional development. Second, early vocabulary predicts several cognitive outcomes, such as numeracy, literacy, and executive function. Further, disruption in the home is associated with adverse cognitive, academic, socio-emotional, behavioral, and communication outcomes in young children. We are interested in how disruptions related to the COVID-19 pandemic are associated with vocabulary acquisition in children born during the first two waves of the pandemic. We are conducting a moderated online experiment to assess this question. Participants are 16 children (10F) ranging in age from 19 to 39 months (M=25.27) and their caregivers. All child participants were screened for language background, health history, and history of language disorders, and were typically developing. Parents completed a modified version of the COVID-19 Family Stressor Scale (CoFaSS), a published measure of COVID-19-related family stressors. Thirteen items from the original scale were replaced to better capture change in family organization and stability specifically related to disruptions in income, anxiety, family relations, and childcare. Following completion of the modified CoFaSS, children completed a web-based version of the Computerized Comprehension Task (CCT) and the Receptive One-Word Picture Vocabulary Test (ROWPVT-4) if 24 months or older, or the MacArthur-Bates Communicative Development Inventory (MCDI) if younger than 24 months. We report our preliminary data as a partial correlation analysis controlling for age. Raw vocabulary scores on the CCT, ROWPVT-4, and MCDI were all negatively associated with pandemic-related disruptions related to anxiety (r12=-.321; r1=-.332; r9=-.509), family relations (r12=-.590*; r1=-.155; r9=-.468), and childcare (r12=-.294; r1=-.468; r9=-.177). Although the small sample size for these preliminary data limits our power to detect significance, this trend is in the predicted direction, suggesting that increased pandemic-related disruption across multiple domains is associated with lower vocabulary scores. We anticipate presenting data on a full sample of 50 monolingual English participants. A sample of 50 participants would provide sufficient statistical power to detect a moderate effect size, adhering to a nominal alpha of 0.05 and ensuring a power level of 0.80.
Keywords: COVID-19, early vocabulary, home environment, language acquisition, multiple measures
Procedia PDF Downloads 62
18270 A Computational Study of the Electron Transport in HgCdTe Bulk Semiconductor
Abstract:
This paper deals with the use of a computational method based on Monte Carlo simulation to investigate the transport phenomena of electrons in the HgCdTe narrow-band-gap semiconductor. With this method we can evaluate the time dependence of the transport parameters: velocity, energy, and mobility of electrons through the material (HgCdTe).
Keywords: Monte Carlo, transport parameters, HgCdTe, computational mechanics
Procedia PDF Downloads 475
18269 Cleaning of Scientific References in Large Patent Databases Using Rule-Based Scoring and Clustering
Authors: Emiel Caron
Abstract:
Patent databases contain patent-related data, organized in a relational data model, and are used to produce various patent statistics. These databases store raw data about scientific references cited by patents. For example, Patstat holds references to tens of millions of scientific journal publications and conference proceedings. These references might be used to connect patent databases with bibliographic databases, e.g., to study the relation between science, technology, and innovation in various domains. Problematic in such studies is the low data quality of the references, i.e., they are often ambiguous, unstructured, and incomplete. Moreover, a complete bibliographic reference is stored in only one attribute. Therefore, a computerized cleaning and disambiguation method for large patent databases is developed in this work. The method uses rule-based scoring and clustering. The rules are based on bibliographic metadata, retrieved from the raw data by regular expressions, and are transparent and adaptable. The rules, in combination with string similarity measures, are used to detect pairs of records that are potential duplicates. Thanks to the scoring, different rules can be combined to join scientific references, i.e., the rules reinforce each other. The scores are based on expert knowledge and an initial evaluation of the method. After the scoring, pairs of scientific references that score above a certain threshold are clustered by means of a single-linkage clustering algorithm to form connected components. The method is designed to disambiguate all the scientific references in the Patstat database. The performance evaluation of the clustering method, on a large golden set with highly cited papers, shows on average a 99% precision and a 95% recall. The method is therefore accurate but careful, i.e., it weighs precision over recall. Consequently, separate clusters of high precision are sometimes formed when there is not enough evidence for connecting scientific references, e.g., in the case of missing year and journal information for a reference. The clusters produced by the method can be used to directly link the Patstat database with bibliographic databases such as the Web of Science or Scopus.
Keywords: clustering, data cleaning, data disambiguation, data mining, patent analysis, scientometrics
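A toy sketch of the scoring-plus-single-linkage idea is shown below: pair scores combine a string-similarity measure with a simple metadata rule (matching publication year), and pairs above a threshold are merged with a union-find structure, which yields the same connected components as single-linkage clustering. The example strings, rule weights, and threshold are illustrative; the Patstat rules in the paper are based on richer metadata extracted by regular expressions.

```python
from difflib import SequenceMatcher
from itertools import combinations

# Hypothetical raw reference strings as they might appear in a patent database
refs = [
    "Smith J, Nature, vol 512, 2014, p. 301-305",
    "J. Smith (2014) Nature 512:301",
    "Doe A., J. Appl. Phys. 99 (2006) 123456",
]

def score(a, b):
    """Rule-based score: string similarity plus a bonus when publication years match."""
    s = 100 * SequenceMatcher(None, a.lower(), b.lower()).ratio()
    years_a = {t for t in a.replace("(", " ").replace(")", " ").split() if t.isdigit() and len(t) == 4}
    years_b = {t for t in b.replace("(", " ").replace(")", " ").split() if t.isdigit() and len(t) == 4}
    if years_a & years_b:
        s += 20           # a matching year reinforces the similarity rule
    return s

# Union-find: merging every pair above the threshold gives the connected
# components that single-linkage clustering would produce.
parent = list(range(len(refs)))
def find(i):
    while parent[i] != i:
        parent[i] = parent[parent[i]]
        i = parent[i]
    return i

THRESHOLD = 70            # assumed score threshold
for i, j in combinations(range(len(refs)), 2):
    if score(refs[i], refs[j]) >= THRESHOLD:
        parent[find(i)] = find(j)

clusters = {}
for i in range(len(refs)):
    clusters.setdefault(find(i), []).append(refs[i])
print(list(clusters.values()))
```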
Procedia PDF Downloads 194
18268 Finite Element Molecular Modeling: A Structural Method for Large Deformations
Authors: A. Rezaei, M. Huisman, W. Van Paepegem
Abstract:
Atomic interactions in molecular systems are mainly studied by particle mechanics. Nevertheless, researchers have also put considerable effort into simulating them using continuum methods. In the early 2000s, simple equivalent finite element models were developed to study the mechanical properties of carbon nanotubes and graphene in composite materials. Afterward, many researchers employed similar structural simulation approaches to obtain mechanical properties of nanostructured materials, to simplify the interface behavior of fiber-reinforced composites, and to simulate defects in carbon nanotubes or graphene sheets, etc. These structural approaches, however, are limited to small deformations due to complicated local rotational coordinates. This article proposes a method for the finite element simulation of molecular mechanics. For ease of reference, the approach is called Structural Finite Element Molecular Modeling (SFEMM). The SFEMM method improves on the available structural approaches for large deformations without using any rotational degrees of freedom. Moreover, the method simulates molecular conformation, which is a big advantage over the previous approaches. Technically, this method uses nonlinear multipoint constraints to simulate the kinematics of the atomic multibody interactions. Only truss elements are employed, and the bond potentials are implemented through constitutive material models. Because the equilibrium bond length, bond angles, and bond-torsion potential energies are intrinsic material parameters, the model is independent of initial strains or stresses. In this paper, the SFEMM method has been implemented in the ABAQUS finite element software. The constraints and material behaviors are modeled through two Fortran subroutines. The method is verified for the bond stretch, bond angle, and bond torsion of carbon atoms. Furthermore, the capability of the method in the conformation simulation of molecular structures is demonstrated via a case study of a graphene sheet. Briefly, SFEMM builds a framework that offers more flexibility than conventional molecular finite element models, supporting structural relaxation modeling and large deformations without incorporating local rotational degrees of freedom. Potentially, the method is a big step towards comprehensive molecular modeling with the finite element technique, thereby concurrently coupling an atomistic domain to a solid continuum domain within a single finite element platform.
Keywords: finite element, large deformation, molecular mechanics, structural method
Procedia PDF Downloads 152
18267 Preparation of Chromium Nanoparticles on Carbon Substrate from Tannery Waste Solution by Chemical Method Compared to Electrokinetic Process
Authors: Mahmoud A. Rabah, Said El Sheikh
Abstract:
This work presents the preparation of chromium nanoparticles from tannery waste solution on glassy carbon by a chemical method, compared to an electrokinetic process. The waste solution contains free and soluble fats, calcium, iron, magnesium, and a high sodium content in addition to the chromium ions. Filtration helps remove insoluble matter. Diethyl ether successfully extracted the soluble fats. The method starts by removing calcium as insoluble oxalate salts under hot conditions in a faintly acidic medium. The filtrate contains iron, magnesium, chromium ions, and sodium chloride in excess. Chromium was separated selectively as an insoluble hydroxide sol-gel at pH 6.5, filtered, and washed with distilled water. Part of the gel was reacted with sulfuric acid to produce a chromium sulfate solution with a concentration of 15-25 g/L. Electrokinetic deposition of chromium nanoparticles on a carbon cathode was carried out using a platinum anode under different galvanostatic conditions. The chemical method involved impregnating the carbon specimens with chromium hydroxide gel followed by reduction using hydrazine hydrate or by thermal reduction using hydrogen gas at 1250°C. The chromium grain size was characterized by TEM, FT-IR, and SEM. Properties of the Cr grains were correlated to the conditions of the preparation process. Electrodeposition was found to produce chromium particles more uniform in size and shape compared to the chemical method.
Keywords: chromium, electrodeposition, nanoparticles, tannery waste solution
Procedia PDF Downloads 409
18266 District Selection for Geotechnical Settlement Suitability Using GIS and Multi Criteria Decision Analysis: A Case Study in Denizli, Turkey
Authors: Erdal Akyol, Mutlu Alkan
Abstract:
Multi-criteria decision analysis (MCDA) covers both data and experience. It is very commonly used to solve problems with many parameters and uncertainties. GIS-supported solutions improve and speed up the decision process. Weighted grading, as an MCDA method, is employed for solving geotechnical problems. In this study, geotechnical parameters, namely soil type, SPT (N) blow count, shear wave velocity (Vs), and depth of the underground water level (DUWL), have been combined in MCDA and GIS. In terms of geotechnical aspects, the settlement suitability of the municipal area was analyzed by the method. The MCDA results were compatible with the geotechnical observations and experience. The method can be employed in geotechnically oriented microzoning studies if the criteria are well evaluated.
Keywords: GIS, spatial analysis, multi criteria decision analysis, geotechnics
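The weighted-grading step can be illustrated with a few lines of Python: normalize each criterion, multiply by expert weights, and sum to get a suitability grade per map cell. The criterion values, weights, and the assumption that higher values are always "better" are illustrative; in the GIS workflow this calculation would be applied to every raster cell of the study area.

```python
import numpy as np

# Hypothetical grid cells with four geotechnical criteria
# columns: SPT N-value, Vs [m/s], DUWL [m], soil-class grade (1 = worst, 5 = best)
cells = np.array([
    [12.0, 180.0,  2.0, 2.0],
    [35.0, 420.0, 12.0, 4.0],
    [22.0, 300.0,  6.0, 3.0],
])
weights = np.array([0.3, 0.3, 0.2, 0.2])   # assumed expert weights, summing to 1

# Min-max normalisation; all criteria are treated as "higher is better" here
norm = (cells - cells.min(axis=0)) / (cells.max(axis=0) - cells.min(axis=0))

suitability = norm @ weights               # weighted grade per cell, 0 (worst) to 1 (best)
for idx, s in enumerate(suitability):
    print(f"cell {idx}: settlement suitability score {s:.2f}")
```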
Procedia PDF Downloads 459
18265 Real-Time Measurement Approach for Tracking the ΔV10 Estimate Value of DC EAF
Authors: Jin-Lung Guan, Jyh-Cherng Gu, Chun-Wei Huang, Hsin-Hung Chang
Abstract:
This investigation develops a revisable method for estimating the value of the equivalent 10 Hz voltage flicker (ΔV10) of a DC electric arc furnace (EAF). This study also examines three 161 kV DC EAFs by field measurement, with the results indicating that the estimated ΔV10 value is significantly smaller than the surveyed value. The key point is that the conventional means of estimating ΔV10 is inappropriate; the main cause is that the assumed Qmax is too small. Although a DC EAF is regularly operated in constant-MVA mode, the reactive power variation in the main transformer (MT) is more significant than that in the furnace transformer (FT). A substantial difference exists between the estimated maximum reactive power fluctuation (ΔQmax) and the surveyed value from actual DC EAF operations. This study therefore proposes a revisable method that can obtain a more accurate ΔV10 estimate than the conventional method.
Keywords: voltage flicker, DC EAF, estimate value, ΔV10
Procedia PDF Downloads 449
18264 Automatic Seizure Detection Using Weighted Permutation Entropy and Support Vector Machine
Authors: Noha Seddik, Sherine Youssef, Mohamed Kholeif
Abstract:
The automated epileptic seizure detection research field has emerged in recent years; it involves analyzing electroencephalogram (EEG) signals instead of relying on the traditional visual inspection performed by expert neurologists. In this study, a support vector machine (SVM) that uses weighted permutation entropy (WPE) as the input feature is proposed for classifying normal and seizure EEG records. WPE is a modified statistical parameter of the permutation entropy (PE) that measures the complexity and irregularity of a time series. It incorporates both the mapped ordinal pattern of the time series and the information contained in the amplitude of its sample points. The proposed system utilizes the fact that entropy-based measures for EEG segments during an epileptic seizure are lower than for normal EEG.
Keywords: electroencephalogram (EEG), epileptic seizure detection, weighted permutation entropy (WPE), support vector machine (SVM)
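A minimal sketch of the WPE feature is shown below, following the standard definition in which each embedded vector's ordinal pattern is weighted by the vector's variance so that amplitude information is retained. The embedding dimension, delay, segment lengths, and the SVM stage from the paper are not reproduced; the parameter choices here are assumptions.

```python
import math
import numpy as np

def weighted_permutation_entropy(x, m=4, tau=1, normalize=True):
    """Weighted permutation entropy of a 1-D signal.
    Ordinal patterns of embedded vectors are weighted by each vector's variance,
    so both ordering and amplitude information contribute (unlike plain PE)."""
    x = np.asarray(x, dtype=float)
    n = len(x) - (m - 1) * tau
    weights = {}
    for j in range(n):
        vec = x[j:j + (m - 1) * tau + 1:tau]
        pattern = tuple(np.argsort(vec))          # ordinal (rank) pattern of the vector
        weights[pattern] = weights.get(pattern, 0.0) + vec.var()
    total = sum(weights.values())
    if total == 0.0:
        return 0.0
    p = np.array(list(weights.values())) / total
    p = p[p > 0]                                   # guard against zero-weight patterns
    h = -np.sum(p * np.log(p))
    return h / math.log(math.factorial(m)) if normalize else h

# Toy check: an irregular signal should score higher than a regular one
rng = np.random.default_rng(0)
print(weighted_permutation_entropy(np.sin(0.1 * np.arange(1000))))
print(weighted_permutation_entropy(rng.normal(size=1000)))
```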
Procedia PDF Downloads 372
18263 Novel Coprocessor for DNA Sequence Alignment in Resequencing Applications
Authors: Atef Ibrahim, Hamed Elsimary, Abdullah Aljumah, Fayez Gebali
Abstract:
This paper presents a novel semi-systolic array architecture for an optimized parallel sequence alignment algorithm. This architecture has the advantage that it can be modified and reused for multiple-pass processing in order to increase the number of processing elements that can be packed into a single FPGA and the number of sequences that can be aligned in parallel in a single FPGA. This resolves a potential problem of the previously published conventional hardware design, in which many FPGA resources are left unused for designs with large short-read lengths. FPGA implementation results show that, for large short-read lengths (M > 128), the proposed design has a slightly higher speed-up and FPGA utilization than the conventional one.
Keywords: bioinformatics, genome sequence alignment, re-sequencing applications, systolic array
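For readers unfamiliar with the underlying kernel, the sketch below shows the classic Smith-Waterman local-alignment recurrence that alignment arrays of this kind typically parallelize, with each anti-diagonal of the dynamic-programming matrix computed concurrently by a row of processing elements. The scoring scheme is an assumption; the paper's optimized parallel algorithm is not reproduced here.

```python
def smith_waterman(read, ref, match=2, mismatch=-1, gap=-2):
    """Plain Smith-Waterman local alignment score (software reference version).
    In systolic hardware, the cells along each anti-diagonal of H are independent
    and can be updated in parallel; this loop shows the per-cell recurrence."""
    rows, cols = len(read) + 1, len(ref) + 1
    H = [[0] * cols for _ in range(rows)]
    best = 0
    for i in range(1, rows):
        for j in range(1, cols):
            diag = H[i - 1][j - 1] + (match if read[i - 1] == ref[j - 1] else mismatch)
            H[i][j] = max(0, diag, H[i - 1][j] + gap, H[i][j - 1] + gap)
            best = max(best, H[i][j])
    return best

print(smith_waterman("ACACACTA", "AGCACACA"))   # small toy example
```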
Procedia PDF Downloads 531
18262 Keyframe Extraction Using Face Quality Assessment and Convolution Neural Network
Authors: Rahma Abed, Sahbi Bahroun, Ezzeddine Zagrouba
Abstract:
Due to the huge amount of data in videos, extracting the relevant frames has become a necessity and an essential step prior to performing face recognition. In this context, we propose a method for extracting keyframes from videos based on face quality and deep learning for a face recognition task. This method has two steps. We start by generating a face quality score for each face image based on three face feature extractors: Gabor, LBP, and HOG. The second step consists of training a deep convolutional neural network in a supervised manner in order to select the frames that have the best face quality. The obtained results show the effectiveness of the proposed method compared to state-of-the-art methods.
Keywords: keyframe extraction, face quality assessment, face in video recognition, convolution neural network
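The paper does not spell out how the three descriptors are combined, so the sketch below is only a naive stand-in: it treats Gabor energy, LBP histogram contrast, and mean HOG magnitude as sharpness/texture proxies and sums them with equal weights. The function name, weights, and filter parameters are assumptions for illustration.

```python
import numpy as np
from skimage import color, filters
from skimage.feature import hog, local_binary_pattern

def face_quality_score(face_rgb):
    """Crude per-face quality score combining Gabor, LBP, and HOG responses.
    Expects an RGB face crop (ideally resized to something like 128x128)."""
    gray = color.rgb2gray(face_rgb)

    gabor_real, _ = filters.gabor(gray, frequency=0.2)   # Gabor filter response
    gabor_energy = np.mean(gabor_real ** 2)

    lbp = local_binary_pattern(gray, P=8, R=1, method="uniform")
    lbp_hist, _ = np.histogram(lbp, bins=10, density=True)
    lbp_contrast = lbp_hist.std()                         # texture variability

    hog_vec = hog(gray, pixels_per_cell=(16, 16), cells_per_block=(2, 2))
    hog_strength = hog_vec.mean()                         # overall gradient strength

    # Equal-weight combination; the weighting is an assumption for illustration
    return float(gabor_energy + lbp_contrast + hog_strength)

# Usage: score = face_quality_score(frame[y0:y1, x0:x1]) for each detected face crop
```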
Procedia PDF Downloads 234
18261 Evaluation of Reliability Indices Using Monte Carlo Simulation Accounting Time to Switch
Authors: Sajjad Asefi, Hossein Afrakhte
Abstract:
This paper presents the evaluation of reliability indices of an electrical distribution system using the Monte Carlo simulation technique, accounting for the Time To Switch (TTS) of each section. The distribution system has been modeled assuming that the randomness of repair times is omitted. For simplicity, we have assumed the reliability analysis to be based on the exponential law. Each segment has a specified failure rate (λ) and repair time (r), which give the mean up time and mean down time of each section in the distribution system. After calculating the modified mean up time (MUT) in years, mean down time (MDT) in hours, and unavailability (U) in h/year, the TTS has been added to the time during which the system is not available, i.e., the MDT. In this paper, we have assumed the TTS to be a random variable with a log-normal distribution.
Keywords: distribution system, Monte Carlo simulation, reliability, repair time, time to switch (TTS)
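A crude sequential sketch of this kind of simulation for a single load point is given below: exponential failures per section produce Poisson outage counts over the simulated period, each outage lasts the fixed repair time plus a log-normally distributed TTS, and frequency and unavailability indices follow. The section data, TTS parameters, and the absence of any network switching logic are assumptions; the paper's system model and index definitions are not reproduced.

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical sections feeding one load point: (failure rate [1/yr], repair time [h])
sections = [(0.10, 4.0), (0.25, 6.0), (0.15, 3.0)]
YEARS = 10_000                            # simulated observation period
TTS_MU, TTS_SIGMA = np.log(0.5), 0.4      # log-normal TTS, median 0.5 h (assumed)

down_hours = 0.0
failures = 0
for lam, r in sections:
    n_fail = rng.poisson(lam * YEARS)     # exponential times-to-failure -> Poisson counts
    tts = rng.lognormal(TTS_MU, TTS_SIGMA, n_fail)
    down_hours += np.sum(r + tts)         # each outage lasts repair time plus TTS
    failures += n_fail

freq = failures / YEARS                   # interruption frequency, 1/yr
u = down_hours / YEARS                    # unavailability, h/yr
print(f"failure frequency {freq:.3f} 1/yr, unavailability {u:.2f} h/yr")
```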
Procedia PDF Downloads 427
18260 Effects of Roughness Elements on Heat Transfer During Natural Convection
Abstract:
The present study focuses on the investigation of the effects of roughness elements on heat transfer during natural convection in a rectangular cavity using a numerical technique. Roughness elements were introduced on the bottom hot wall with a normalized amplitude (A*/H) of 0.1. The thermal and hydrodynamic behavior was studied using a computational method based on the lattice Boltzmann method (LBM). Numerical studies were performed for laminar natural convection in the Rayleigh number (Ra) range from 10³ to 10⁶ for a rectangular cavity of aspect ratio (L/H) 2 with a fluid of Prandtl number (Pr) 1.0. The presence of the sinusoidal roughness elements caused a minimum to maximum decrease in heat transfer of 7% to 17%, respectively, compared to the smooth enclosure. The results are presented for the mean Nusselt number (Nu), isotherms, and streamlines.
Keywords: natural convection, Rayleigh number, surface roughness, Nusselt number, Lattice Boltzmann method
Procedia PDF Downloads 541
18259 Searching for Forensic Evidence in a Compromised Virtual Web Server against SQL Injection Attacks and PHP Web Shell
Authors: Gigih Supriyatno
Abstract:
SQL injection is one of the most common types of attacks and has a very critical impact on web servers. In the worst case, an attacker can perform post-exploitation after a successful SQL injection attack. In web server forensics, server analysis is closely related to log file analysis. But sometimes large file sizes and different log types make it difficult for investigators to look for traces of attackers on the server. The purpose of this paper is to help investigators take appropriate steps when a web server is attacked. We use attack scenarios based on SQL injection attacks, including PHP backdoor injection as post-exploitation. We perform a post-mortem analysis of the web server logs based on the Hypertext Transfer Protocol (HTTP) POST and HTTP GET methods that are characteristic of SQL injection attacks. In addition, we propose a structured analysis method that correlates the web server application log file, the database application log, and other additional logs that exist on the web server. This method gives the investigator a more structured way to analyze the log files and produce evidence of the attack within an acceptable time. There is also the possibility that other attack techniques can be detected with this method. On the other side, it can help web administrators prepare their systems for forensic readiness.
Keywords: web forensic, SQL injection, investigation, web shell
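A small sketch of the log-scanning step is shown below: it walks an access log in common/combined format and flags request URLs matching a handful of SQL injection and web-shell signatures. The signature list, log path, and regular expressions are illustrative, not the paper's rule set; note that POST payloads are not recorded in the access log, which is exactly why the abstract proposes correlating application and database logs as well.

```python
import re
from urllib.parse import unquote_plus

# Signatures typical of SQL injection and PHP web-shell activity (illustrative only)
SQLI = re.compile(r"(union\s+select|or\s+1=1|information_schema|sleep\s*\(|benchmark\s*\()", re.I)
SHELL = re.compile(r"\.php\?(cmd|c|exec)=", re.I)

# Common/combined log format: ip - - [time] "METHOD /path?query HTTP/1.1" status size
LOG = re.compile(r'^(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] "(?P<method>\S+) (?P<url>\S+)')

def scan_access_log(path):
    """Return (time, ip, method, url) tuples for requests matching the signatures."""
    hits = []
    with open(path, encoding="utf-8", errors="replace") as fh:
        for line in fh:
            m = LOG.match(line)
            if not m:
                continue
            url = unquote_plus(m.group("url"))   # decode URL-encoded payloads
            if SQLI.search(url) or SHELL.search(url):
                hits.append((m.group("time"), m.group("ip"), m.group("method"), url))
    return hits

# Usage (hypothetical path):
# for t, ip, method, url in scan_access_log("access.log"):
#     print(t, ip, method, url)
```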
Procedia PDF Downloads 148
18258 Sperm Flagellum Center-Line Tracing in 4D Stacks Using an Iterative Minimal Path Method
Authors: Paul Hernandez-Herrera, Fernando Montoya, Juan Manuel Rendon, Alberto Darszon, Gabriel Corkidi
Abstract:
Intracellular calcium ([Ca2+]i) regulates sperm motility. The analysis of [Ca2+]i has traditionally been carried out in two dimensions, while the real movement of the cell takes place in three spatial dimensions. Due to optical limitations (high-speed cell movement and low light emission), important data concerning the three-dimensional movement of these flagellated cells have been neglected. Visualizing [Ca2+]i in 3D is not a simple matter, since it requires complex fluorescence microscopy techniques in which the resulting images have very low intensity and consequently low SNR (signal-to-noise ratio). In 4D sequences, this problem is magnified since the flagellum oscillates (for human sperm) at an average frequency of at least 15 Hz. In this paper, a novel approach to extract the flagellum's center-line in 4D stacks is presented. For this purpose, an iterative algorithm based on the fast-marching method is proposed to extract the flagellum's center-line. Quantitative and qualitative results are presented for a 4D stack to demonstrate the ability of the proposed algorithm to trace the flagellum's center-line. The method reached a precision and recall of 0.96 as compared with a semi-manual method.
Keywords: flagellum, minimal path, segmentation, sperm
Procedia PDF Downloads 284
18257 An Investigation into Computer Vision Methods to Identify Material Other Than Grapes in Harvested Wine Grape Loads
Authors: Riaan Kleyn
Abstract:
Mass wine production companies across the globe are provided with grapes from winegrowers that predominantly utilize mechanical harvesting machines to harvest wine grapes. Mechanical harvesting accelerates the rate at which grapes are harvested, allowing grapes to be delivered faster to meet the demands of wine cellars. The disadvantage of the mechanical harvesting method is the inclusion of material-other-than-grapes (MOG) in the harvested wine grape loads arriving at the cellar, which degrades the quality of the wine that can be produced. Currently, wine cellars do not have a method to determine the amount of MOG present within wine grape loads. This paper seeks to find an optimal computer vision method capable of detecting the amount of MOG within a wine grape load. A MOG detection method would encourage winegrowers to deliver MOG-free wine grape loads to avoid penalties, which would indirectly enhance the quality of the wine to be produced. Traditional image segmentation methods were compared to deep learning segmentation methods based on images of wine grape loads that were captured at a wine cellar. The Mask R-CNN model with a ResNet-50 convolutional neural network backbone emerged as the optimal method for this study to determine the amount of MOG in an image of a wine grape load. Furthermore, a statistical analysis was conducted to determine how the MOG on the surface of a grape load relates to the mass of MOG within the corresponding grape load.
Keywords: computer vision, wine grapes, machine learning, machine harvested grapes
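As an illustration of how a Mask R-CNN with a ResNet-50 FPN backbone could be used for this task, the sketch below loads the torchvision implementation and turns the predicted instance masks into a fraction of image pixels covered by MOG. The two-class setup, the MOG class index, the score threshold, and the checkpoint path are assumptions; in practice the model would first be fine-tuned on annotated grape-load images.

```python
import torch
import torchvision
from torchvision.transforms.functional import to_tensor

# A fine-tuned Mask R-CNN (ResNet-50 FPN backbone) is assumed; two classes are
# used here: background/grapes vs. MOG, with MOG as class index 1 (assumption).
model = torchvision.models.detection.maskrcnn_resnet50_fpn(weights=None, num_classes=2)
# model.load_state_dict(torch.load("mog_maskrcnn.pt"))   # hypothetical checkpoint
model.eval()

def mog_fraction(image_pil, score_thresh=0.5):
    """Fraction of image pixels covered by predicted MOG instance masks."""
    with torch.no_grad():
        out = model([to_tensor(image_pil)])[0]
    keep = (out["scores"] > score_thresh) & (out["labels"] == 1)
    if keep.sum() == 0:
        return 0.0
    masks = out["masks"][keep, 0] > 0.5           # (N, H, W) boolean instance masks
    covered = masks.any(dim=0).float().mean()     # union of masks / image area
    return float(covered)
```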
Procedia PDF Downloads 96
18256 Temporal Variation of Reference Evapotranspiration in Central Anatolia Region, Turkey and Meteorological Drought Analysis via Standardized Precipitation Evapotranspiration Index Method
Authors: Alper Serdar Anli
Abstract:
Analysis of the temporal variation of reference evapotranspiration (ET0) is important in arid and semi-arid regions where water resources are limited. In this study, the temporal variation of reference evapotranspiration (ET0) and a meteorological drought analysis using the SPEI (Standardized Precipitation Evapotranspiration Index) method have been carried out for the provinces of the Central Anatolia Region, Turkey. The reference evapotranspiration of the provinces in the region has been estimated using the Penman-Monteith method, and one calendar year has been split into four periods: r1, r2, r3, and r4. The temporal variation of reference evapotranspiration over the four periods has been analyzed using the parametric Dickey-Fuller test and the non-parametric Mann-Whitney U test. As a result, significant increasing trends in reference evapotranspiration have been detected. According to the SPEI method used for estimating meteorological drought in the provinces, mild drought has been experienced in general; however, there has also been a significant number of events in which moderate and severe droughts occurred.
Keywords: central Anatolia region, drought index, Penman-Monteith, reference evapotranspiration, temporal variation
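A minimal sketch of the non-parametric trend check is shown below: it compares the first and second halves of a synthetic annual ET0 series with the Mann-Whitney U test. The ET0 values, series length, and one-sided test configuration are assumptions; the authors' exact test setup and the Penman-Monteith/SPEI computations are not reproduced.

```python
import numpy as np
from scipy.stats import mannwhitneyu

# Hypothetical annual ET0 totals (mm) for one period (e.g. r2) over 30 years
rng = np.random.default_rng(6)
et0 = 520 + 1.5 * np.arange(30) + rng.normal(0, 20, 30)   # slight upward drift

# Simple non-parametric trend check: is the later half stochastically larger?
first, second = et0[:15], et0[15:]
stat, p_value = mannwhitneyu(second, first, alternative="greater")
print(f"U = {stat:.1f}, p = {p_value:.4f}")   # a small p suggests an increasing trend
```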
Procedia PDF Downloads 312
18255 Thatsana Nataya Chatri Dance: A Creative Conservation Process of Cultural Performing Arts for Competition
Authors: Dusittorn Ngamying
Abstract:
The research on Thatsana Nataya Chatri Dance: A Creative Conservation Process of Cultural Performing Arts for Competition was aimed at 1) studying the creative conservation process of cultural performing arts; 2) creating a conservation process for the cultural performing art Thatsana Nataya Chatri dance; and 3) utilizing the created performing art for competition. The study was conducted using the qualitative research method in the Central region provinces of Thailand through documentary study and data from field observations, interviews, and focus group meetings. Data were collected from 50 informants consisting of 10 experts on the subject, 30 practitioners, and 10 general information providers. The data collection instruments consisted of participatory and non-participatory observation forms, structured and non-structured interview schedules, and focus group note forms. The data were verified by the triangulation technique and presented using descriptive analysis. The results of the study reveal that the creative conservation process of cultural performing arts should be initiated by those with experience who use prior knowledge in the pursuit of new knowledge. The new knowledge is combined with the conservation process to generate creative work in 9 aspects: acquiring the related knowledge, creating the theme and inspiration, designing the music and melody, designing costumes, inventing dance postures, selecting dancers, transferring the dance postures, preparing the stage and performance equipment, and planning the performance event. The created conservation work for the cultural performing art Thatsana Nataya Chatri dance consists of 33 dance postures and 14 transformed patterns. The performance requires 6 dancers, 3 males and 3 females. The costumes feature both male and female classical and modified dancers' costumes. The show lasts 5 minutes. As for its use in competition, this creative work was selected by the Dramatic Works Association (Thailand) to represent Thailand at the Lombok International Dance Sports Festival 2015 held in Lombok, Indonesia. The team was awarded second place in the Traditional Dance category.
Keywords: creative conservation process, cultural performing arts, Thatsana Nataya Chatri dance, competition
Procedia PDF Downloads 222
18254 Analytical Solution for Multi-Segmented Toroidal Shells under Uniform Pressure
Authors: Nosakhare Enoma, Alphose Zingoni
Abstract:
The requirements for various toroidal shell forms are increasing due to new applications, available storage space, and the consideration of appearance. Because of the complexity of some of these structural forms, the finite element method is nowadays mainly used for their analysis, even for simple static studies. This paper presents an easy-to-use analytical algorithm for pressurized multi-segmented toroidal shells of revolution. The membrane solution, which acts as a particular solution of the bending-theory equations, is developed based on the membrane theory of shells, and a general approach is formulated for quantifying discontinuity effects at the shell junctions using the well-known Geckeler's approximation. By superimposing these effects and applying the ensuing solution to the problem of a pressurized toroid with four segments, closed-form stress results are obtained for the entire toroid. A numerical example is carried out using the developed method. The analytical results obtained show excellent agreement with those from the finite element method, indicating that the proposed method can also be used to complement and verify FEM results and to provide insight into related problems.
Keywords: bending theory of shells, membrane hypothesis, pressurized toroid, segmented toroidal vessel, shell analysis
Procedia PDF Downloads 320
18253 Optimal Trajectories for Highly Automated Driving
Authors: Christian Rathgeber, Franz Winkler, Xiaoyu Kang, Steffen Müller
Abstract:
In this contribution, two approaches for calculating optimal trajectories for highly automated vehicles are presented and compared. The first is based on a non-linear vehicle model and is used for evaluation. The second is based on a simplified model and can be implemented on a current ECU. In usual driving situations, both approaches show very similar results.
Keywords: trajectory planning, direct method, indirect method, highly automated driving
Procedia PDF Downloads 533
18252 A Simple Light-Outcoupling Enhancement Method for Organic Light-Emitting Diodes
Authors: Ho-Nyeon Lee
Abstract:
We propose using a gradual-refractive-index dielectric (GRID) as a simple and efficient light-outcoupling method for organic light-emitting diodes (OLEDs). Using the simple GRIDs, we can improve the light-outcoupling efficiency of OLEDs without relying on difficult nano-patterning processes. Through numerical simulations using a finite-difference time-domain (FDTD) method, the feasibility of the GRID structure was examined and the design parameters were extracted. The outcoupling enhancement due to the GRIDs was proved through extensive experimental work. The GRIDs were applied to bottom-emission and top-emission OLEDs. For bottom-emission OLEDs, the efficiency was improved by more than 20%, and for top-emission OLEDs, by more than 40%. The detailed numerical and experimental results will be presented at the conference.
Keywords: efficiency, GRID, light outcoupling, OLED
Procedia PDF Downloads 423