Search results for: charging methods
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 15033

14193 Creativity in Industrial Design as an Instrument for the Achievement of the Proper and Necessary Balance between Intuition and Reason, Design and Science

Authors: Juan Carlos Quiñones

Abstract:

Time has passed since industrial design placed production on a mass-production basis. Industrial design applies methods from different disciplines with a strategic approach, placing humans at the center of the design process and delivering solutions that are meaningful and desirable for users and for the market. This analysis summarizes some of the discussions that occurred at the 6th International Forum of Design as a Process, June 2016, Valencia. The aim of this conference was to find new linkages between systems and design interactions in order to define their social consequences. Through knowledge management, we are able to transform the intangible by using design as a transforming function capable of converting intangible knowledge into tangible solutions (i.e., products and services demanded by society). Industrial designers use knowledge consciously as a starting point for the ideation of the product. The handling of the intangible becomes more and more relevant over time as different methods emerge for knowledge extraction and subsequent organization. The different methodologies applied to the industrial design discipline, and the evolution of the discipline's own methods, underpin the cultural and scientific background knowledge that serves as a starting point of thought in response to needs; the whole comes through the instrument of creativity for the achievement of the proper and necessary balance between intuition and reason, design and science.

Keywords: creative process, creativity, industrial design, intangible

Procedia PDF Downloads 275
14192 Active Cyber Defense within the Concept of NATO’s Protection of Critical Infrastructures

Authors: Serkan Yağlı, Selçuk Dal

Abstract:

Cyber-attacks pose a serious threat to all states. Therefore, states constantly seek various methods to counter those threats. In addition, recent changes in the nature of cyber-attacks and their more complicated methods have created a new concept: active cyber defence (ACD). This article tries to answer, firstly, why ACD is important to NATO and what NATO's viewpoint towards ACD is. Secondly, infrastructure protection is essential to cyber defence, and critical infrastructure protection by ACD means is even more important. It is assumed that by implementing active cyber defence, NATO may not only be able to repel attacks but also act as a deterrent. Hence, the use of ACD has a direct positive effect on the future of all international organizations, including NATO.

Keywords: active cyber defence, advanced persistent threat, critical infrastructure, NATO

Procedia PDF Downloads 232
14191 Sterilization of Potato Explants for in vitro Propagation

Authors: D. R. Masvodza, G. Coetzer, E. van der Watt

Abstract:

Microorganisms usually grow prolifically and may cause major problems in in-vitro cultures. For in vitro propagation to be successful, explants need to be sterile. In order to determine the best sterilization method for potato explants cv. Amethyst, five sterilization methods were applied separately to 24 shoots. The first sterilization method was the use of 20% sodium hypochlorite with 1 ml Tween 20 for 15 minutes. The second, third and fourth sterilization methods were the immersion of explants in 70% ethanol in a beaker for either 30 seconds, 1 minute or 2 minutes, followed by 1% sodium hypochlorite with 1 ml Tween 20 for 5 minutes. For the control treatment, no chemicals were used. Finally, all the explants were rinsed three times with autoclaved distilled water and trimmed to 1-2 cm. Explants were then cultured on MS medium with 0.01 mg L-1 NAA and 0.1 mg L-1 GA3, supplemented with 2 mg L-1 D-calcium pantothenate. The trial was laid out as a completely randomized design, and each treatment combination was replicated 24 times. At 7, 14 and 21 days after culture, data on explant color, survival, and presence or absence of contamination were recorded. The best results were obtained when 20% sodium hypochlorite was used with 1 ml Tween 20 for 15 minutes (sterilization method 1). Method 2 was comparable to method 1 when explants were cultured in glass vessels. Explants in glass vessels were significantly less contaminated than explants in polypropylene vessels. Therefore, ideal sterilization methods should at times be coupled with ideal culture conditions, such as good-quality culture vessels, rather than the addition of more stringent sterilants.

Keywords: culture containers, explants, sodium hypochlorite, sterilization

Procedia PDF Downloads 303
14190 Comparisons between Students' Learning Achievements and Their Problem Solving Skills on the Stoichiometry Issue with the Think-Pair-Share Model and STEM Education Method

Authors: P. Thachitasing, N. Jansawang, W. Rakrai, T. Santiboon

Abstract:

The aim of this study is to compare two instructional designs, the Think-Pair-Share Model (TPSM) and the conventional learning process (5E Inquiry Model), for enhancing students' learning achievements and their problem-solving skills on the stoichiometry issue. The sample consisted of 80 students in two classes at the 11th grade level at Chaturaphak Phiman Ratchadaphisek School, selected with the cluster random sampling technique. The experimental group of 40 students was taught with the Think-Pair-Share process and the control group of 40 students with the conventional learning (5E Inquiry Model) method. Five instruments were used, namely the 5-lesson instructional plans for the Think-Pair-Share and STEM Education methods; students' learning achievements and their problem-solving skills were assessed with pretest and posttest techniques, and students' outcomes under the Think-Pair-Share Model (TPSM) and the STEM Education method were compared. Statistically significant differences between posttest and pretest scores of the whole group of chemistry students were found with the paired t-test and F-test. Associations between students' learning outcomes in chemistry under the two methods and their learning achievements and problem-solving skills were also found. The students' learning achievements and problem-solving skills were found to differ between the groups, guiding practical improvements in chemistry classrooms and assisting teachers in implementing effective instructional approaches.
Mean learning achievement scores of the control group taught with the Think-Pair-Share Model (TPSM) were significantly lower than those of the experimental group taught with the STEM education method. The E1/E2 process efficiencies were 82.56/80.44 and 83.02/81.65, which are higher than the 80/80 standard criterion level. The predictive efficiency (R²) values indicate that 61% and 67% of the variances in students' posttest learning achievements in chemistry, and 63% and 67% of the variances in students' problem-solving skills relative to their learning achievements on the stoichiometry issue, were attributable to the different learning outcomes under the TPSM and STEM education instructional methods.
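The pretest/posttest comparison described above relies on the paired t-test. A minimal sketch of that statistic, using hypothetical scores rather than the study's data, is:

```python
import numpy as np

def paired_t_statistic(pre, post):
    """Paired t statistic for pretest/posttest score differences."""
    d = np.asarray(post, dtype=float) - np.asarray(pre, dtype=float)
    return d.mean() / (d.std(ddof=1) / np.sqrt(d.size))

# Hypothetical scores for ten students (not the study's data).
pre  = [12, 14, 11, 15, 13, 10, 16, 12, 14, 13]
post = [18, 17, 16, 20, 18, 15, 21, 17, 19, 18]
t = paired_t_statistic(pre, post)
```

Comparing t against the t distribution with n-1 degrees of freedom then gives the kind of significance level reported in the abstract.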

Keywords: comparisons, students' learning achievements, think-pair-share model (TPSM), STEM education, problem solving skills, chemistry classes, stoichiometry issue

Procedia PDF Downloads 233
14189 Numerical Studies for Standard Bi-Conjugate Gradient Stabilized Method and the Parallel Variants for Solving Linear Equations

Authors: Kuniyoshi Abe

Abstract:

Bi-conjugate gradient (Bi-CG) is a well-known method for solving linear equations Ax = b for x, where A is a given n-by-n matrix and b is a given n-vector. Typically, the dimension of the linear equation is high and the matrix is sparse. A number of hybrid Bi-CG methods such as conjugate gradient squared (CGS), Bi-CG stabilized (Bi-CGSTAB), BiCGStab2, and BiCGstab(l) have been developed to improve the convergence of Bi-CG. Bi-CGSTAB has been most often used for efficiently solving the linear equation, but its convergence behavior can exhibit a long stagnation phase. In such cases, it is important to have Bi-CG coefficients that are as accurate as possible, and a stabilization strategy, which stabilizes the computation of the Bi-CG coefficients, has been proposed. It may avoid stagnation and lead to faster computation. Motivated by the large number of processors in present petascale high-performance computing hardware, the scalability of Krylov subspace methods on parallel computers has recently become increasingly prominent. The main bottleneck for efficient parallelization is the inner products, which require a global reduction. The resulting global synchronization phases cause communication overhead on parallel computers. Parallel variants of Krylov subspace methods that reduce the number of global communication phases and hide the communication latency have been proposed. However, the numerical stability, specifically the convergence speed, of the parallel variants of Bi-CGSTAB may become worse than that of the standard Bi-CGSTAB. In this paper, therefore, we compare the convergence speed of the standard Bi-CGSTAB and the parallel variants by numerical experiments and show that the convergence speed of the standard Bi-CGSTAB is faster than that of the parallel variants. Moreover, we propose a stabilization strategy for the parallel variants.
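The standard Bi-CGSTAB recurrence benchmarked above can be sketched in a few lines; this is the textbook method, without the proposed stabilization of the Bi-CG coefficients:

```python
import numpy as np

def bicgstab(A, b, tol=1e-10, maxiter=200):
    """Textbook Bi-CGSTAB for Ax = b (plain sketch of the standard method)."""
    x = np.zeros_like(b)
    r = b - A @ x
    r_hat = r.copy()                      # shadow residual
    rho = alpha = omega = 1.0
    v = p = np.zeros_like(b)
    for _ in range(maxiter):
        rho_new = r_hat @ r
        beta = (rho_new / rho) * (alpha / omega)
        p = r + beta * (p - omega * v)
        v = A @ p
        alpha = rho_new / (r_hat @ v)     # Bi-CG coefficient
        s = r - alpha * v
        t = A @ s
        omega = (t @ s) / (t @ t)         # stabilization parameter
        x = x + alpha * p + omega * s
        r = s - omega * t
        rho = rho_new
        if np.linalg.norm(r) < tol * np.linalg.norm(b):
            break
    return x

# Small nonsymmetric, diagonally dominant test system.
A = np.array([[4.0, 1.0, 0.0, 0.0],
              [2.0, 5.0, 1.0, 0.0],
              [0.0, 2.0, 5.0, 1.0],
              [0.0, 0.0, 2.0, 4.0]])
b = np.ones(4)
x = bicgstab(A, b)
```

The two inner products per half-step (for alpha and omega) are exactly the global reductions that the parallel variants reorganize.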

Keywords: bi-conjugate gradient stabilized method, convergence speed, Krylov subspace methods, linear equations, parallel variant

Procedia PDF Downloads 148
14188 A Framework for Auditing Multilevel Models Using Explainability Methods

Authors: Debarati Bhaumik, Diptish Dey

Abstract:

Multilevel models, increasingly deployed in industries such as insurance, food production, and entertainment within functions such as marketing and supply chain management, need to be transparent and ethical. Applications usually result in binary classification within groups or hierarchies based on a set of input features. Using open-source datasets, we demonstrate that popular explainability methods, such as SHAP and LIME, consistently underperform in accuracy when interpreting these models. They fail to predict the order of feature importance, the magnitudes, and occasionally even the nature of the feature contribution (negative versus positive contribution to the outcome). Besides accuracy, the computational intractability of SHAP for binomial classification is a cause for concern. For transparent and ethical applications of these hierarchical statistical models, sound audit frameworks need to be developed. In this paper, we propose an audit framework for technical assessment of multilevel regression models focusing on three aspects: (i) model assumptions & statistical properties, (ii) model transparency using different explainability methods, and (iii) discrimination assessment. To this end, we undertake a quantitative approach and compare intrinsic model methods with SHAP and LIME. The framework comprises a shortlist of KPIs, such as PoCE (Percentage of Correct Explanations) and MDG (Mean Discriminatory Gap) per feature, for each of these three aspects. A traffic light risk assessment method is furthermore coupled to these KPIs. The audit framework will assist regulatory bodies in performing conformity assessments of AI systems using multilevel binomial classification models at businesses. It will also benefit businesses deploying multilevel models to be future-proof and aligned with the European Commission's proposed Regulation on Artificial Intelligence.
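One of the shortlisted KPIs, PoCE, can be illustrated with a toy computation. The framework's exact definition may differ; this sketch reads PoCE as the share of instances whose estimated feature-importance ranking matches a reference ranking:

```python
import numpy as np

def poce(ref_attr, est_attr):
    """Share of instances whose estimated feature-importance ranking
    matches the reference ranking (one illustrative reading of PoCE)."""
    ref_rank = np.argsort(-np.abs(ref_attr), axis=1)
    est_rank = np.argsort(-np.abs(est_attr), axis=1)
    return float(np.mean(np.all(ref_rank == est_rank, axis=1)))

# Reference attributions vs. a noisy explainer's estimates (synthetic).
rng = np.random.default_rng(0)
ref = rng.normal(size=(100, 3))
noisy = ref + rng.normal(scale=0.1, size=ref.shape)
score = poce(ref, noisy)
```

A perfect explainer scores 1.0; rank disagreements of the kind reported for SHAP and LIME drive the score down.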

Keywords: audit, multilevel model, model transparency, model explainability, discrimination, ethics

Procedia PDF Downloads 76
14187 Regional Flood Frequency Analysis in Narmada Basin: A Case Study

Authors: Ankit Shah, R. K. Shrivastava

Abstract:

Flood and drought are the two main features of hydrology which affect human life. Floods are natural disasters which cause millions of rupees' worth of damage each year in India and across the whole world. Floods cause destruction of life and property. An accurate estimate of the flood damage potential is a key element of an effective, nationwide flood damage abatement program. Also, the increase in demand for water due to growth in population, industry and agriculture has shown that, though a renewable resource, water cannot be taken for granted. We have to optimize the use of water according to circumstances and conditions and need to harness it, which can be done by the construction of hydraulic structures. For the safe and proper functioning of hydraulic structures, we need to predict the flood magnitude and its impact. Hydraulic structures play a key role in harnessing and optimizing flood water, which in turn results in the safe and maximum use of the water available. Hydraulic structures are mainly constructed on ungauged sites. There are two methods by which we can estimate floods, viz. generation of unit hydrographs and flood frequency analysis. In this study, regional flood frequency analysis has been employed. There are many methods for regional flood frequency analysis, viz. the Index Flood Method, Natural Environment Research Council (NERC) methods, the Multiple Regression Method, etc. However, none of the methods can be considered universal for every situation and location. The Narmada basin is located in Central India. It is drained by many tributaries, most of which are ungauged. Therefore, it is very difficult to estimate floods on these tributaries and in the main river. In this study, Artificial Neural Networks (ANN) and the Multiple Regression Method are used for the determination of regional flood frequency.
The annual peak flood data of 20 gauging sites of the Narmada Basin are used in the present study to determine the regional flood relationships. Homogeneity of the considered sites is determined by using the Index Flood Method. Flood relationships obtained by both methods are compared with each other, and it is found that ANN is more reliable than the Multiple Regression Method for the present study area.
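A common regional-regression form underlying the multiple-regression approach is the log-linear relation Q = aA^b between mean annual flood and catchment area. A sketch with hypothetical catchments (not Narmada data):

```python
import numpy as np

# Hypothetical catchment areas (km^2) and mean annual floods (m^3/s).
area = np.array([120.0, 450.0, 890.0, 1500.0, 3200.0])
qbar = np.array([85.0, 210.0, 350.0, 520.0, 950.0])

# Fit log Q = log a + b log A by least squares.
X = np.column_stack([np.ones_like(area), np.log(area)])
(log_a, b), *_ = np.linalg.lstsq(X, np.log(qbar), rcond=None)

def predict(A):
    """Mean annual flood estimate for an ungauged catchment of area A."""
    return float(np.exp(log_a) * A ** b)
```

An ANN replaces the fixed power-law form with a learned nonlinear mapping from catchment characteristics to the flood quantile, which is the comparison the study reports.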

Keywords: artificial neural network, index flood method, multi layer perceptrons, multiple regression, Narmada basin, regional flood frequency

Procedia PDF Downloads 399
14186 Stress Corrosion Cracking, Parameters Affecting It, Problems Caused by It and Suggested Methods for Treatment: State of the Art

Authors: Adnan Zaid

Abstract:

Stress corrosion cracking (SCC) may be defined as the degradation of the mechanical properties of a material under the combined action of a tensile stress and a corrosive environment on a susceptible material. It is a harmful phenomenon which can cause catastrophic fracture without prior warning. In this paper, the stress corrosion cracking (SCC) process, the parameters affecting it, and the different kinds of damage caused by it are presented and discussed. The utilization of shot peening as a means of enhancing the resistance of materials to SCC is discussed. Finally, a method for improving a material's resistance to SCC by refining its grain structure with certain refining elements prior to usage is suggested.

Keywords: stress corrosion cracking, parameters, damages, treatment methods

Procedia PDF Downloads 312
14185 Studies on the Proximate Composition and Functional Properties of Extracted Cocoyam Starch Flour

Authors: Adebola Ajayi, Francis B. Aiyeleye, Olakunke M. Makanjuola, Olalekan J. Adebowale

Abstract:

Cocoyam, a generic term for both Xanthosoma and Colocasia, is a traditional staple root crop in many developing countries in Africa, Asia and the Pacific. It is mostly cultivated as a food crop which is very rich in vitamin B6, magnesium and dietary fiber. Cocoyam starch is easily digested and often used for baby food. Drying food is a method of food preservation that removes enough moisture from the food so that bacteria, yeast and molds cannot grow; it is one of the oldest methods of preserving food. The effect of drying methods on the proximate composition and functional properties of extracted cocoyam starch flour was studied. Freshly harvested cocoyam cultivars at the mature stage were washed with potable water, peeled, washed and grated. The starch in the grated cocoyam was extracted and dried using sun drying, oven and cabinet dryers. The extracted starch was milled into flour using an Apex mill, packed and sealed in 75-micron low-density polyethylene (LDPE) film with a Nylon sealing machine QN5-3200HI, and kept for three months at ambient temperature before analysis. The results showed that the moisture content, ash, crude fiber, fat, protein and carbohydrate ranged from 6.28% to 12.8%, 2.32% to 3.2%, 0.89% to 2.24%, 1.89% to 2.91%, 7.30% to 10.2% and 69% to 83%, respectively. The functional properties of the cocoyam starch flour ranged from 2.65 ml/g to 4.84 ml/g water absorption capacity, 1.95 ml/g to 3.12 ml/g oil absorption capacity, 0.66 ml/g to 7.82 ml/g bulk density and 3.82% to 5.30% swelling capacity. No significant difference (P≥0.05) was found across the various drying methods used. The drying methods extend the shelf-life of the extracted cocoyam starch flour.

Keywords: cocoyam, extraction, oven dryer, cabinet dryer

Procedia PDF Downloads 276
14184 Patents as Indicators of Innovative Environment

Authors: S. Karklina, I. Erins

Abstract:

The main problem is the very low innovation performance in Latvia. Since Latvia is a Member State of the European Union, it also has to fulfill the set targets and improve its innovation results. Universities are among the main performers providing the innovative capacity of a country. University, industry and government need to cooperate to get the best results. Intellectual property is one of the indicators used to determine the innovation level in a country or organization, and patents are one of the characteristics of intellectual property. The objective of the article is to determine the indicators characterizing the innovative environment in Latvia and the influence of the development of universities on them. The methods used in the article to achieve this objective are quantitative and qualitative analysis of the literature, statistical data analysis, and graphical analysis methods.

Keywords: HEI, innovations, Latvia, patents

Procedia PDF Downloads 301
14183 The Effect of the Acquisition and Reconstruction Parameters on the Quality of SPECT Tomographic Images with Attenuation and Scatter Correction

Authors: N. Boutaghane, F. Z. Tounsi

Abstract:

Many physical and technological factors degrade SPECT images, both qualitatively and quantitatively. For this reason, leading technological advances are continually put to use to improve the performance of the tomographic gamma camera in terms of detection, collimation, reconstruction and tomographic image correction methods. We must first master the choice of the various acquisition and reconstruction parameters accessible in clinical cases, and use attenuation and scatter correction methods, to optimize image quality while minimizing the dose received by the patient. In this work, a qualitative and quantitative evaluation of tomographic images is performed based on the acquisition parameters (counts per projection) and reconstruction parameters (filter type and associated cutoff frequency). In addition, methods for correcting physical effects such as attenuation and scatter, which degrade the image quality and prevent precise quantification of the reconstructed slices, are also presented. Two approaches to attenuation and scatter correction are implemented: attenuation correction by the Chang method with a filtered back projection reconstruction algorithm, and scatter correction by the Jaszczak subtraction method. Our results serve as recommendations that permit determining the origin of the different artifacts observed both in quality control tests and in clinical images.
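The role of the reconstruction filter's cutoff frequency mentioned above can be illustrated with a Butterworth window applied to the ramp filter of filtered back projection; the numbers below are generic, not the study's clinical settings:

```python
import numpy as np

def butterworth(f, cutoff, order=5):
    """Butterworth low-pass window applied to the FBP ramp filter."""
    return 1.0 / (1.0 + (f / cutoff) ** (2 * order))

f = np.linspace(0.0, 0.5, 256)           # spatial frequency (cycles/pixel)
ramp = f                                  # ideal ramp filter of FBP
responses = {fc: ramp * butterworth(f, fc) for fc in (0.2, 0.35, 0.5)}
# A lower cutoff passes less high-frequency content: stronger noise
# suppression, but lower spatial resolution in the reconstructed slice.
```

This trade-off is exactly why the cutoff frequency is treated as a reconstruction parameter to be optimized per count level.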

Keywords: attenuation, scatter, reconstruction filter, image quality, acquisition and reconstruction parameters, SPECT

Procedia PDF Downloads 431
14182 Aerodynamic Design of a UAV and Stability Analysis Using the Method of Genetic Algorithm Optimization

Authors: Saul A. Torres Z., Eduardo Liceaga C., Alfredo Arias M.

Abstract:

We seek to develop a UAV for agricultural spraying at a maximum altitude of 5000 meters above sea level, with a payload of 100 liters of fumigant. For developing the aerodynamic design of the aircraft, computational tools are used, such as the "Athena Vortex Lattice" software, "MATLAB", "ANSYS FLUENT" and the "XFoil" package, among others. Structured programming and an exhaustive analysis of optimization and search methods are also employed. The results have a very low margin of error, and the multi-objective problems can be helpful for future developments. We also developed a method for stability analysis (lateral-directional and longitudinal).
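The genetic-algorithm optimizer can be sketched generically. This minimal real-coded GA (tournament selection, blend crossover, Gaussian mutation) minimizes a toy objective standing in for the aerodynamic cost; it is not the authors' exact implementation:

```python
import numpy as np

def ga_minimize(f, bounds, pop=40, gens=60, seed=0):
    """Minimal real-coded GA: tournament selection, blend crossover,
    Gaussian mutation, and one elite preserved per generation."""
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds, dtype=float).T
    X = rng.uniform(lo, hi, size=(pop, len(bounds)))
    for _ in range(gens):
        fit = np.apply_along_axis(f, 1, X)
        # Tournament selection: pairwise duels between random individuals.
        i, j = rng.integers(0, pop, (2, pop))
        parents = np.where((fit[i] < fit[j])[:, None], X[i], X[j])
        elite = parents[np.argmin(np.apply_along_axis(f, 1, parents))]
        # Blend crossover with a shuffled mate, then Gaussian mutation.
        mates = parents[rng.permutation(pop)]
        w = rng.uniform(size=X.shape)
        X = w * parents + (1 - w) * mates
        X += rng.normal(scale=0.02 * (hi - lo), size=X.shape)
        X = np.clip(X, lo, hi)
        X[0] = elite                      # elitism: never lose the best parent
    fit = np.apply_along_axis(f, 1, X)
    return X[np.argmin(fit)], float(fit.min())

# Toy objective standing in for a drag/stability trade-off.
best, val = ga_minimize(lambda v: (v[0] - 1.0) ** 2 + (v[1] + 0.5) ** 2,
                        bounds=[(-5, 5), (-5, 5)])
```

For the multi-objective case described in the abstract, the scalar fitness would be replaced by Pareto ranking over the competing aerodynamic and stability objectives.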

Keywords: aerodynamic design, optimization, genetic algorithm, multi-objective problem, longitudinal stability, lateral-directional stability

Procedia PDF Downloads 576
14181 Evaluating the Performance of Color Constancy Algorithms

Authors: Damanjit Kaur, Avani Bhatia

Abstract:

Color constancy is significant for human vision since color is a pictorial cue that helps in solving different vision tasks such as tracking, object recognition, or categorization. Therefore, several computational methods have tried to simulate human color constancy abilities to stabilize machine color representations. Two different kinds of methods have been used, i.e., normalization and constancy. While color normalization creates a new representation of the image by canceling illuminant effects, color constancy directly estimates the color of the illuminant in order to map the image colors to a canonical version. Color constancy is the capability to determine the colors of objects independent of the color of the light source. This research work studies the most well-known color constancy algorithms, such as white patch and gray world.
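The gray-world algorithm studied here assumes the average scene reflectance is achromatic, so each channel is scaled to the global mean; a minimal sketch:

```python
import numpy as np

def gray_world(img):
    """Scale each channel so its mean equals the global mean,
    under the assumption that the average scene is achromatic."""
    img = img.astype(float)
    means = img.reshape(-1, 3).mean(axis=0)
    return np.clip(img * (means.mean() / means), 0.0, 255.0)

# Synthetic image with a reddish cast (red channel inflated 1.5x).
rng = np.random.default_rng(1)
img = rng.uniform(40, 200, size=(8, 8, 3))
img[..., 0] *= 1.5
corrected = gray_world(img)
```

The white-patch algorithm differs only in the statistic used: it normalizes each channel by its maximum rather than its mean, treating the brightest pixel as the illuminant estimate.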

Keywords: color constancy, gray world, white patch, modified white patch

Procedia PDF Downloads 300
14180 Characterization of Particle Charge from Aerosol Generation Process: Impact on Infrared Signatures and Material Reactivity

Authors: Erin M. Durke, Monica L. McEntee, Meilu He, Suresh Dhaniyala

Abstract:

Aerosols are one of the most important and significant surfaces in the atmosphere. They can influence weather, the absorption and reflection of light, and the reactivity of atmospheric constituents. A notable feature of aerosol particles is the presence of a surface charge, a characteristic imparted via the aerosolization process. The existence of charge can complicate the interrogation of aerosol particles, so many researchers remove or neutralize aerosol particles before characterization. However, the charge is present in real-world samples and likely has an effect on the physical and chemical properties of an aerosolized material. In our studies, we aerosolized different materials in an attempt to characterize the charge imparted via the aerosolization process and determine what impact it has on the aerosolized materials' properties. The metal oxides TiO₂ and SiO₂ were aerosolized expulsively and then characterized, using several different techniques, in an effort to determine the surface charge imparted upon the particles via the aerosolization process. Particle charge distribution measurements were conducted with a custom scanning mobility particle sizer. The results of the charge distribution measurements indicated that expulsive generation of 0.2 µm SiO₂ particles produced aerosols with upwards of 30 charges on the surface of the particle. Determination of the degree of surface charging led to the use of non-traditional techniques to explore the impact of additional surface charge on the overall reactivity of the metal oxides, specifically TiO₂. TiO₂ was aerosolized, again expulsively, onto a gold-coated tungsten mesh, which was then evaluated with transmission infrared spectroscopy in an ultra-high vacuum environment. The TiO₂ aerosols were exposed to O₂, H₂, and CO, respectively.
Exposure to O₂ resulted in a decrease in the overall baseline of the aerosol spectrum, suggesting O₂ removed some of the surface charge imparted during aerosolization. Upon exposure to H₂, there was no observable rise in the baseline of the IR spectrum, as is typically seen for TiO₂, due to the population of electrons into the shallow trapped states and subsequent promotion of the electrons into the conduction band. This result suggests that the additional charge imparted via aerosolization fills the trapped states, therefore no rise is seen upon exposure to H₂. Dosing the TiO₂ aerosols with CO showed no adsorption of CO on the surface, even at lower temperatures (~100 K), indicating the additional charge on the aerosol surface prevents the CO molecules from adsorbing to the TiO₂ surface. The results observed during exposure suggest that the additional charge imparted via aerosolization impacts the interaction with each probe gas.

Keywords: aerosols, charge, reactivity, infrared

Procedia PDF Downloads 113
14179 Variable Selection in a Data Envelopment Analysis Model by Multiple Proportions Comparison

Authors: Jirawan Jitthavech, Vichit Lorchirachoonkul

Abstract:

A statistical procedure using multiple comparisons test for proportions is proposed for variable selection in a data envelopment analysis (DEA) model. The test statistic in the multiple comparisons is the proportion of efficient decision making units (DMUs) in a DEA model. Three methods of multiple comparisons test for proportions: multiple Z tests with Bonferroni correction, multiple tests in a 2×c crosstabulation and the Marascuilo procedure, are used in the proposed statistical procedure of iteratively eliminating the variables in a backward manner. Two simulation populations of moderately and lowly correlated variables are used to compare the results of the statistical procedure using three methods of multiple comparisons test for proportions with the hypothesis testing of the efficiency contribution measure. From the simulation results, it can be concluded that the proposed statistical procedure using multiple Z tests for proportions with Bonferroni correction clearly outperforms the proposed statistical procedure using the remaining two methods of multiple comparisons and the hypothesis testing of the efficiency contribution measure.
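The first of the three procedures, multiple Z tests for proportions with Bonferroni correction, amounts to pairwise pooled z tests at level α/m; the efficient-DMU counts below are hypothetical, not the simulation's:

```python
import math

def two_prop_ztest(x1, n1, x2, n2):
    """Two-sided pooled z test for equality of two proportions."""
    p1, p2 = x1 / n1, x2 / n2
    p = (x1 + x2) / (n1 + n2)
    se = math.sqrt(p * (1 - p) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    pval = math.erfc(abs(z) / math.sqrt(2))   # two-sided p value
    return z, pval

# Hypothetical counts of efficient DMUs under three candidate variable sets.
counts = [(30, 100), (45, 100), (60, 100)]
pairs = [(0, 1), (0, 2), (1, 2)]
alpha = 0.05 / len(pairs)                  # Bonferroni-corrected level
decisions = [two_prop_ztest(*counts[a], *counts[b])[1] < alpha
             for a, b in pairs]
```

Only pairs whose corrected p value falls below α/m are declared different, which is what drives the backward elimination of variables.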

Keywords: Bonferroni correction, efficient DMUs, Marascuilo procedure, Pastor et al. method, 2×c crosstabulation

Procedia PDF Downloads 293
14178 Field Scale Simulation Study of Miscible Water Alternating CO2 Injection Process in Fractured Reservoirs

Authors: Hooman Fallah

Abstract:

Vast amounts of the world's oil reserves are in naturally fractured reservoirs. There are different methods for increasing recovery from fractured reservoirs, and miscible injection of water alternating with CO2 is a good choice among these methods. In this method, water and CO2 slugs are injected alternately as miscible agents into the reservoir. This paper studies a water injection scenario and miscible injection of water and CO2 in a two-dimensional, inhomogeneous fractured reservoir. The results show that miscible water alternating CO2 gas injection leads to a 3.95% increase in final oil recovery and a total water production decrease of 3.89% compared to the water injection scenario.

Keywords: simulation study, CO2, water alternating gas injection, fractured reservoirs

Procedia PDF Downloads 277
14177 Logical-Probabilistic Modeling of the Reliability of Complex Systems

Authors: Sergo Tsiramua, Sulkhan Sulkhanishvili, Elisabed Asabashvili, Lazare Kvirtia

Abstract:

The paper presents logical-probabilistic methods, models and algorithms for the reliability assessment of complex systems, based on which a web application for structural analysis and reliability assessment of systems was created. The reliability assessment process includes the following stages, which are reflected in the application: 1) construction of a graphical scheme of the structural reliability of the system; 2) transformation of the graphical scheme into a logical representation and modeling of the shortest paths of successful functioning of the system; 3) description of the system operability condition with a logical function in disjunctive normal form (DNF); 4) transformation of the DNF into orthogonal disjunctive normal form (ODNF) using the orthogonalization algorithm; 5) replacement of logical elements with probabilistic elements in the ODNF, obtaining a reliability estimation polynomial and quantifying reliability; 6) calculation of the weights of the elements. Using the logical-probabilistic methods, models and algorithms discussed in the paper, special software was created by means of which a quantitative assessment of the reliability of systems with a complex structure is produced. As a result, structural analysis of systems, research, and the design of systems with optimal structure are carried out.
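For a small system, the reliability polynomial obtained from the shortest successful paths (minimal path sets) can be computed directly by inclusion-exclusion, assuming independent elements; the ODNF transformation in stage 4 exists precisely to avoid this exponential blow-up on larger systems:

```python
from itertools import combinations

def system_reliability(path_sets, p):
    """Reliability from minimal path sets by inclusion-exclusion,
    assuming independent elements (tractable only for small systems)."""
    total = 0.0
    for k in range(1, len(path_sets) + 1):
        for combo in combinations(path_sets, k):
            union = set().union(*combo)       # elements in any chosen path
            term = 1.0
            for e in union:
                term *= p[e]
            total += (-1) ** (k + 1) * term
    return total

# Series pair {1, 2} in parallel with a single element {3}.
p = {1: 0.9, 2: 0.9, 3: 0.8}
R = system_reliability([{1, 2}, {3}], p)      # 0.81 + 0.8 - 0.81*0.8
```

Orthogonalizing the DNF instead makes the disjuncts mutually exclusive, so the polynomial is obtained by direct substitution of probabilities without the alternating-sign expansion.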

Keywords: complex systems, logical-probabilistic methods, orthogonalization algorithm, reliability, weight of element

Procedia PDF Downloads 51
14176 Accelerating Side Channel Analysis with Distributed and Parallelized Processing

Authors: Kyunghee Oh, Dooho Choi

Abstract:

Although there may be no theoretical weakness in a cryptographic algorithm, side channel analysis can extract secret data from the physical implementation of a cryptosystem. The analysis is based on extra information such as timing information, power consumption, electromagnetic leaks or even sound, which can be exploited to break the system. Differential power analysis is one of the most popular analyses, computing the statistical correlations between secret keys and power consumption. It usually requires processing huge amounts of data and takes a long time; the analysis may take several weeks for some devices with countermeasures. We suggest and evaluate methods to shorten the time needed to analyze cryptosystems. Our methods include distributed computing and parallelized processing.
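The parallelization idea can be sketched by farming the 256 key-byte hypotheses of a correlation DPA out to a worker pool; the synthetic Hamming-weight leakage below is illustrative, not a real trace set:

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

def hamming_weight(x):
    return bin(int(x)).count("1")

def correlate_hypothesis(args):
    """Max |Pearson correlation| of one key guess's power model
    against every sample point of the traces."""
    traces, model = args
    t = traces - traces.mean(axis=0)
    m = model - model.mean()
    num = (t * m[:, None]).sum(axis=0)
    den = np.sqrt((t ** 2).sum(axis=0) * (m ** 2).sum())
    return float(np.abs(num / den).max())

# Synthetic single-byte attack: leakage = HW(plaintext XOR key) + noise.
rng = np.random.default_rng(2)
true_key = 0x3C
plaintexts = rng.integers(0, 256, 500)
hw = np.array([hamming_weight(p ^ true_key) for p in plaintexts], float)
traces = hw[:, None] + rng.normal(scale=0.5, size=(500, 10))

# Farm the 256 key hypotheses out to a pool of workers; in a distributed
# setting each node would take a slice of the hypothesis space instead.
models = [np.array([hamming_weight(p ^ k) for p in plaintexts], float)
          for k in range(256)]
with ThreadPoolExecutor(max_workers=4) as pool:
    scores = list(pool.map(correlate_hypothesis,
                           ((traces, m) for m in models)))
best_guess = int(np.argmax(scores))
```

Because the hypotheses are independent, the speedup is close to linear in the number of workers, which is the effect the paper evaluates at scale.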

Keywords: DPA, distributed computing, parallelized processing, side channel analysis

Procedia PDF Downloads 404
14175 Mediation in Turkey

Authors: Ibrahim Ercan, Mustafa Arikan

Abstract:

In recent years, alternative dispute resolution methods have attracted the attention of many countries' legislators. Instead of resolving disputes through litigation, ending a dispute by the parties themselves is more important for the preservation of social peace. Therefore, alternative dispute resolution (ADR) methods have been discussed intensively in Turkey, as in the rest of the world. After these discussions, the Mediation Act was adopted on 07.06.2012 and entered into force on 21.06.2013. According to the Mediation Act, it is only possible to mediate issues arising from private law. Also, mediation is not compulsory in Turkish law; it is optional. Therefore, the parties are completely free to choose mediation for dispute resolution. Mediators must be lawyers with five years of experience, so it is not possible to become a mediator without being a lawyer. Beyond the experience requirement, training and success in examinations, especially regarding body language and psychology, are also very important for a mediator. If the parties reach a settlement as a result of mediation, a document is issued. Under certain circumstances, this document is also enforceable like a court judgment, so the parties will not need to apply to the court again; on the contrary, they will have the opportunity to execute this document and thereby recover their debts. However, although the Mediation Act has now been in force for nearly two years, it is possible to say that interest in mediation is not at the expected level. Therefore, making mediation mandatory for some disputes has been discussed recently. If mediation becomes mandatory and good results follow, this institution will find serious interest in Turkey. Otherwise, if the results are not satisfying, the mediation method may be abandoned.

Keywords: alternative dispute resolution methods, mediation act, mediation, mediator, mediation in Turkey

Procedia PDF Downloads 350
14174 Application of Adaptive Particle Filter for Localizing a Mobile Robot Using 3D Camera Data

Authors: Maysam Shahsavari, Seyed Jamalaldin Haddadi

Abstract:

There are several approaches to localizing a mobile robot: relative, absolute, and probabilistic. In this paper, the particle filter is used because of its simple implementation and because it does not require knowledge of the starting position. This method estimates the position of the mobile robot as a probability distribution, relying on a known map of the environment, and then updates this estimate from sensor readings and control commands. To gather information about the surrounding world, such as distances to obstacles, a Kinect sensor is used, which is much cheaper than a laser range finder. Finally, after explaining the adaptive particle filter method and its implementation in detail, we compare it with the dead reckoning method and show that the particle filter is much more suitable for situations in which a map of the environment is available.
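To make the core loop of the method concrete (predict, weight, resample), the sketch below shows a minimal one-dimensional particle filter. It is a hypothetical corridor-localization analogue, not the authors' Kinect-based implementation: the robot receives noisy absolute-position measurements, and the posterior mean of the particles serves as the estimate.

```python
import math
import random

def particle_filter(moves, measurements, world_size=100.0,
                    n=1000, meas_noise=5.0, motion_noise=1.0):
    """Minimal 1D particle filter: predict, weight, resample.

    moves: commanded displacements per step
    measurements: noisy absolute-position readings per step
    Returns the posterior-mean position estimate after the last step.
    """
    # Unknown starting position: initialize particles uniformly.
    particles = [random.uniform(0.0, world_size) for _ in range(n)]
    for move, z in zip(moves, measurements):
        # Predict: apply the motion command plus motion noise.
        particles = [(p + move + random.gauss(0.0, motion_noise)) % world_size
                     for p in particles]
        # Weight: Gaussian likelihood of measurement z for each particle.
        weights = [math.exp(-(p - z) ** 2 / (2.0 * meas_noise ** 2))
                   for p in particles]
        # Resample: draw particles in proportion to their weights.
        particles = random.choices(particles, weights=weights, k=n)
    return sum(particles) / n
```

Even though the filter starts from a uniform prior, a few measurement updates concentrate the particles near the true pose; dead reckoning, by contrast, has no mechanism to recover from an unknown start or accumulated drift.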

Keywords: particle filter, localization, methods, odometry, kinect

Procedia PDF Downloads 248
14173 A Review Paper for Detecting Zero-Day Vulnerabilities

Authors: Tshegofatso Rambau, Tonderai Muchenje

Abstract:

Zero-day attacks (ZDA) are increasing day by day; many vulnerabilities in systems and software date back decades. Companies keep discovering vulnerabilities in their systems and software and work to release patches and updates. A zero-day vulnerability is a software fault that is not widely known and is unknown to the vendor; attackers work very quickly to exploit such vulnerabilities. These are major security threats with a high success rate, because businesses lack the essential safeguards to detect and prevent them. This study focuses on the factors and techniques that can help detect zero-day attacks. Various methods and techniques exist for detecting vulnerabilities, and various vendors offer penetration testing and smart vulnerability management solutions. As part of the study process, we undertake literature studies on zero-day attacks and detection methods, as well as modeling approaches and simulations.

Keywords: zero-day attacks, exploitation, vulnerabilities

Procedia PDF Downloads 86
14172 Evaluation of Microbiological Quality and Safety of Two Types of Salads Prepared at Libyan Airline Catering Center in Tripoli

Authors: Elham A. Kwildi, Yahia S. Abugnah, Nuri S. Madi

Abstract:

This study was designed to evaluate the microbiological quality and safety of two types of salads prepared at a catering center affiliated with Libyan Airlines in Tripoli, Libya. Two hundred and twenty-one (221) samples (132 economy-class and 89 first-class) were used in this project, which lasted for ten months. Biweekly microbiological tests were performed, including total plate count (TPC) and total coliforms (TCF), in addition to enumeration and/or detection of some pathogenic bacteria, mainly Escherichia coli, Staphylococcus aureus, Bacillus cereus, Salmonella sp., Listeria sp., and Vibrio parahaemolyticus, using conventional as well as compact dry methods. Results indicated that the TPC of type 1 salads ranged between <10 and 62 × 10³ cfu/g and between <10 and 36 × 10³ cfu/g, while TCF counts were <10 to 41 × 10³ cfu/g and <10 to 66 × 10² cfu/g, using the two detection methods respectively. For type 2 salads, TPC ranged between 1 × 10 and 52 × 10³ cfu/g, <10 and 55 × 10³ cfu/g, and 1 × 10 and 45 × 10³ cfu/g, while TCF counts were between <10 and 55 × 10³ cfu/g and <10 and 34 × 10³ cfu/g, using the first and second detection methods respectively. The pathogens mentioned above were detected in both types of salads, but their levels varied with the type of salad and the method of detection. The level of Staphylococcus aureus, for instance, was 17.4% using the conventional method versus 14.4% using the compact dry method. Similarly, E. coli was found at 7.6% and 9.8%, while Salmonella sp. recorded the lowest percentages, 3% and 3.8%, with the two methods respectively. First-class salads were also found to contain the same pathogens, but the level of E. coli was relatively higher in this case (14.6% and 16.9%) using the conventional and compact dry methods respectively. Staphylococcus aureus ranked second (13.5% and 11.2%), followed by Salmonella (6.74% and 6.70%). The lowest percentage was for Vibrio parahaemolyticus (4.9%), which was detected in the first-class salads only. The other two pathogens, Bacillus cereus and Listeria sp., were not detected in either type of salad. Finally, it is worth mentioning that there was a significant decline in TPC and TCF counts, in addition to the disappearance of pathogenic bacteria, after the 6th-7th month of the study, which coincided with the first trial of the HACCP system at the center. The fluctuations in the counts during the early stages of the study reveal a need for corrective measures, including more emphasis on training personnel to apply the HACCP system effectively.

Keywords: air travel, vegetable salads, foodborne outbreaks, Libya

Procedia PDF Downloads 310
14171 Research on Tight Sandstone Oil Accumulation Process of the Third Member of Shahejie Formation in Dongpu Depression, China

Authors: Hui Li, Xiongqi Pang

Abstract:

In recent years, tight oil has become a hot spot for unconventional oil and gas exploration and development worldwide. The Dongpu Depression is a typical hydrocarbon-rich basin in the southwest of the Bohai Bay Basin, in which tight sandstone oil and gas have been discovered in deep reservoirs, most of which are buried deeper than 3500 m. The distribution and development characteristics of these deep tight sandstone reservoirs need to be studied. The main source rocks in the study area are the dark mudstone and shale of the middle and lower third sub-member of the Shahejie Formation. The total organic carbon (TOC) content of the source rock is between 0.08% and 11.54%, generally higher than 0.6%, and S1+S2 is between 0.04 and 72.93 mg/g, generally higher than 2 mg/g, so the source rock can be rated as moderate to good overall. The kerogen is predominantly type II1 and II2. Vitrinite reflectance (Ro) is mostly greater than 0.6%, indicating that the source rock has entered the hydrocarbon generation threshold. The physical properties of the reservoir are poor: most reservoirs have a porosity lower than 12% and a permeability of less than 1×10⁻³ μm². The rocks in this area show strong heterogeneity, and some areas develop sweet spots with high porosity and permeability. According to SEM, thin-section images, fluid inclusion tests, and other data, the reservoir was affected by compaction and cementation during the early diagenesis stage (44-31 Ma). This diagenesis tightened the reservoirs in the Huzhuangji, Pucheng, and Weicheng Areas, while porosity in the Machang, Qiaokou, and Wenliu Areas remained above 12%. During stage A of the middle diagenesis phase (31-17 Ma), reservoir porosity in the Machang, Pucheng, and Huzhuangji Areas increased due to dissolution; after that, the source rock reached the oil generation window for the first phase of hydrocarbon charging (31-23 Ma), which formed conventional oil accumulations in the Machang, Qiaokou, Wenliu, and Huzhuangji Areas and unconventional tight reservoirs in the Pucheng and Weicheng Areas. During stage B of the middle diagenesis phase (17-7 Ma), reservoir porosity continued to decrease after the dissolution, leaving the reservoirs generally compacted. Since 7 Ma, the second phase of hydrocarbon charging has been in progress, and most of the pools charged and formed in this stage are tight sandstone oil reservoirs. In conclusion, tight sandstone oil formed in two patterns in the Dongpu Depression, which can be summarized as the 'densification first, accumulation later' pattern and the 'accumulation first, densification later' pattern.

Keywords: accumulation process, diagenesis, dongpu depression, tight sandstone oil

Procedia PDF Downloads 105
14170 Computational Fluid Dynamics Simulation Study of Flow near Moving Wall of Various Surface Types Using Moving Mesh Method

Authors: Khizir Mohd Ismail, Yu Jun Lim, Tshun Howe Yong

Abstract:

The study of flow behavior in an enclosed volume using Computational Fluid Dynamics (CFD) has been around for decades. However, owing to the limited maturity of adaptive grid methods, flow in an enclosed volume near a moving wall has been less explored with CFD. In this work, a CFD simulation of flow in an enclosed volume near a moving wall was demonstrated and studied by introducing a moving mesh method, modeled with the Unsteady Reynolds-Averaged Navier-Stokes (URANS) approach. A static enclosed volume with a controlled opening size at the bottom was positioned against a moving, translating wall with sliding mesh features. Controlled variables such as the wall surface characteristics (smooth, creviced, and corrugated), the distance between the enclosed volume and the wall, and the speed of the moving wall relative to the enclosed chamber were varied to understand how the flow behaves and reacts between these two geometries. The simulations were validated against experimental results and showed good agreement with the experimental data, providing confidence in the results. This study gives better insight into how flow behaves in an enclosed volume when moving walls of various types are introduced at various separations, and creates a potential opportunity for applications involving adaptive grid methods in CFD.

Keywords: moving wall, adaptive grid methods, CFD, moving mesh method

Procedia PDF Downloads 129
14169 Development of Cost-effective Sensitive Methods for Pathogen Detection in Community Wastewater for Disease Surveillance

Authors: Jesmin Akter, Chang Hyuk Ahn, Ilho Kim, Jaiyeop Lee

Abstract:

The global coronavirus disease (COVID-19) pandemic is caused by severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2). To control the spread of the pandemic, wastewater surveillance has been used to monitor SARS-CoV-2 prevalence in the community. The challenging part of establishing wastewater surveillance is the need for a well-equipped laboratory for wastewater sample analysis. According to many previous studies, reverse transcription-polymerase chain reaction (RT-PCR) based molecular tests are the most widely used and popular detection method worldwide. However, RT-qPCR-based approaches for the detection or quantification of SARS-CoV-2 genetic fragments (ribonucleic acid, RNA) from wastewater require a specialized laboratory, skilled personnel, expensive instruments, and a workflow that typically takes 6 to 8 hours to provide results for even a minimal number of samples. Rapid and reliable alternative detection methods are needed to enable less highly qualified practitioners to set up and provide sensitive detection of SARS-CoV-2 in wastewater at less specialized regional laboratories. Scientists and researchers are therefore developing rapid detection methods for COVID-19; because some structural and molecular characteristics of SARS-CoV-2 remain unknown, various strategies for the correct diagnosis of COVID-19 have been proposed by research laboratories and are presented in this study. The ongoing research and development of highly sensitive and rapid technologies, namely RT-LAMP, ELISA, biosensors, and GeneXpert, opens a wide range of potential options not only for SARS-CoV-2 detection but also for other viruses. This study discusses the above effective, rapid, regional detection and quantification methods for community wastewater as an essential step toward these scientific goals.

Keywords: rapid detection, SARS-CoV-2, sensitive detection, wastewater surveillance

Procedia PDF Downloads 67
14168 The Characterization and Optimization of Bio-Graphene Derived From Oil Palm Shell Through Slow Pyrolysis Environment and Its Electrical Conductivity and Capacitance Performance as Electrodes Materials in Fast Charging Supercapacitor Application

Authors: Nurhafizah Md. Disa, Nurhayati Binti Abdullah, Muhammad Rabie Bin Omar

Abstract:

This research addresses an existing knowledge gap: there is a lack of substantial studies on fabricating and characterizing bio-graphene from Oil Palm Shell (OPS) by means of pre-treatment and slow pyrolysis. By fabricating bio-graphene from OPS, a novel material can be produced and used for graphene-based research. The produced bio-graphene is expected to exhibit the characteristic hexagonal graphene pattern and graphene properties comparable to previously fabricated graphene. The OPS will be pre-treated with zinc chloride (ZnCl₂) and iron(III) chloride (FeCl₃), and the bio-graphene will then be induced thermally by slow pyrolysis. The pyrolyzer's final temperature, heating rate, and residence time will be set at 550 °C, 5 °C/min, and 1 hour, respectively. Finally, the charred product will be washed with hydrochloric acid (HCl) to remove metal residue. The obtained bio-graphene will undergo several analyses to investigate the physicochemical properties of the two-dimensional layer of sp²-hybridized carbon atoms in a hexagonal lattice structure. The analyses are Raman spectroscopy (RAMAN), UV-visible spectroscopy (UV-VIS), transmission electron microscopy (TEM), scanning electron microscopy (SEM), and X-ray diffraction (XRD). RAMAN is used to analyze the three key peaks found in graphene, namely the D, G, and 2D peaks, which indicate the quality of the bio-graphene structure and the number of layers generated. To corroborate the layer analysis, UV-VIS may be used to confirm the layer results and also to characterize the type of graphene produced. A clear physical picture of the graphene can be obtained from TEM, to study the structural quality and layer condition, and from SEM, to study the surface quality and the repeating porosity pattern. Lastly, XRD can establish the crystallinity of the produced bio-graphene and, simultaneously, assess oxygen contamination and thus the pristineness of the graphene. In conclusion, this study obtains bio-graphene from OPS as a novel material via ZnCl₂ and FeCl₃ pre-treatment and slow pyrolysis, and provides characterization analyses that will be beneficial for future graphene-related applications. The characterization should yield findings similar to previous papers, confirming the graphene quality.
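As a small illustration of how the Raman analysis mentioned above is usually quantified, the sketch below extracts the I_D/I_G ratio (defect density relative to structural quality) and the I_2D/I_G ratio (a common indicator of layer number) from a spectrum. The peak windows are the ranges commonly quoted for graphene and are assumptions for illustration, not values from this study.

```python
import numpy as np

def raman_ratios(shift_cm1, intensity):
    """Return (I_D/I_G, I_2D/I_G) from a Raman spectrum.

    shift_cm1: Raman shift axis in cm^-1
    intensity: measured intensity (arbitrary units)
    Peak windows (assumed): D ~1300-1400, G ~1560-1620, 2D ~2600-2750 cm^-1.
    """
    shift = np.asarray(shift_cm1, dtype=float)
    inten = np.asarray(intensity, dtype=float)

    def peak_height(lo, hi):
        # Maximum intensity inside the window serves as the peak height.
        mask = (shift >= lo) & (shift <= hi)
        return inten[mask].max()

    i_d = peak_height(1300.0, 1400.0)
    i_g = peak_height(1560.0, 1620.0)
    i_2d = peak_height(2600.0, 2750.0)
    return i_d / i_g, i_2d / i_g
```

A low I_D/I_G is usually read as a low defect density, while a high I_2D/I_G suggests mono- or few-layer graphene; peak fitting (e.g., Lorentzian) would refine these simple window maxima.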

Keywords: oil palm shell, bio-graphene, pre-treatment, slow pyrolysis

Procedia PDF Downloads 65
14167 Molecular Biomonitoring of Bacterial Pathogens in Wastewater

Authors: Desouky Abd El Haleem, Sahar Zaki

Abstract:

This work was conducted to develop a one-step multiplex PCR system for the rapid, sensitive, and specific detection of three different bacterial pathogens, Escherichia coli, Pseudomonas aeruginosa, and Salmonella spp., directly in wastewater without prior isolation on selective media. As a molecular confirmatory test after isolation of the pathogens by classical microbiological methods, PCR-RFLP of their amplified 16S rDNA genes was performed. The developed protocols were observed to detect the three pathogens directly in water sensitively, rapidly, and specifically within a short time, which represents a considerable advancement over more time-consuming and less sensitive methods for identifying and characterizing these kinds of pathogens.

Keywords: multiplex PCR, bacterial pathogens, Escherichia coli, Pseudomonas aeruginosa, Salmonella spp.

Procedia PDF Downloads 433
14166 Investigation of Long-Term Thermal Insulation Performance of Vacuum Insulation Panels with Various Enveloping Methods

Authors: Inseok Yeo, Tae-Ho Song

Abstract:

To apply vacuum insulation panels (VIPs) practically in buildings or home appliances, VIPs must provide a long service life with outstanding insulation performance. The service lives of VIPs enveloped with Al-foil and with a three-layer Al-metallized envelope are calculated. For the Al-foil envelope, the service life is longer, but edge conduction is too large compared with the Al-metallized envelope. To increase the service life even further, the proposed double enveloping method and metal-barrier-added enveloping method are analyzed. The service lives of VIPs employing these two enveloping methods are calculated, and the pressure increase and thermal insulation performance characteristics are investigated. For the metal-barrier-added enveloping method, the increase of effective thermal conductivity with time is close to that of the Al-foil envelope, especially for getter-inserted VIPs. For the double enveloping method, if water vapor is perfectly adsorbed, the service life enhancement becomes much greater. With these methods, a VIP service life of more than 20 years can be guaranteed.
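To make the service-life logic concrete, the sketch below estimates the time at which a VIP's effective conductivity exceeds an end-of-life limit as internal pressure rises. The gas-conduction model λ_g(p) = λ_g,free / (1 + p_half/p), the linear pressure rise, and all numbers are illustrative assumptions, not the authors' measured data.

```python
def vip_service_life(k_limit, k_evacuated, dp_dt_pa_per_year,
                     k_gas_free=0.026, p_half=6000.0):
    """Years until the effective conductivity reaches k_limit [W/(m K)].

    Assumed model: k_eff(p) = k_evacuated + k_gas_free / (1 + p_half / p),
    with internal pressure rising linearly at dp_dt_pa_per_year [Pa/year].
    p_half is the pressure at which gas conduction reaches half its
    free-air value (it depends on the core's pore size).
    """
    dk = k_limit - k_evacuated
    if dk <= 0 or dk >= k_gas_free:
        raise ValueError("k_limit must lie between k_evacuated and "
                         "k_evacuated + k_gas_free")
    # Invert k_eff(p) = k_limit for the critical internal pressure.
    p_critical = p_half / (k_gas_free / dk - 1.0)
    return p_critical / dp_dt_pa_per_year
```

For example, with a 0.010 W/(m K) end-of-life limit, a 0.004 W/(m K) evacuated panel, and a 90 Pa/year pressure rise, this model returns a 20-year life; a getter or a double envelope enters simply as a smaller effective dp/dt.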

Keywords: vacuum insulation panels, service life, double enveloping, metal-barrier-added enveloping, edge conduction

Procedia PDF Downloads 416
14165 Comparison of Finite-Element and IEC Methods for Cable Thermal Analysis under Various Operating Environments

Authors: M. S. Baazzim, M. S. Al-Saud, M. A. El-Kady

Abstract:

In this paper, the steady-state ampacity (current-carrying capacity) of an underground power cable system is evaluated using analytical and numerical methods under different conditions (cable depth, spacing between phases, soil thermal resistivity, ambient temperature, and wind speed) for two system voltage levels, 132 and 380 kV. The analytical (traditional) method is based on the thermal analysis method developed by Neher and McGrath, further enhanced by the International Electrotechnical Commission (IEC) and published in standard IEC 60287. The numerical method is the finite element method, applied through commercial finite-element software.
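For a single-core buried cable with dielectric and sheath/armour losses neglected, the IEC 60287 / Neher-McGrath rating reduces to a thermal Ohm's-law balance, I = sqrt(Δθ / (R_ac (T1+T2+T3+T4))). The sketch below implements this simplified form; the example values are illustrative, not the 132 kV or 380 kV cases of the paper.

```python
import math

def iec_ampacity_simplified(delta_theta, r_ac, t1, t2, t3, t4):
    """Steady-state ampacity [A] of a single-core buried cable.

    Simplified IEC 60287 form (assumption for illustration): dielectric
    losses and sheath/armour loss factors are neglected.
    delta_theta: conductor-to-ambient temperature rise [K]
    r_ac: AC conductor resistance at operating temperature [ohm/m]
    t1-t4: thermal resistances of insulation, bedding, outer serving,
           and surrounding soil [K*m/W]
    """
    total_thermal_resistance = t1 + t2 + t3 + t4
    return math.sqrt(delta_theta / (r_ac * total_thermal_resistance))
```

Burial depth and soil thermal resistivity enter through the soil term T4, so deeper burial or drier soil directly lowers the rating, which is the trend both the IEC and finite-element calculations should reproduce.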

Keywords: cable ampacity, finite element method, underground cable, thermal rating

Procedia PDF Downloads 361
14164 Survey of Methods for Solutions of Spatial Covariance Structures and Their Limitations

Authors: Joseph Thomas Eghwerido, Julian I. Mbegbu

Abstract:

In modelling environmental processes, we apply multidisciplinary knowledge to explain, explore, and predict the Earth's response to natural and human-induced environmental changes. In the analysis of spatio-temporal ecological and environmental studies, the spatial parameters of interest are usually heterogeneous, which often negates the assumption of stationarity. Hence, modelling the dispersion and transport of atmospheric pollutants, landscape or topographic effects, and weather patterns depends on a good estimate of the spatial covariance. Although the generalized linear mixed model is linear in the expected-value parameters, its likelihood varies nonlinearly as a function of the covariance parameters. As a consequence, computing estimates for a linear mixed model requires the iterative solution of a system of simultaneous nonlinear equations. In order to predict the variables at unsampled locations, we need good estimates from the sampled locations. The geostatistical methods for solving this spatial problem assume covariance stationarity (a locally defined covariance that is uniform in space), which is often not valid because spatial processes frequently exhibit nonstationary covariance and hence require a globally defined covariance. We consider different existing methods for estimating the spatial covariance of a space-time process at unsampled locations, in which the covariance changes with location across multiple time sets, together with their asymptotic properties and limitations.
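As a baseline for the methods surveyed, the sketch below implements simple kriging under exactly the stationarity assumption the paper questions: a single, globally valid exponential covariance C(h) = σ² exp(−h/r) with a known constant mean. Nonstationary approaches replace this fixed covariance with locally varying parameters; all names and values here are illustrative.

```python
import numpy as np

def simple_kriging(xs, zs, x0, sill=1.0, corr_range=10.0, mean=0.0):
    """Simple-kriging prediction at x0 from 1D samples (xs, zs).

    Assumes a stationary exponential covariance
        C(h) = sill * exp(-h / corr_range)
    and a known constant mean -- the stationarity assumption that
    nonstationary methods relax.
    """
    xs = np.asarray(xs, dtype=float)
    zs = np.asarray(zs, dtype=float)
    # Covariance between every pair of sample locations ...
    h = np.abs(xs[:, None] - xs[None, :])
    cov = sill * np.exp(-h / corr_range)
    # ... and between each sample and the prediction point.
    c0 = sill * np.exp(-np.abs(xs - x0) / corr_range)
    # Kriging weights solve cov @ w = c0.
    weights = np.linalg.solve(cov, c0)
    return mean + weights @ (zs - mean)
```

At a sampled location the predictor reproduces the observation exactly; far beyond the correlation range it reverts to the assumed mean. When the true covariance varies in space, both behaviors can be badly miscalibrated, which is the limitation the surveyed nonstationary methods address.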

Keywords: parametric, nonstationary, kernel, kriging

Procedia PDF Downloads 238