Search results for: subjective evaluation.
155 Elaboration and Validation of a Survey about Research on the Characteristics of Mentoring of University Professors’ Lifelong Learning
Authors: Nagore Guerra Bilbao, Clemente Lobato Fraile
Abstract:
This paper outlines the design and development of the MENDEPRO questionnaire, created to analyze mentoring performance within a professional development process carried out with professors at the University of the Basque Country, Spain. The study took into account the international research carried out over the past two decades into teachers' professional development, and was also based on a thorough review of the most common instruments used to identify and analyze mentoring styles, many of which fail to provide sufficient psychometric guarantees. The present study aimed to gather empirical data in order to verify the metric quality of the questionnaire developed. To this end, the theoretical construct was validated as follows: formulation of the items and indicators in accordance with the study variables; analysis of the validity and reliability of the initial questionnaire; and review of the second version of the questionnaire, leading to the definitive measurement instrument. Content was validated through the formal agreement and consensus of 12 experts in university professor training. A reduced sample of professors who had participated in a lifelong learning program was then selected for a trial evaluation of the instrument developed. After the trial, 18 items were removed from the initial questionnaire. The final version of the instrument, comprising 33 items, was then administered to a sample group of 99 participants. The results revealed a five-dimensional structure matching theoretical expectations. The reliability data for both the instrument as a whole (.98) and its various dimensions (between .91 and .97) were also very high. The questionnaire was thus found to have satisfactory psychometric properties and can therefore be considered apt for studying the performance of mentoring in both induction programs for young professors and lifelong learning programs for senior faculty members.
Keywords: Higher education, mentoring, professional development, university teachers.
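A minimal sketch of how an internal-consistency statistic such as Cronbach's alpha can be computed is given below, since values of this kind underlie reliability figures like those reported above (.98 for the whole instrument, .91-.97 for its dimensions). The item-response matrix is an illustrative assumption; the actual MENDEPRO data and the specific reliability coefficient used by the authors are not reproduced here.

```python
# Hedged sketch: Cronbach's alpha for one dimension of a Likert-type survey.
import numpy as np

def cronbach_alpha(items):
    """items: (respondents, items) matrix of Likert-type scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()   # sum of per-item variances
    total_var = items.sum(axis=1).var(ddof=1)     # variance of total scores
    return (k / (k - 1)) * (1.0 - item_vars / total_var)

# Example: 5 respondents answering 4 items of one dimension (made-up scores).
scores = [[4, 5, 4, 5], [3, 3, 4, 3], [5, 5, 5, 4], [2, 3, 2, 3], [4, 4, 5, 4]]
alpha = cronbach_alpha(scores)
```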
154 A Damage Level Assessment Model for Extra High Voltage Transmission Towers
Authors: Huan-Chieh Chiu, Hung-Shuo Wu, Chien-Hao Wang, Yu-Cheng Yang, Ching-Ya Tseng, Joe-Air Jiang
Abstract:
Power failure resulting from tower collapse due to violent seismic events can bring enormous and inestimable losses. The Chi-Chi earthquake, for example, struck Taiwan on September 21, 1999 and caused huge damage to the power system: nearly 10% of extra high voltage (EHV) transmission towers were damaged. Therefore, the seismic hazards of EHV transmission towers should be monitored and evaluated. The ultimate goal of this study is to establish a damage level assessment model for EHV transmission towers. Earthquake data provided by the Taiwan Central Weather Bureau serve as a reference and lay the foundation for the subsequent earthquake simulations and analyses. Parameters related to the damage level of each point of an EHV tower are simulated and analyzed from monitoring-station data once an earthquake occurs. Through the Fourier transform, the seismic wave is decomposed into its frequency components, and the data are presented as a response spectrum. With this method, the seismic frequency that damages EHV towers the most is clearly identified. An estimation model is then built to determine the damage level caused by a future seismic event. Finally, instead of relying on visual observation by inspectors, the proposed model can provide a power company with the damage information of a transmission tower. Using the model, the manpower required for visual observation can be reduced, and the accuracy of the damage level estimation can be substantially improved. Such a model is greatly useful for health and construction monitoring because of the advantages of long-term evaluation of structural characteristics and long-term damage detection.
Keywords: Smart grid, EHV transmission tower, response spectrum, damage level monitoring.
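As a rough illustration of the frequency-identification step described above, the sketch below estimates the dominant frequency of an accelerogram with the Fourier transform. The synthetic signal and sampling rate are assumptions; the study itself works with Central Weather Bureau records and full response spectra.

```python
# Hedged sketch: dominant frequency of a ground-motion record via FFT.
import numpy as np

def dominant_frequency(accel, dt):
    """accel: acceleration time history (m/s^2); dt: sampling interval (s)."""
    spectrum = np.abs(np.fft.rfft(accel))
    freqs = np.fft.rfftfreq(len(accel), d=dt)
    return freqs[np.argmax(spectrum[1:]) + 1]   # skip the DC component

# Synthetic example: 2 Hz dominant motion plus noise, sampled at 100 Hz.
dt = 0.01
t = np.arange(0, 40, dt)
accel = np.sin(2 * np.pi * 2.0 * t) + 0.2 * np.random.randn(len(t))
f_peak = dominant_frequency(accel, dt)          # approximately 2 Hz
```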
153 Nanomaterial Based Electrochemical Sensors for Endocrine Disrupting Compounds
Authors: Gaurav Bhanjana, Ganga Ram Chaudhary, Sandeep Kumar, Neeraj Dilbaghi
Abstract:
The main sources of endocrine disrupting compounds in the ecosystem are hormones, pesticides, phthalates, flame retardants, dioxins, personal-care products, coplanar polychlorinated biphenyls (PCBs), bisphenol A, and parabens. These endocrine disrupting compounds are responsible for learning disabilities, brain development problems, deformations of the body, cancer, reproductive abnormalities in females and decreased sperm count in human males. Although the discharge of these chemical compounds into the environment cannot be stopped, their amounts can be reduced through proper evaluation and detection techniques. The available techniques for determination of these endocrine disrupting compounds mainly include high performance liquid chromatography (HPLC), mass spectrometry (MS) and gas chromatography-mass spectrometry (GC-MS). These techniques are accurate and reliable but have certain limitations, such as the need for skilled personnel, long analysis times, interference, and pretreatment steps. Moreover, these techniques are laboratory-bound and require large sample volumes for analysis. In view of these facts, new methods for the detection of endocrine disrupting compounds should be devised that promise high specificity, ultra-sensitivity, cost-effectiveness, efficiency and an easy-to-operate procedure. Nowadays, electrochemical sensors and biosensors modified with nanomaterials are gaining high attention among researchers. The biorecognition element present in such systems makes the developed sensors selective towards the analyte of interest. Nanomaterials provide a large surface area, high electron communication features, enhanced catalytic activity and possibilities for chemical modification. In most cases, nanomaterials also serve as an electron mediator or electrocatalyst for some analytes.
Keywords: Sensors, endocrine disruptors, nanoparticles, electrochemical, microscopy.
152 Loss Function Optimization for CNN-Based Fingerprint Anti-Spoofing
Authors: Yehjune Heo
Abstract:
As biometric systems become widely deployed, the security of identification systems can be easily attacked with various spoof materials. This paper contributes to finding a reliable and practical anti-spoofing method using Convolutional Neural Networks (CNNs), based on the types of loss functions and optimizers. The CNNs used in this paper include AlexNet, VGGNet, and ResNet. By using various loss functions, including Cross-Entropy, Center Loss, Cosine Proximity, and Hinge Loss, and various optimizers, including Adam, SGD, RMSProp, Adadelta, Adagrad, and Nadam, we obtained significant performance changes. We find that choosing the correct loss function for each model is crucial, since different loss functions lead to different errors on the same evaluation. By using a subset of the LivDet 2017 database, we validate our approach and compare generalization power. It is important to note that the same subset of LivDet is used across all training and testing for each model. This way, we can compare the performance, in terms of generalization, on unseen data across all different models. The best CNN (AlexNet) with the appropriate loss function and optimizer results in a performance gain of more than 3% over the other CNN models with the default loss function and optimizer. In addition to the highest generalization performance, this paper also reports the models' accuracy together with their parameter counts and mean average error rates, to find the model that consumes the least memory and computation time for training and testing. Although AlexNet has lower complexity than the other CNN models, it proves to be very efficient. For practical anti-spoofing systems, the deployed version should use a small amount of memory and should run very fast with high anti-spoofing performance. For our deployed version on smartphones, additional processing steps, such as quantization and pruning algorithms, have been applied in our final model.
Keywords: Anti-spoofing, CNN, fingerprint recognition, loss function, optimizer.
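The sketch below illustrates the kind of loss-function and optimizer grid search the abstract describes, using a small stand-in CNN in Keras. The architecture, image size, and epoch count are assumptions; the paper's AlexNet/VGGNet/ResNet models and the LivDet 2017 data pipeline are not reproduced, and Center Loss would require a custom implementation.

```python
# Hedged sketch: grid over loss functions and optimizers for a small CNN.
import tensorflow as tf

def build_cnn(input_shape=(96, 96, 1), n_classes=2):
    # Minimal CNN used only to illustrate the training loop (not AlexNet/VGG/ResNet).
    return tf.keras.Sequential([
        tf.keras.layers.Conv2D(32, 3, activation="relu", input_shape=input_shape),
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Conv2D(64, 3, activation="relu"),
        tf.keras.layers.GlobalAveragePooling2D(),
        tf.keras.layers.Dense(n_classes, activation="softmax"),
    ])

losses = {
    "cross_entropy": tf.keras.losses.CategoricalCrossentropy(),
    "hinge": tf.keras.losses.CategoricalHinge(),
    "cosine": tf.keras.losses.CosineSimilarity(),
    # Center loss is not built into Keras; it needs a custom layer/loss.
}
optimizers = ["adam", "sgd", "rmsprop", "adadelta", "adagrad", "nadam"]

def grid_search(x_train, y_train, x_val, y_val, epochs=5):
    """Labels are expected one-hot encoded; returns best val accuracy per combo."""
    results = {}
    for loss_name, loss in losses.items():
        for opt in optimizers:
            model = build_cnn()
            model.compile(optimizer=opt, loss=loss, metrics=["accuracy"])
            hist = model.fit(x_train, y_train, validation_data=(x_val, y_val),
                             epochs=epochs, verbose=0)
            results[(loss_name, opt)] = max(hist.history["val_accuracy"])
    return results
```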
151 Rheological Properties of Dough and Sensory Quality of Crackers with Dietary Fibers
Authors: Ljubica Dokić, Ivana Nikolić, Dragana Šoronja–Simović, Zita Šereš, Biljana Pajin, Nils Juul, Nikola Maravić
Abstract:
This work examined the possibility of applying dietary fibers in the production of crackers, as well as their influence on the rheological and textural properties of the cracker dough and on the sensory properties of the obtained crackers. Three different dietary fibers, oat, potato and pea fibers, replaced 10% of the wheat flour. A long fermentation process and the baking test method were used for cracker production. Changes in the cracker dough were observed by rheological methods determining the viscoelastic dough properties and by textural measurements. The sensory quality of the obtained crackers was described using quantitative descriptive analysis (QDA) by trained members of a descriptive panel. Additional analysis of the cracker surface was performed with a videometer. Based on the rheological determinations, the viscoelastic properties of the cracker dough were reduced by the application of dietary fibers. Dough with 10% potato fiber could not be handled, so the recipe was modified by increasing the water content to 35%. Dough compliance under constant stress decreased for the samples with dietary fibers, due to a more rigid and stiffer dough consistency compared to the control sample. The hardness of the dough for these samples also increased, while dough extensibility decreased. The sensory properties of the final crackers were reduced compared to the control sample. The application of dietary fibers mostly affected the hardness, structure and crispness of the crackers. The crackers received low scores for flavor and taste due to the specific aroma of the fibers. The sample with 10% potato fiber and increased water content was the most adaptable to the applied stresses and to the production process. This sample was also close to the control sample without dietary fibers in the evaluation of sensory properties and in the results of the videometer method.
Keywords: Crackers, dietary fibers, rheology, sensory properties.
150 LTE Performance Analysis in the City of Bogota Northern Zone for Two Different Mobile Broadband Operators over Qualipoc
Authors: Víctor D. Rodríguez, Edith P. Estupiñán, Juan C. Martínez
Abstract:
The evolution of mobile broadband technologies has made it possible to increase user download rates for current services. The evaluation of technical parameters at the link level is of vital importance to validate the quality and veracity of the connection, thus avoiding large losses of data, time and productivity. Some of these failures may occur between the eNodeB (Evolved Node B) and the user equipment (UE), so the link between the end device and the base station must be observed. LTE (Long Term Evolution) is considered one of the IP-oriented mobile broadband technologies that work stably for data and, on devices with that feature, for VoIP (Voice over IP). This research presents a technical analysis of the connection and channeling processes between the UE and the eNodeB using the TAC (Tracking Area Code) variables, together with an analysis of performance variables (throughput, Signal to Interference and Noise Ratio (SINR)). Three measurement scenarios were proposed in the city of Bogotá using QualiPoc, in which two operators were evaluated (Operator 1 and Operator 2). Once the data were obtained, an analysis of the variables was performed, determining that the data obtained in the transmission modes vary depending on the parameters BLER (Block Error Rate), performance and SNR (Signal-to-Noise Ratio). For both operators, differences in transmission modes were detected, and this is reflected in the quality of the signal. In addition, because the two operators work at different frequencies, it can be seen that Operator 1, despite having spectrum in Band 7 (2600 MHz) like Operator 2, is reassigning traffic to a lower band, AWS (1700 MHz). The difference in signal quality with respect to the data connection established by Operator 2, and the difference in the transmission modes determined by the eNodeB for Operator 1, are remarkable.
Keywords: BLER, LTE, Network, Qualipoc, SNR.
149 Alignment of e-Government Policy Formulation with Practical Implementation: The Case of Sub-Saharan Africa
Authors: W. Munyoka, F. M. Manzira
Abstract:
The purpose of this study is to analyze how the varying alignment of e-Government policies in four countries in the Sub-Saharan Africa region, namely South Africa, Seychelles, Mauritius and Cape Verde, leads to the success or failure of e-Government, and what should be done to ensure positive alignment that leads to e-Government project growth. In addition, the study aims to understand how various governments' efforts in e-Government awareness campaign strategies, international cooperation, functional literacy and anticipated organizational change can influence implementation.
This study extensively explores contemporary research undertaken in the field of e-Government and examines the respective national ICT policies, strategies and implemented e-Government projects for an in-depth comprehension of the status quo. Data are analyzed qualitatively and quantitatively to reach a conclusion.
The study found that resounding successes in strategic e-Government alignment were achieved in Seychelles, Mauritius, South Africa and Cape Verde (ranked 1 to 4, respectively).
The implication of the study is that policy makers in developing countries should put mechanisms in place for constant monitoring and evaluation of project implementation in line with ICT policies, to ensure that e-Government projects reach maturity levels and do not die midway through implementation, as is often observed in many countries. The study recommends that countries within the region make concerted collaborative efforts and build synergies with private sector players and international donor agencies to achieve the implementation part of the set ICT policies.
Keywords: E-Government, ICT policy alignment, implementation, Sub-Saharan Africa.
148 Seismic Fragility Assessment of Strongback Steel Braced Frames Subjected to Near-Field Earthquakes
Authors: Mohammadreza Salek Faramarzi, Touraj Taghikhany
Abstract:
In this paper, the seismic fragility of a recently developed hybrid structural system, known as the strongback system (SBS), is investigated. In this system, an elastic vertical truss is formed to mitigate the occurrence of the soft-story mechanism and improve the distribution of story drifts over the height of the structure. The strengthened members of the braced span are designed to remain substantially elastic during levels of excitation where soft-story mechanisms are likely to occur and to impose a nearly uniform story drift distribution. Due to the distinctive characteristics of near-field ground motions, it is necessary to study the effect of these records on the seismic performance of the SBS. To this end, a set of 56 near-field ground motion records suggested by the FEMA P695 methodology is used. For the fragility assessment, nonlinear dynamic analyses are carried out in OpenSees based on the procedure recommended in the HAZUS technical manual. Four damage states, including slight, moderate, extensive, and complete damage (collapse), are considered. To evaluate each damage state, the inter-story drift ratio and the floor acceleration are used as engineering demand parameters. Further, to extend the evaluation of the collapse state of the system, a different collapse criterion suggested in FEMA P695 is applied. It is concluded that the SBS can significantly increase the collapse capacity and consequently decrease the collapse risk of the structure during its lifetime. Comparing the observed mean annual frequency (MAF) of exceedance of each damage state against the allowable values presented in performance-based design methods, it is found that using the elastic vertical truss effectively improves the structural response.
Keywords: Strongback system, near-fault, seismic fragility, uncertainty, IDA, probabilistic performance assessment.
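A minimal sketch of fitting a lognormal fragility curve to damage-state exceedance counts from nonlinear dynamic analyses is shown below; this is the generic maximum-likelihood form of such fits, not necessarily the exact HAZUS/FEMA P695 procedure followed in the paper, and the intensity levels and counts are made up.

```python
# Hedged sketch: lognormal fragility fit P(exceed | IM) = Phi((ln IM - ln theta)/beta).
import numpy as np
from scipy import stats, optimize

def fit_lognormal_fragility(im_levels, n_exceed, n_total):
    """Maximum-likelihood estimate of median capacity theta and dispersion beta."""
    def neg_log_like(params):
        theta, beta = params
        p = stats.norm.cdf(np.log(im_levels / theta) / beta)
        p = np.clip(p, 1e-9, 1 - 1e-9)
        return -np.sum(stats.binom.logpmf(n_exceed, n_total, p))
    res = optimize.minimize(neg_log_like, x0=[np.median(im_levels), 0.4],
                            bounds=[(1e-3, None), (1e-3, None)])
    return res.x

# Illustrative (made-up) counts of analyses exceeding a damage state:
im = np.array([0.2, 0.4, 0.6, 0.8, 1.0, 1.2])   # spectral acceleration, g
exceed = np.array([0, 3, 12, 30, 45, 52])
total = np.full_like(exceed, 56)                # 56 near-field records
theta, beta = fit_lognormal_fragility(im, exceed, total)
```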
147 Quality Evaluation of Grape Seed Oils of the Ionian Islands Based on GC-MS and Other Spectroscopic Techniques
Authors: I. Oikonomou, I. Lappa, D. Daferera, C. Kanakis, L. Kiokakis, K. Skordilis, A. Avramouli, E. Kalli, C. Pappas, P. A. Tarantilis, E. Skotti
Abstract:
Grape seeds are waste products of wineries and are often referred to as an important agricultural and industrial waste product with the potential to be used in pharmaceutical, food, and cosmetic applications. In this study, grape seed oil from traditional Ionian varieties was examined to determine the quality and the characteristics of each variety. Initially, the fatty acid methyl ester (FAME) profiles were analyzed using gas chromatography-mass spectrometry after transesterification. Furthermore, other quality parameters of the grape seed oils were determined by spectroscopic techniques, including UV-Vis and Raman. Moreover, the antioxidant capacity of the oils was measured by 2,2'-azino-bis-3-ethylbenzothiazoline-6-sulfonic acid (ABTS) and 2,2-diphenyl-1-picrylhydrazyl (DPPH) assays and expressed in Trolox equivalents. K and ΔK indices were measured at 232, 268 and 270 nm as oil quality indices. The results indicate that the air-dried grape seed total oil content ranged from 5.26 to 8.77% w/w, which is in accordance with other grape seed varieties tested in similar studies. The composition of the grape seed oils is dominated by linoleic and oleic fatty acids, with linoleic acid ranging from 53.68 to 69.95% and linoleic and oleic acids together totaling 78-82% of FAMEs, which is analogous to the fatty acid composition of safflower oil. The ABTS and DPPH antioxidant assays scored high, showing that the oils have potential in the cosmetic and culinary businesses. Beyond that, our results demonstrate that Ionian grape seed oils have prospects that go further than cosmetic or culinary use, into the pharmaceutical industry. Finally, the reclamation of grape seeds from the winery waste stream is in accordance with the bio-economy strategic framework and contributes to environmental protection.
Keywords: Antioxidant capacity, fatty acid methyl esters, grape seed oil, GC-MS.
146 An Evaluation on the Effectiveness of a 3D Printed Composite Compression Mold
Authors: Peng Hao Wang, Garam Kim, Ronald Sterkenburg
Abstract:
The applications of composite materials within the aviation industry have been increasing at a rapid pace. However, the growing applications of composite materials have also led to growing demand for tooling to support the associated manufacturing processes. Tooling and tooling maintenance represent a large portion of the composite manufacturing process and cost. Therefore, the industry's adaptability to new techniques for fabricating high-quality tools quickly and inexpensively will play a crucial role in the growing popularity of composite materials in the aviation industry. One popular tool fabrication technique currently being developed involves additive manufacturing, such as 3D printing. Although additive manufacturing and 3D printing are not entirely new concepts, the technique has been gaining popularity due to its ability to fabricate components quickly, with low material waste and low cost. In this study, a team of Purdue University School of Aviation and Transportation Technology (SATT) faculty and students investigated the effectiveness of a 3D printed composite compression mold. The mold was fabricated by 3D scanning a steel valve cover of an aircraft reciprocating engine and was then used to fabricate carbon fiber versions of the valve cover. The 3D printed composite compression mold was evaluated for its performance, durability, and dimensional stability, while the fabricated carbon fiber valve covers were evaluated for their accuracy and quality. The results and data gathered from this study will determine the effectiveness of the 3D printed composite compression mold in a mass production environment and provide valuable information for future understanding, improvements, and design considerations of 3D printed composite molds.
Keywords: Additive manufacturing, carbon fiber, composite tooling, molds.
145 Multi-Sensor Image Fusion for Visible and Infrared Thermal Images
Authors: Amit Kr. Happy
Abstract:
This paper is motivated by the importance of multi-sensor image fusion, with a specific focus on Infrared (IR) and Visible (VI) image fusion for various applications, including military reconnaissance. Image fusion can be defined as the process of combining two or more source images into a single composite image with extended information content that improves visual perception or feature extraction. These images can come from different modalities, such as a visible camera and an IR thermal imager. While visible images are captured from reflected radiation in the visible spectrum, thermal images are formed from thermal (IR) radiation that may be reflected or self-emitted. A digital color camera captures the visible source image, and a thermal IR camera acquires the thermal source image. In this paper, image fusion algorithms based upon a Multi-Scale Transform (MST) and a region-based selection rule with consistency verification are proposed and presented. This research includes an implementation of the proposed image fusion algorithm in MATLAB, along with a comparative analysis to decide the optimum number of levels for the MST and the coefficient fusion rule. The results are presented, and several commonly used evaluation metrics are used to assess the validity of the suggested method. Experiments show that the proposed approach is capable of producing good fusion results. While deploying our image fusion approach, we observed several challenges with popular image fusion methods: although their high computational cost and complex processing steps provide accurate fused results, they also make them hard to deploy in systems and applications that require real-time operation, high flexibility and low computational capacity. The methods presented in this paper therefore offer good results with minimal time complexity.
Keywords: Image fusion, IR thermal imager, multi-sensor, Multi-Scale Transform.
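A simplified multi-scale-transform fusion is sketched below using a wavelet decomposition and a max-absolute-coefficient rule for the detail bands. The paper's specific MST, region-based selection rule, and consistency verification step are not reproduced; this only illustrates the coefficient-fusion idea, and the wavelet choice and level count are assumptions.

```python
# Hedged sketch: wavelet-based fusion of a visible and an IR image.
import numpy as np
import pywt

def fuse_mst(visible, infrared, wavelet="db2", levels=3):
    """Both inputs are 2-D float arrays of the same shape, scaled to [0, 1]."""
    c_vis = pywt.wavedec2(visible, wavelet, level=levels)
    c_ir = pywt.wavedec2(infrared, wavelet, level=levels)
    fused = [0.5 * (c_vis[0] + c_ir[0])]          # average the approximation band
    for d_vis, d_ir in zip(c_vis[1:], c_ir[1:]):  # detail bands: keep larger magnitude
        fused.append(tuple(np.where(np.abs(a) >= np.abs(b), a, b)
                           for a, b in zip(d_vis, d_ir)))
    return pywt.waverec2(fused, wavelet)
```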
144 Development of Wave-Dissipating Block Installation Simulation for Inexperienced Worker Training
Authors: Hao Min Chuah, Tatsuya Yamazaki, Ryosui Iwasawa, Tatsumi Suto
Abstract:
In recent years, with the advancement of digital technology, the movement to introduce so-called ICT (Information and Communication Technology), such as computer and network technology, to civil engineering and construction sites has been accelerating. As part of this movement, attempts are being made in various situations to reproduce actual sites inside computers and use them for design and construction planning, as well as for training inexperienced engineers. The installation of wave-dissipating blocks on coasts is a type of work that has been carried out by skilled workers based on their years of experience and is one of the tasks that is difficult for inexperienced workers to carry out on site. Wave-dissipating blocks are structures designed to protect coasts and beaches from erosion by reducing the energy of ocean waves. They usually weigh more than 1 t and are installed while suspended from a crane, so it would be time-consuming and costly for inexperienced workers to train on-site. In this paper, therefore, a block installation simulator is developed based on Unity 3D, a game development engine. The simulator computes porosity, defined here as the ratio of the total volume of the wave-dissipating blocks inside the structure to the volume of the final shape of the ideal structure. Using this porosity evaluation, the simulator can determine how well the user is able to install the blocks. The voxelization technique is used to calculate the porosity of the structure, simplifying the calculations. Other techniques, such as raycasting and box overlapping, are employed for accurate simulation. In the near future, the simulator will incorporate an automatic block installation algorithm based on combinatorial optimization solutions and compare the user-demonstrated block installation with the installation obtained by the algorithm.
Keywords: 3D simulator, porosity, user interface, voxelization, wave-dissipating blocks.
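The porosity metric described above can be illustrated offline with a voxel-based volume ratio, as in the sketch below. The simulator itself is built in Unity 3D with raycasting and box overlapping; here trimesh is used only as a stand-in, and the voxel pitch and mesh inputs are assumptions.

```python
# Hedged sketch: voxel-based estimate of the block-volume ratio ("porosity" above).
import numpy as np
import trimesh

def porosity(block_meshes, ideal_region_mesh, pitch=0.25):
    """Ratio of block volume inside the ideal region to the region's volume,
    approximated on a voxel grid with the given pitch (metres)."""
    region = ideal_region_mesh.voxelized(pitch).fill()
    region_pts = region.points                       # centres of filled voxels
    inside = np.zeros(len(region_pts), dtype=bool)
    for mesh in block_meshes:
        inside |= mesh.contains(region_pts)          # voxel centre inside a block?
    return np.count_nonzero(inside) / len(region_pts)
```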
143 Hyperspectral Imaging and Nonlinear Fukunaga-Koontz Transform Based Food Inspection
Authors: Hamidullah Binol, Abdullah Bal
Abstract:
Nowadays, food safety is a great public concern; therefore, robust and effective techniques are required for assessing the safety of goods. Hyperspectral Imaging (HSI) is an attractive tool for researchers inspecting food quality and safety, with applications such as meat quality assessment, automated poultry carcass inspection, quality evaluation of fish, bruise detection of apples, quality analysis and grading of citrus fruits, bruise detection of strawberries, visualization of the sugar distribution of melons, measuring the ripening of tomatoes, defect detection of pickling cucumbers, and classification of wheat kernels. HSI can be used to concurrently collect large amounts of spatial and spectral data on the objects being observed. The technique yields exceptional detection capability that cannot otherwise be achieved with either imaging or spectroscopy alone. This paper presents a nonlinear technique based on the kernel Fukunaga-Koontz transform (KFKT) for the detection of fat content in ground meat using HSI. The KFKT, the nonlinear version of the FKT, is one of the most effective techniques for solving problems of a two-pattern nature. The conventional FKT method has been improved with kernel machines to increase the nonlinear discrimination ability and capture higher-order statistics of the data. The approach proposed in this paper aims to segment the fat content of the ground meat by regarding the fat as the target class, which is to be separated from the remaining classes (as clutter). We applied the KFKT to visible and near-infrared (VNIR) hyperspectral images of ground meat to determine the fat percentage. The experimental studies indicate that the proposed technique produces high detection performance for the fat ratio in ground meat.
Keywords: Food (ground meat) inspection, Fukunaga-Koontz transform, hyperspectral imaging, kernel methods.
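For orientation, the sketch below implements the linear Fukunaga-Koontz transform for a two-class (target versus clutter) setting; the paper's kernelized version (KFKT) builds on the same shared-eigenspace idea but operates in a kernel-induced feature space, which is not reproduced here.

```python
# Hedged sketch: linear Fukunaga-Koontz transform for target/clutter separation.
import numpy as np

def fkt(target_pixels, clutter_pixels):
    """Rows are samples (pixels), columns are spectral bands."""
    r1 = np.cov(target_pixels, rowvar=False)
    r2 = np.cov(clutter_pixels, rowvar=False)
    # Whiten the summed covariance so that R1~ + R2~ = I.
    eigval, eigvec = np.linalg.eigh(r1 + r2)
    transform = eigvec @ np.diag(1.0 / np.sqrt(np.maximum(eigval, 1e-12)))
    r1_tilde = transform.T @ r1 @ transform
    # Eigenvectors of R1~ with large eigenvalues favour the target class and,
    # by construction, have small eigenvalues for the clutter class.
    lam, phi = np.linalg.eigh(r1_tilde)
    return transform @ phi, lam   # shared basis and target-class eigenvalues
```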
142 Evaluating Generative Neural Attention Weights-Based Chatbot on Customer Support Twitter Dataset
Authors: Sinarwati Mohamad Suhaili, Naomie Salim, Mohamad Nazim Jambli
Abstract:
Sequence-to-sequence (seq2seq) models augmented with attention mechanisms are increasingly important in automated customer service. These models, adept at recognizing complex relationships between input and output sequences, are essential for optimizing chatbot responses. Central to these mechanisms are neural attention weights that determine the model’s focus during sequence generation. Despite their widespread use, there remains a gap in the comparative analysis of different attention weighting functions within seq2seq models, particularly in the context of chatbots utilizing the Customer Support Twitter (CST) dataset. This study addresses this gap by evaluating four distinct attention-scoring functions—dot, multiplicative/general, additive, and an extended multiplicative function with a tanh activation parameter — in neural generative seq2seq models. Using the CST dataset, these models were trained and evaluated over 10 epochs with the AdamW optimizer. Evaluation criteria included validation loss and BLEU scores implemented under both greedy and beam search strategies with a beam size of k = 3. Results indicate that the model with the tanh-augmented multiplicative function significantly outperforms its counterparts, achieving the lowest validation loss (1.136484) and the highest BLEU scores (0.438926 under greedy search, 0.443000 under beam search, k = 3). These findings emphasize the crucial influence of selecting an appropriate attention-scoring function to enhance the performance of seq2seq models for chatbots, particularly highlighting the model integrating tanh activation as a promising approach to improving chatbot quality in customer support contexts.
Keywords: Attention weight, chatbot, encoder-decoder, neural generative attention, score function, sequence-to-sequence.
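The four attention-scoring functions compared above can be written compactly as below for a single decoder state and a matrix of encoder states. The exact parameterization of the tanh-extended multiplicative score is one plausible reading of the description, and the weight matrices and vector are placeholders for trainable parameters.

```python
# Hedged sketch: dot, multiplicative/general, additive, and tanh-extended
# multiplicative attention scores, plus the resulting context vector.
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def dot_score(dec, enc):                      # enc: (T, d), dec: (d,)
    return enc @ dec

def general_score(dec, enc, W):               # multiplicative / "general", W: (d, d)
    return enc @ (W @ dec)

def additive_score(dec, enc, W1, W2, v):      # additive, W1/W2: (h, d), v: (h,)
    return np.tanh(enc @ W1.T + dec @ W2.T) @ v

def general_tanh_score(dec, enc, W):          # extended multiplicative with tanh
    return np.tanh(enc @ (W @ dec))

def attention_context(scores, enc):
    weights = softmax(scores)                 # attention weights over time steps
    return weights @ enc, weights
```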
141 An Ergonomic Evaluation of Three Load Carriage Systems for Reducing Muscle Activity of Trunk and Lower Extremities during Giant Puppet Performing Tasks
Authors: Cathy SW. Chow, Kristina Shin, Faming Wang, B. C. L. So
Abstract:
During some dynamic giant puppet performances, an ergonomically designed load carrier system is necessary for the puppeteers to carry a giant puppet body's heavy load with minimum muscle stress. A load carrier (the prototype) was designed with two small wheels on the foot and a hybrid spring device on the knee to assist the sliding and knee-bending movements, respectively. The purpose of this study was therefore to evaluate the effect of three load carriers: two commercially available load-mounting systems, Tepex and SuitX, and the prototype. Ten male participants were recruited for the experiment. Surface electromyography (sEMG) was used to record the participants' muscle activities during forward-moving and bouncing tasks, with and without a load of 11.1 kg positioned 60 cm above the shoulder. Five bilateral muscles, including the lumbar erector spinae (LES), rectus femoris (RF), biceps femoris (BF), tibialis anterior (TA), and gastrocnemius (GM), were selected for data collection. During the forward-moving task, the sEMG data showed the smallest muscle activities with the Tepex harness, which consistently exhibited the lowest values; compared with it, the prototype and SuitX were significantly higher on the left LES by 68.99% and 64.99%, the right LES by 26.57% and 82.45%, the left RF by 87.71% and 47.61%, the right RF by 143.57% and 24.28%, the left BF by 80.21% and 22.23%, the right BF by 96.02% and 21.83%, the right TA by 6.32% and 4.47%, and the left GM by 5.89% and 12.35%, respectively. These results reflect that mobility was highly restricted by the tested exoskeleton devices. On the other hand, the sEMG data from the bouncing task showed the smallest muscle activities with the prototype, which consistently exhibited the lowest values, being significantly lower than the Tepex harness and SuitX on the left LES by 6.65% and 104.93%, the right LES by 23.56% and 92.19%, the left BF by 33.21% and 93.26%, the right BF by 24.70% and 81.16%, the left TA by 46.51% and 191.02%, the right TA by 12.75% and 125.76%, the left GM by 31.54% and 68.36%, and the right GM by 95.95% and 96.43%, respectively.
Keywords: Exoskeleton, load carriage aid, giant puppet performers, electromyography.
140 FEM Models of Glued Laminated Timber Beams Enhanced by Bayesian Updating of Elastic Moduli
Authors: L. Melzerová, T. Janda, M. Šejnoha, J. Šejnoha
Abstract:
Two finite element (FEM) models are presented in this paper to address the random nature of the response of glued timber structures made of wood segments with variable elastic moduli evaluated from 3600 indentation measurements. This database served to create the same number of ensembles as the number of segments in the tested beam. Statistics of these ensembles were then assigned to the corresponding segments of the beams, and the Latin Hypercube Sampling (LHS) method was used to perform 100 simulations, resulting in an ensemble of 100 deflections subjected to statistical evaluation. A detailed geometrical arrangement of the individual segments in the laminated beam was considered in the construction of a two-dimensional FEM model subjected to four-point bending to comply with the laboratory tests. Since laboratory measurements of local elastic moduli may in general suffer from significant experimental error, it appears advantageous to exploit full-scale measurements of the timber beams, i.e. deflections, to improve their prior distributions with the help of the Bayesian statistical method. This, however, requires an efficient computational model when simulating the laboratory tests numerically. To this end, a simplified model based on Mindlin's beam theory was established. The improved posterior distributions show that the most significant change in the Young's modulus distribution takes place in the laminae in the most strained zones, i.e. in the top and bottom layers within the beam center region. The posterior distributions of the moduli of elasticity were subsequently utilized in the 2D FEM model and compared with the original simulations.
Keywords: Bayesian inference, FEM, four-point bending test, laminated timber, parameter estimation, prior and posterior distribution, Young's modulus.
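The sketch below illustrates how Latin Hypercube Sampling of segment moduli can produce an ensemble of deflections for statistical evaluation. The lognormal marginals, the number of segments, and the closed-form four-point-bending surrogate are assumptions; the paper's 2D FEM and Mindlin-beam models and the Bayesian updating step itself are not reproduced.

```python
# Hedged sketch: LHS realizations of segment moduli propagated through a
# crude four-point-bending deflection surrogate.
import numpy as np
from scipy.stats import qmc, lognorm

def lhs_moduli(n_sims, seg_medians, seg_cov=0.15, seed=0):
    """One lognormal marginal per segment, sampled by Latin Hypercube."""
    n_seg = len(seg_medians)
    u = qmc.LatinHypercube(d=n_seg, seed=seed).random(n_sims)   # (n_sims, n_seg)
    sigma = np.sqrt(np.log(1.0 + seg_cov**2))
    return lognorm.ppf(u, s=sigma, scale=seg_medians)           # moduli in Pa

def midspan_deflection(E_segments, span=4.0, I=2.0e-4, F=10e3):
    # Placeholder surrogate: harmonic mean of segment moduli in the classic
    # four-point-bending deflection formula (not the paper's FEM model).
    E_eff = len(E_segments) / np.sum(1.0 / E_segments)
    a = span / 3.0
    return F * a * (3 * span**2 - 4 * a**2) / (24 * E_eff * I)

E_samples = lhs_moduli(100, seg_medians=np.full(12, 11_000e6))  # 12 segments, ~11 GPa
deflections = np.array([midspan_deflection(e) for e in E_samples])
```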
139 Comparison of Composite Programming and Compromise Programming for Aircraft Selection Problem Using Multiple Criteria Decision Making Analysis Method
Authors: C. Ardil
Abstract:
In this paper, the comparison of composite programming and compromise programming for the aircraft selection problem is discussed using the multiple criteria decision analysis method. The decision making process requires the prior definition and fulfillment of certain factors, especially when it comes to complex areas such as aircraft selection problems. The proposed technique gives more efficient results by extending composite programming and compromise programming, which are widely used in modeling multiple criteria decisions. The proposed model is applied to a practical decision problem for evaluating and selecting aircraft. The selection of aircraft was made based on the proposed approach developed in the field of multiple criteria decision making. The presented model is solved using the following methods: composite programming and compromise programming. The importance values of the weight coefficients of the criteria are calculated using the mean weight method. The evaluation and ranking of aircraft are carried out using the composite programming and compromise programming methods. In order to determine the stability of the model and the ability to apply the developed composite programming and compromise programming approach, the paper analyzes its sensitivity, which in the first part involves changing the values of the coefficients λ and q. The second part of the sensitivity analysis relates to the application of the different multiple criteria decision making methods, composite programming and compromise programming. In addition, in the third part of the sensitivity analysis, the Spearman correlation coefficient of the ranks obtained was calculated, which confirms the applicability of all the proposed approaches.
Keywords: Composite programming, compromise programming, additive weighted model, multiplicative weighted model, multiple criteria decision making analysis, MCDMA, aircraft selection.
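A generic compromise-programming ranking with mean criterion weights is sketched below. The aircraft data are hypothetical, the exponent lam stands in for the λ coefficient varied in the sensitivity analysis, and the paper's composite-programming variant and normalization details are not reproduced.

```python
# Hedged sketch: compromise-programming (Lp-distance to the ideal point) ranking.
import numpy as np

def compromise_programming(decision_matrix, benefit, lam=2.0):
    """decision_matrix: (alternatives, criteria); benefit: True for benefit criteria."""
    X = np.asarray(decision_matrix, dtype=float)
    benefit = np.asarray(benefit, dtype=bool)
    ideal = np.where(benefit, X.max(axis=0), X.min(axis=0))
    anti = np.where(benefit, X.min(axis=0), X.max(axis=0))
    # Normalized distance of each alternative from the ideal point.
    d = np.abs(ideal - X) / np.where(ideal != anti, np.abs(ideal - anti), 1.0)
    w = np.full(X.shape[1], 1.0 / X.shape[1])        # mean weight method
    Lp = (np.sum(w * d**lam, axis=1))**(1.0 / lam)   # smaller is better
    return np.argsort(Lp), Lp

# Example: 3 hypothetical aircraft, 4 criteria (cost is a non-benefit criterion).
ranks, scores = compromise_programming(
    [[820, 0.78, 14.5, 7.2], [750, 0.80, 13.9, 6.8], [900, 0.74, 15.1, 7.5]],
    benefit=[False, True, True, True])
```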
138 Screening of Factors Affecting the Enzymatic Hydrolysis of Empty Fruit Bunches in Aqueous Ionic Liquid and Locally Produced Cellulase System
Authors: Md. Z. Alam, Amal A. Elgharbawy, Muhammad Moniruzzaman, Nassereldeen A. Kabbashi, Parveen Jamal
Abstract:
The enzymatic hydrolysis of lignocellulosic biomass is one of the obstacles in the process of sugar production, due to the presence of lignin that protects the cellulose molecules against cellulases. Although the pretreatment of lignocellulose in ionic liquid (IL) systems has been receiving a lot of interest, it requires IL removal with an anti-solvent in order to proceed with the enzymatic hydrolysis. At this point, introducing a compatible cellulase enzyme appears to be a more efficient route. A cellulase enzyme produced by Trichoderma reesei on palm kernel cake (PKC) exhibited promising stability in several ILs. The enzyme, called PKC-Cel, was tested for its optimum pH and temperature as well as its molecular weight. One of the evaluated ILs, 1,3-diethylimidazolium dimethyl phosphate ([DEMIM] DMP), was applied in this study. Six factors were evaluated in a definitive screening design in Stat-Ease Design Expert V.9: IL/buffer ratio, temperature, hydrolysis retention time, biomass loading, cellulase loading and empty fruit bunch (EFB) particle size. According to the obtained data, the IL-enzyme system shows the highest sugar concentration at 70 °C, 27 hours, 10% IL-buffer, 35% biomass loading, 60 Units/g cellulase and 200 μm particle size. As concluded from the obtained data, not only was PKC-Cel stable in the presence of the IL, it was also stable at a temperature higher than its optimum. The reducing sugar obtained was 53.468±4.58 g/L, which was equivalent to 0.3055 g reducing sugar/g EFB. This approach opens the way for further studies to understand the actual effect of ILs on cellulases and their interactions in aqueous systems. It could also benefit the efficient production of bioethanol from lignocellulosic biomass.
Keywords: Cellulase, hydrolysis, lignocellulose, pretreatment, stability.
137 Named Entity Recognition using Support Vector Machine: A Language Independent Approach
Authors: Asif Ekbal, Sivaji Bandyopadhyay
Abstract:
Named Entity Recognition (NER) aims to classify each word of a document into predefined target named entity classes and is nowadays considered fundamental for many Natural Language Processing (NLP) tasks such as information retrieval, machine translation, information extraction, question answering systems and others. This paper reports on the development of a NER system for Bengali and Hindi using a Support Vector Machine (SVM). Though this state-of-the-art machine learning technique has been widely applied to NER in several well-studied languages, its application to Indian languages (ILs) is very new. The system makes use of the different contextual information of the words along with a variety of features that are helpful in predicting the four different named entity (NE) classes: Person name, Location name, Organization name and Miscellaneous name. We have used annotated corpora of 122,467 tokens of Bengali and 502,974 tokens of Hindi tagged with the twelve different NE classes defined as part of the IJCNLP-08 NER Shared Task for South and South East Asian Languages (SSEAL). In addition, we have manually annotated 150K wordforms of a Bengali news corpus, developed from the web archive of a leading Bengali newspaper. We have also developed an unsupervised algorithm in order to generate lexical context patterns from a part of the unlabeled Bengali news corpus. The lexical patterns have been used as features of the SVM in order to improve system performance. The NER system has been tested with gold standard test sets of 35K and 60K tokens for Bengali and Hindi, respectively. Evaluation results have demonstrated recall, precision, and f-score values of 88.61%, 80.12%, and 84.15%, respectively, for Bengali and 80.23%, 74.34%, and 77.17%, respectively, for Hindi. The results show an improvement in the f-score of 5.13% with the use of context patterns. A statistical analysis (ANOVA) is also performed to compare the performance of the proposed NER system with that of the existing HMM-based system for both languages.
Keywords: Named Entity (NE), Named Entity Recognition (NER), Support Vector Machine (SVM), Bengali, Hindi.
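A minimal token-level SVM classifier with simple contextual window features, in the spirit of the system described above, is sketched below using scikit-learn. The feature set, tag inventory, and data preparation are illustrative assumptions; the Bengali/Hindi corpora and lexical context patterns are not reproduced.

```python
# Hedged sketch: token-level NER with an SVM over contextual window features.
from sklearn.feature_extraction import DictVectorizer
from sklearn.pipeline import make_pipeline
from sklearn.svm import LinearSVC

def token_features(sentence, i):
    word = sentence[i]
    return {
        "word": word.lower(),
        "prefix3": word[:3],
        "suffix3": word[-3:],
        "is_digit": word.isdigit(),
        "prev": sentence[i - 1].lower() if i > 0 else "<BOS>",
        "next": sentence[i + 1].lower() if i < len(sentence) - 1 else "<EOS>",
    }

def prepare(sentences, labels):
    X, y = [], []
    for sent, tags in zip(sentences, labels):
        for i, tag in enumerate(tags):
            X.append(token_features(sent, i))
            y.append(tag)                      # e.g. PER, LOC, ORG, MISC, O
    return X, y

model = make_pipeline(DictVectorizer(sparse=True), LinearSVC())
# X_train, y_train = prepare(train_sentences, train_tags)
# model.fit(X_train, y_train)
```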
136 Sedimentary Response to Coastal Defense Works in São Vicente Bay, São Paulo
Authors: L. C. Ansanelli, P. Alfredini
Abstract:
The article presents an evaluation of the effectiveness of two groins located at Gonzaguinha and Milionários Beaches, situated on the southeast coast of Brazil. The effectiveness of these coastal defense structures is evaluated in terms of sedimentary dynamics, which is one of the most important environmental processes to be assessed in coastal engineering studies. The applied method is based on the Delft3D numerical model system tools. The Delft3D-WAVE module was used for wave modelling, Delft3D-FLOW for hydrodynamic modelling and Delft3D-SED for sediment transport modelling. The models were calibrated so that the simulations adequately represent the studied region, with improvements in the model elements evaluated through statistical comparisons between the results and the wave, current and tide data recorded in the study area. An analysis of the maximum wave heights was carried out to select the months with the highest accumulated energy and implement these conditions in the engineering scenarios. The engineering studies were performed for two scenarios: 1) numerical simulation of the area considering only the two existing groins; 2) conception of breakwaters coupled at the ends of the existing groins, resulting in two T-shaped structures. The sediment model showed that, for the simulated period, the area is affected by erosive processes and that the existing groins have little effectiveness in defending the coast in question. The implemented T structures showed some effectiveness in protecting the beaches against erosion and allowed the recovery of the portion of Milionários Beach directly sheltered by them. In order to complement this study, it is suggested that further engineering scenarios be conceived that might recover other areas of the studied region.
Keywords: Coastal engineering, coastal erosion, São Vicente Bay, Delft3D, coastal engineering works.
135 Questions Categorization in E-Learning Environment Using Data Mining Technique
Authors: Vilas P. Mahatme, K. K. Bhoyar
Abstract:
Nowadays, education cannot be imagined without digital technologies. They broaden the horizons of teaching and learning processes. Several universities are offering online courses, and for evaluation purposes e-examination systems are being widely adopted in academic environments. Multiple-choice tests are extremely popular. In moving away from traditional examinations to e-examinations, Moodle is being used as the Learning Management System (LMS). Moodle logs every click that students make for attempting and navigational purposes in an e-examination. Data mining has been applied in various domains, including retail sales and bioinformatics. In recent years, there has been increasing interest in the use of data mining in e-learning environments. It has been applied to discover, extract, and evaluate parameters related to students' learning performance. The combination of data mining and e-learning is still in its infancy. Log data generated by the students during an online examination can be used to discover knowledge with the help of data mining techniques. In web-based applications, the number of right and wrong answers in the test result is not sufficient to assess and evaluate the student's performance, so assessment techniques must be intelligent enough. If a student cannot answer the question asked by the instructor, an easier question can be asked; otherwise, a more difficult question on a similar topic can be posed. To do so, it is necessary to identify the difficulty level of the questions. The proposed work concentrates on this issue. Data mining techniques, specifically clustering, are used in this work. The method decides the difficulty levels of the questions, categorizes them as tough, easy or moderate, and later serves them to students according to their performance. The proposed experiment categorizes the question set and also groups the students based on their performance in the examination. This will help the instructor to guide the students more specifically. In short, the mined knowledge helps to support, guide, facilitate and enhance learning as a whole.
Keywords: Data mining, e-examination, e-learning, Moodle.
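The clustering step described above can be illustrated with k-means over simple per-question statistics, as sketched below. The feature names and figures are assumptions, not fields of the actual Moodle logs used in the work.

```python
# Hedged sketch: clustering questions into easy/moderate/tough groups.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

def categorize_questions(stats, n_clusters=3, seed=0):
    """stats: (n_questions, 3) array of [fraction_correct, mean_time_sec, attempts]."""
    X = StandardScaler().fit_transform(stats)
    km = KMeans(n_clusters=n_clusters, n_init=10, random_state=seed).fit(X)
    # Order clusters by mean fraction-correct: lowest correctness = toughest.
    order = np.argsort([stats[km.labels_ == c, 0].mean() for c in range(n_clusters)])
    names = {order[0]: "tough", order[1]: "moderate", order[2]: "easy"}
    return [names[c] for c in km.labels_]

# Example with made-up per-question statistics:
question_stats = np.array([[0.35, 240, 2.1], [0.82, 90, 1.2], [0.60, 150, 1.5]])
labels = categorize_questions(question_stats)
```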
134 Products in Early Development Phases: Ecological Classification and Evaluation Using an Interval Arithmetic Based Calculation Approach
Authors: Helen L. Hein, Joachim Schwarte
Abstract:
As a pillar of sustainable development, ecology has become an important milestone in the research community, especially due to global challenges like climate change. The ecological performance of products can be scientifically assessed with life cycle assessments. In the construction sector, significant amounts of CO2 emissions are assigned to the energy used for building heating purposes. Sustainable construction materials for insulating purposes are therefore essential; aerogels in particular have been explored intensively in recent years due to their low thermal conductivity. The WALL-ACE project accordingly aims to develop an aerogel-based thermal insulating plaster that would achieve very low thermal conductivities. But since, in early development phases, a lot of information is still missing or not yet accessible, the ecological performance of innovative products is increasingly based on uncertain data that can lead to significant deviations in the results. To be able to predict realistically how meaningful the results are and how viable the developed products may be with regard to their respective markets, these deviations have to be considered. Therefore, a classification method is presented in this study that may allow comparing the ecological performance of novel products with already established and competitive materials. In order to achieve this, an alternative calculation method was used that allows computing with lower and upper bounds so as to consider all possible values without precise data. The life cycle analysis of the considered products was conducted with an interval arithmetic based calculation method. The results lead to the conclusion that the interval solutions describing the possible environmental impacts are so wide that the usability of the results is limited. Nevertheless, further optimization to reduce the environmental impacts of aerogels seems to be needed for them to become more competitive in the future.
Keywords: Aerogel-based, insulating material, early development phase, interval arithmetic.
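The bounding idea behind the interval arithmetic based calculation can be illustrated with a small interval class, as sketched below. The impact categories and figures are placeholders, not WALL-ACE or life-cycle-inventory data.

```python
# Hedged sketch: propagating lower/upper bounds through a simple impact sum.
class Interval:
    def __init__(self, lo, hi):
        assert lo <= hi
        self.lo, self.hi = lo, hi

    def __add__(self, other):
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def __mul__(self, other):
        products = [self.lo * other.lo, self.lo * other.hi,
                    self.hi * other.lo, self.hi * other.hi]
        return Interval(min(products), max(products))

    def __repr__(self):
        return f"[{self.lo:.3g}, {self.hi:.3g}]"

# Example: GWP of 1 m2 of plaster = mass per m2 * emission factor, summed over
# two uncertain ingredients (all numbers are illustrative).
aerogel = Interval(1.5, 2.5) * Interval(4.0, 9.0)     # kg * kg CO2-eq/kg
binder = Interval(8.0, 10.0) * Interval(0.6, 0.9)
total_gwp = aerogel + binder                          # kg CO2-eq per m2, as interval
```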
133 Molecular and Serological Diagnosis of Newcastle and Ornithobacterium rhinotracheale Broiler in Chicken in Fars Province, Iran
Authors: Mohammadjavad Mehrabanpour, Maryam Ranjbar Bushehri, Dorsa Mehrabanpour
Abstract:
Respiratory diseases are among the most important problems in the country's poultry industry, particularly when it comes to broiler flocks. Ornithobacterium rhinotracheale (ORT) is a species that causes poor performance in growth rate and egg production as well as mortality. This pathogen causes respiratory infection, including pulmonary alveolar inflammation and pneumonia, in birds throughout the world. Newcastle disease (ND) is a highly contagious disease in poultry and also causes considerable losses to the poultry industry. The aim of this study was to evaluate the simultaneous occurrence of ORT and ND in broiler chicken flocks in Fars province, with NDV isolation by inoculation in embryonated eggs confirmed by RT-PCR. In this study, 318 blood and 85 tissue samples (brain, trachea, liver, and cecal tonsils) were collected from 15 broiler chicken farms. Serum antibody titers against ORT were surveyed using a commercial enzyme-linked immunosorbent assay (ELISA) kit. Antibody titers against the ND virus were evaluated by the hemagglutination inhibition (HI) test. Virus isolation in 9-11-day-old embryonated chicken eggs and RT-PCR were carried out. Of the 318 serum samples, 135 (42.5%) were positive for antibodies to ORT, HI antibody titers against NDV of 7-10 (log2) were found in 122 serum samples (38.4%), and 61 serum samples (19.2%) had antibody titers against both the Newcastle disease virus and ORT. The results of the present study indicated that 20 tissue samples were positive in embryonated eggs and in the rapid hemagglutination (HA) test. The HI test with specific ND-positive serum confirmed 6 of the 20 samples. PCR confirmed that all six samples were positive, and the PCR products showed 535-base-pair fragments on electrophoresis. Due to the great economic importance of these two diseases in the poultry industry, it is necessary to design and implement a comprehensive plan for their prevention and control.
Keywords: ELISA, Newcastle disease, Ornithobacterium rhinotracheale, seroprevalence.
132 Modern Seismic Design Approach for Buildings with Hysteretic Dampers
Authors: Vanessa A. Segovia, Sonia E. Ruiz
Abstract:
The use of energy dissipation systems for seismic applications has increased worldwide; thus it is necessary to develop practical and modern criteria for their optimal design. Here, a direct displacement-based seismic design approach for frame buildings with hysteretic energy dissipation systems (HEDS) is applied. The building is constituted by two individual structural systems: 1) a main elastic structural frame designed for service loads; and 2) a secondary system, corresponding to the HEDS, that controls the effects of lateral loads. The procedure involves controlling two design parameters: a) the stiffness ratio (α = Kframe/Ktotal system), and b) the strength ratio (γ = Vdamper/Vtotal system). The proposed damage-controlled approach contributes to the design of a more sustainable and resilient building because the structural damage is concentrated in the HEDS. The reduction of the design displacement spectrum is done by means of a recently published damping factor for elastic structural systems with HEDS located in Mexico City. Two limit states are verified: serviceability and near collapse. Instead of the traditional trial-and-error approach, a procedure is proposed that allows the designer to establish the preliminary sizes of the structural elements of both systems. The design methodology is applied to an 8-story steel building with buckling-restrained braces located in the soft soil of Mexico City. With the aim of choosing the optimal design parameters, a parametric study is developed considering different values of α and γ. The simplified methodology is intended for preliminary sizing, design, and evaluation of the effectiveness of HEDS, and it constitutes a modern and practical tool that enables the structural designer to select the best design parameters.
Keywords: Damage-controlled buildings, direct displacement-based seismic design, optimal hysteretic energy dissipation systems.
131 Prediction Study of a Corroded Pressure Vessel Using Evaluation Measurements and Finite Element Analysis
Authors: Ganbat Danaa, Chuluundorj Puntsag
Abstract:
The steel structures of the Oyu-Tolgoi mine's concentrator plant corrode during operation, which raises doubts about the continued use of some important structures of the plant and is one of the problems facing its regular operation. The bottom part of a pressure vessel that plays an important role in the reliable operation of the concentrate filter-drying unit was heavily corroded, so it was necessary to study it through engineering calculations, modeling, and simulation using modern engineering software and methods. The purpose of this research is to investigate, using advanced engineering software, whether the corroded part of the pressure vessel can continue to be used normally, and to estimate the remaining life of the pressure vessel based on engineering calculations. With the bottom of the pressure vessel thinned by 0.5 mm due to corrosion, as detected by non-destructive testing, finite element analysis in ANSYS Workbench was used to determine the mechanical stress, strain and safety factor in the wall and bottom of the pressure vessel operating under a 2.2 MPa working pressure, and conclusions were drawn on whether it can be used in the future. According to the recommendations, the normal, continuous and reliable operation of the concentrator plant can be ensured by sand-blast cleaning and anti-corrosion paint, avoiding the ordering of new pressure vessels and reducing the installation period. This research work will serve as a benchmark for assessing the corrosion condition of steel parts of pressure vessels and of other metallic and non-metallic structures operating under severe conditions of corrosion and static and dynamic loads, making it possible to analyze such structures and to evaluate and control their integrity and reliable operation.
Keywords: Corrosion, non-destructive testing, finite element analysis, safety factor, structural reliability.
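As a back-of-the-envelope complement to the finite element results, the sketch below shows how wall thinning reduces a thin-wall hoop-stress safety factor. All dimensions and the yield strength are assumptions; the paper's conclusions rest on the full ANSYS Workbench model, not on this hand calculation.

```python
# Hedged sketch: thin-wall hoop stress and safety factor before/after corrosion.
def hoop_stress(pressure_mpa, inner_radius_mm, wall_mm):
    # Thin-wall approximation: sigma = p * r / t.
    return pressure_mpa * inner_radius_mm / wall_mm

def safety_factor(yield_mpa, pressure_mpa, inner_radius_mm, wall_mm):
    return yield_mpa / hoop_stress(pressure_mpa, inner_radius_mm, wall_mm)

nominal = safety_factor(yield_mpa=250, pressure_mpa=2.2,
                        inner_radius_mm=600, wall_mm=10.0)
corroded = safety_factor(yield_mpa=250, pressure_mpa=2.2,
                         inner_radius_mm=600, wall_mm=10.0 - 0.5)  # 0.5 mm loss
```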
130 Evaluation of Four Different DNA Targets in Polymerase Chain Reaction for Detection and Genotyping of Helicobacter pylori
Authors: Abu Salim Mustafa
Abstract:
Polymerase chain reaction (PCR) assays targeting genomic DNA segments have been established for the detection of Helicobacter pylori in clinical specimens. However, data on comparative evaluations of various targets for the detection of H. pylori are limited. Furthermore, the frequencies of the vacA (s1 and s2) and cagA genotypes, which are suggested to be involved in the pathogenesis of H. pylori in other parts of the world, are not well studied in Kuwait. The aim of this study was to evaluate PCR assays for the detection and genotyping of H. pylori by targeting the amplification of DNA targets from four genomic segments. Genomic DNA was isolated from 72 clinical isolates of H. pylori and tested in PCR with four pairs of oligonucleotide primers, i.e. ECH-U/ECH-L, ET-5U/ET-5L, CagAF/CagAR and Vac1F/Vac1XR, which were expected to amplify targets of various sizes (471 bp, 230 bp, 183 bp and 176/203 bp, respectively) from the genomic DNA of H. pylori. The PCR-amplified DNA was analyzed by agarose gel electrophoresis. PCR products of the expected size were obtained with all primer pairs using genomic DNA isolated from H. pylori. DNA dilution experiments showed that the most sensitive PCR target was the 471 bp DNA amplified by the primers ECH-U/ECH-L, followed by the targets of Vac1F/Vac1XR (176/203 bp DNA), CagAF/CagAR (183 bp DNA) and ET-5U/ET-5L (230 bp DNA). However, when tested with undiluted genomic DNA isolated from single colonies of all isolates, the Vac1F/Vac1XR target provided the maximum positive results (71/72 (99%) positive), followed by ECH-U/ECH-L (69/72 (93%) positive), ET-5U/ET-5L (51/72 (71%) positive) and CagAF/CagAR (26/72 (46%) positive). The results of the genotyping experiments showed that the vacA s1 (46% positive) and vacA s2 (54% positive) genotypes were almost equally associated with vacA+/cagA- isolates (P > 0.05), but with vacA+/cagA+ isolates, the s1 genotype (92% positive) was more frequently detected than the s2 genotype (8% positive) (P < 0.0001). In conclusion, among the primer pairs tested, Vac1F/Vac1XR provided the best results for the detection of H. pylori. The genotyping experiments showed that the vacA s1 and vacA s2 genotypes were almost equally associated with vacA+/cagA- isolates, but the vacA s1 genotype had a significantly increased association with vacA+/cagA+ isolates.
Keywords: H. pylori, detection, genotyping, Kuwait.
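As a rough illustration of the statistical comparison reported above, the sketch below runs Fisher's exact test on a 2x2 table of genotype counts. The counts are approximations reconstructed from the percentages quoted in the abstract, not the authors' raw data, so the exact p-value will differ.

```python
# Illustrative re-analysis of the genotype association reported in the abstract.
# The 2x2 counts below are approximations reconstructed from the percentages
# (46%/54% for vacA+/cagA- and 92%/8% for vacA+/cagA+ isolates); they are not
# the authors' raw data.

from scipy.stats import fisher_exact

#                 vacA s1   vacA s2
table = [
    [21, 24],   # vacA+/cagA- isolates (approx. 46% s1, 54% s2)
    [24,  2],   # vacA+/cagA+ isolates (approx. 92% s1,  8% s2)
]

odds_ratio, p_value = fisher_exact(table)
print(f"odds ratio = {odds_ratio:.2f}, p = {p_value:.4g}")
# A small p-value here mirrors the abstract's finding that the s1 genotype is
# significantly more frequent among vacA+/cagA+ isolates (reported P < 0.0001).
```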
129 Evaluation of Sustainable Business Model Innovation in Increasing the Penetration of Renewable Energy in the Ghana Power Sector
Authors: Victor Birikorang Danquah
Abstract:
Ghana's primary energy supply relies heavily on petroleum, biomass, and hydropower. Currently, Ghana gets its electricity from hydropower (Akosombo and Bui), thermal plants fuelled by crude oil, natural gas, and diesel, solar power, and imports from La Cote d'Ivoire. Until the early 2000s, large hydroelectric dams dominated Ghana's electricity generation; owing to unreliable weather patterns, Ghana then increased its reliance on thermal power. Thermal power now contributes the largest share of electricity generation in Ghana and is predominantly supplied by Independent Power Producers (IPPs). Ghana's electricity industry operates the corporate utility model as its business model. This model is typically 'vertically integrated', with a single corporation selling the majority of the power generated by its generation assets to its retail business, which then sells the electricity to retail market consumers. The corporate utility model has a straightforward value proposition based on increasing the number of energy units sold. This unit-volume business model drives the entire energy value chain to increase throughput, locking system users into unsustainable practices. This study uses a qualitative research approach to explore the electricity industry in Ghana and argues that renewable energy sources such as wind and solar need a larger share of electricity generation. The research recommends two business models that are critical for the penetration of renewable energy in Ghana's power sector. The first is a peer-to-peer electricity trading model, which relies on a software platform to connect consumers and generators so that they can trade energy directly with one another. The second model encourages local energy generation, incentivizes optimal time-of-use behaviour, and allows any financial gains to be shared among community members.
Keywords: Business model innovation, electricity generation, renewable energy, solar energy, sustainability, wind energy.
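The peer-to-peer trading model described above can be pictured as a platform that matches consumers' bids with local generators' offers. The toy Python sketch below shows one possible greedy matching rule; the participants, prices, and clearing rule are illustrative assumptions, not a specification of any existing platform or of the study's recommendation.

```python
# Toy sketch of a peer-to-peer energy trading platform: consumers' bids are
# matched against local generators' offers so they trade directly. Names,
# prices, and the matching rule are illustrative assumptions only.

from dataclasses import dataclass

@dataclass
class Offer:
    participant: str
    kwh: float
    price_per_kwh: float   # hypothetical prices

def match(sell_offers, buy_bids):
    """Greedy matching: cheapest generation is allocated to the highest bids."""
    sells = sorted(sell_offers, key=lambda o: o.price_per_kwh)
    buys = sorted(buy_bids, key=lambda o: o.price_per_kwh, reverse=True)
    trades = []
    while sells and buys and buys[0].price_per_kwh >= sells[0].price_per_kwh:
        s, b = sells[0], buys[0]
        qty = min(s.kwh, b.kwh)
        clearing_price = (s.price_per_kwh + b.price_per_kwh) / 2
        trades.append((s.participant, b.participant, qty, clearing_price))
        s.kwh -= qty
        b.kwh -= qty
        if s.kwh == 0:
            sells.pop(0)
        if b.kwh == 0:
            buys.pop(0)
    return trades

solar_sellers = [Offer("rooftop_pv_A", 8.0, 0.9), Offer("rooftop_pv_B", 5.0, 1.1)]
consumers = [Offer("household_1", 6.0, 1.3), Offer("household_2", 4.0, 1.0)]
for seller, buyer, kwh, price in match(solar_sellers, consumers):
    print(f"{seller} -> {buyer}: {kwh:.1f} kWh at {price:.2f}/kWh")
```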
128 Application of Various Methods for Evaluation of Heavy Metal Pollution in Soils around Agarak Copper-Molybdenum Mine Complex, Armenia
Authors: K. A. Ghazaryan, H. S. Movsesyan, N. P. Ghazaryan
Abstract:
The present study aimed to assess the heavy metal pollution of the soils around the Agarak copper-molybdenum mine complex and the related environmental risks. The mine complex is located in the south-east of Armenia, and the study was conducted in 2013. The soils of the five riskiest sites of the region were studied: the surroundings of the open mine, the sites adjacent to the processing plant of the Agarak copper-molybdenum mine complex, the surroundings of the Darazam active tailing dump, the recultivated tailing dump of “ravine - 2”, and the recultivated tailing dump of “ravine - 3”. Mountain cambisol was the main soil type at the study sites. The level of soil contamination by heavy metals was assessed using the contamination factor (Cf), the degree of contamination (Cd), the geoaccumulation index (Igeo) and the enrichment factor (EF). The distribution pattern of trace metals in the soil profile according to the Cf, Cd, Igeo and EF values shows that the soil is heavily polluted. In almost all studied sites, Cu, Mo, Pb, and Cd were the main polluting heavy metals, which is attributable to the activity of the Agarak copper-molybdenum mine complex. The pollution problem is pressing because parts of this highly polluted region are inhabited and agriculture is highly developed there; heavy metals can therefore be transferred into human bodies through food chains and have a direct influence on public health. Since the induced pollution can pose serious threats to public health, further investigations of soil and vegetation pollution are recommended. Finally, calculating Cf as a function of distance from the pollution source and of the wind direction can provide more reasonable results.
Keywords: Agarak copper-molybdenum mine complex, heavy metals, soil contamination, enrichment factor, Armenia.
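For readers unfamiliar with the indices named above, the sketch below computes the contamination factor, degree of contamination, geoaccumulation index, and enrichment factor using their standard definitions from the pollution assessment literature. The concentrations and background values are hypothetical examples, not measurements from the Agarak study.

```python
# Sketch of the pollution indices named in the abstract, using their standard
# definitions (Cf = C/B; Cd = sum of Cf; Igeo = log2(C / (1.5*B));
# EF = (C/C_ref)_sample / (C/C_ref)_background). Concentrations and background
# values are hypothetical examples, not data from the Agarak study.

import math

measured   = {"Cu": 420.0, "Mo": 35.0, "Pb": 95.0, "Cd": 1.8, "Fe": 38000.0}  # mg/kg, illustrative
background = {"Cu": 45.0,  "Mo": 2.5,  "Pb": 27.0, "Cd": 0.3, "Fe": 47000.0}  # mg/kg, illustrative

def contamination_factor(metal):
    """Cf: ratio of measured to background concentration."""
    return measured[metal] / background[metal]

def degree_of_contamination(metals):
    """Degree of contamination: sum of the contamination factors of the metals considered."""
    return sum(contamination_factor(m) for m in metals)

def geoaccumulation_index(metal):
    """Igeo = log2(C / (1.5*B)); the factor 1.5 allows for natural background variation."""
    return math.log2(measured[metal] / (1.5 * background[metal]))

def enrichment_factor(metal, reference="Fe"):
    """EF with Fe assumed as the conservative reference element."""
    return (measured[metal] / measured[reference]) / (background[metal] / background[reference])

metals = ["Cu", "Mo", "Pb", "Cd"]
for m in metals:
    print(f"{m}: Cf = {contamination_factor(m):.1f}, "
          f"Igeo = {geoaccumulation_index(m):.2f}, EF = {enrichment_factor(m):.1f}")
print(f"Degree of contamination = {degree_of_contamination(metals):.1f}")
```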
127 Spatial Clustering Model of Vessel Trajectory to Extract Sailing Routes Based on AIS Data
Authors: Lubna Eljabu, Mohammad Etemad, Stan Matwin
Abstract:
The automatic extraction of shipping routes is advantageous for intelligent traffic management systems that identify events and support decision-making in maritime surveillance. At present, there is a high demand for extracting maritime traffic networks that accurately resemble the real traffic of vessels, which is valuable for further analytical processing of vessel trajectories (e.g., naval routing and voyage planning, anomaly detection, destination prediction, time-of-arrival estimation). With the help of big data and the processing of huge amounts of vessel trajectory data, it is possible to learn these shipping routes from the navigation history of other, similar ships that previously travelled in a given area. In this paper, we propose a spatial clustering model of vessel trajectories (SPTCLUST) to extract spatial representations of sailing routes from historical Automatic Identification System (AIS) data. The model consists of three main parts: data preprocessing, path finding, and route extraction, the latter comprising clustering and representative trajectory extraction. The proposed clustering method provides techniques to overcome the problems of (i) optimal input parameter selection, (ii) the high complexity of processing a huge volume of multidimensional data, and (iii) the spatial representation of complete representative trajectories in the context of trajectory clustering algorithms. The experimental evaluation showed the effectiveness of the proposed model on a real-world AIS dataset from the Port of Halifax. The results contribute to a further understanding of shipping route patterns and could aid surveillance authorities in stable and sustainable vessel traffic management.
Keywords: Vessel trajectory clustering, trajectory mining, spatial clustering, marine intelligent navigation, maritime traffic network extraction, sailing routes extraction.
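As a minimal illustration of the spatial clustering step, the sketch below groups a handful of hypothetical AIS position reports with DBSCAN using a haversine distance. It is not the paper's SPTCLUST pipeline, which additionally performs path finding and representative trajectory extraction; the coordinates and the eps radius are assumed values.

```python
# Minimal illustration of spatially clustering AIS position reports with DBSCAN.
# This is not the paper's SPTCLUST pipeline; it only shows a basic spatial
# clustering step on hypothetical latitude/longitude points near Halifax.

import numpy as np
from sklearn.cluster import DBSCAN

EARTH_RADIUS_KM = 6371.0

# hypothetical AIS positions (lat, lon in degrees): two lanes plus one outlier
positions_deg = np.array([
    [44.60, -63.55], [44.61, -63.54], [44.62, -63.53],   # lane 1
    [44.70, -63.30], [44.71, -63.29], [44.72, -63.28],   # lane 2
    [45.20, -62.50],                                      # isolated report
])

eps_km = 3.0   # neighbourhood radius (assumed value)
db = DBSCAN(
    eps=eps_km / EARTH_RADIUS_KM,   # haversine distances are in radians
    min_samples=2,
    metric="haversine",
    algorithm="ball_tree",
).fit(np.radians(positions_deg))

print("cluster labels:", db.labels_)   # -1 marks noise (the isolated report)
```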
126 Tribological Aspects of Advanced Roll Material in Cold Rolling of Stainless Steel
Authors: Mohammed Tahir, Jonas Lagergren
Abstract:
Vancron 40, a nitrided powder metallurgical tool steel, is used in cold work applications in which the predominant failure mechanisms are adhesive wear or galling. Typical applications of Vancron 40 include fine blanking, cold extrusion, deep drawing, and cold work rolls for cluster mills. Its positive results for cold work rolls in cluster mills and as a tool in several severe metal forming processes make it competitive with other types of work rolls used in applications that require higher precision, for example the cold rolling of thin stainless steel, where a high surface finish quality is required. In this project, three roll materials for cold rolling of stainless steel strip were examined: Vancron 40, Narva 12B (a high-carbon, high-chromium tool steel alloyed with tungsten) and Supra 3 (a chromium-molybdenum-tungsten-vanadium alloyed high speed steel). The purpose of the project was to study the depth profiles of the ironed stainless steel strips, the emergence of galling, and the performance of the lubricants used by the steel industry. Laboratory experiments were conducted to examine scratching of the strip, galling, and the surface roughness of the roll materials under severe tribological conditions. The critical sliding length for the onset of galling was estimated for stainless steel with four different lubricants. The laboratory experiments also evaluated the rolls' resistance to adhesive wear under severe conditions at low and high reductions. Vancron 40 in combination with a cold rolling lubricant gave good surface quality, prevented galling of the metal surfaces, and showed good bearing capacity.
Keywords: Adhesive wear, Cold rolling, Lubricant, Stainless steel, Surface finish, Vancron 40.
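One of the quantities compared for the roll materials above is surface roughness. The sketch below shows how an arithmetic-mean roughness value (Ra) can be computed from a measured height profile; the profiles here are synthetic random data, not measurements from the study.

```python
# Sketch of computing the arithmetic-mean surface roughness Ra from a height
# profile. The profile data below are synthetic, not measurements from the study.

import numpy as np

def roughness_ra(profile_heights_um):
    """Ra: mean absolute deviation of the profile from its mean line (micrometres)."""
    z = np.asarray(profile_heights_um, dtype=float)
    return np.mean(np.abs(z - z.mean()))

rng = np.random.default_rng(0)
smooth_roll = 0.05 * rng.standard_normal(2000)   # synthetic, well-polished surface
worn_roll = 0.30 * rng.standard_normal(2000)     # synthetic, rougher (galled) surface

print(f"Ra (smooth roll profile): {roughness_ra(smooth_roll):.3f} um")
print(f"Ra (worn roll profile):   {roughness_ra(worn_roll):.3f} um")
```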