2370 A Newspapers Expectations Indicator from Web Scraping
Authors: Pilar Rey del Castillo
Abstract:
This document describes the construction of an aggregate indicator of general sentiment about the future as expressed in Spanish newspapers. The raw data are collected by scraping the Digital Periodical and Newspaper Library website. Basic natural language processing tools are then applied to the collected texts to evaluate the sentiment strength of each word using a polarity dictionary. The last step summarizes these sentiments to produce daily indices. The results offer a first insight into the applicability of these techniques for producing periodic sentiment indicators.
Keywords: natural language processing, periodic indicator, sentiment analysis, web scraping
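As a rough illustration of the pipeline described above (score each word against a polarity dictionary, then aggregate into daily indices), here is a minimal sketch; the lexicon, article texts, and function names are hypothetical, not taken from the paper:

```python
# Minimal dictionary-based sentiment scoring sketch. The polarity lexicon
# and the articles below are illustrative placeholders.

# Hypothetical polarity dictionary: word -> sentiment strength in [-1, 1].
POLARITY = {"growth": 0.8, "recovery": 0.6, "crisis": -0.9, "unemployment": -0.7}

def article_sentiment(text):
    """Average polarity of the lexicon words found in one article."""
    words = text.lower().split()
    scores = [POLARITY[w] for w in words if w in POLARITY]
    return sum(scores) / len(scores) if scores else 0.0

def daily_index(articles_by_day):
    """Summarize per-article sentiments into one index per day."""
    return {day: sum(article_sentiment(a) for a in arts) / len(arts)
            for day, arts in articles_by_day.items()}

corpus = {"2023-01-01": ["strong growth expected", "unemployment rises"],
          "2023-01-02": ["recovery continues"]}
print(daily_index(corpus))
```

A production indicator would add proper tokenization, negation handling, and normalization by article length, which the abstract's "basic tools of natural language processing" presumably cover.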
Procedia PDF Downloads 133
2369 12 Real Forensic Caseworks Solved by DNA STR-Typing of Skeletal Remains Exposed to Extreme Environmental Conditions without the Conventional Bone Pulverization Step
Authors: Chiara Della Rocca, Gavino Piras, Andrea Berti, Alessandro Mameli
Abstract:
DNA identification of human skeletal remains plays a valuable role in the forensic field, especially in missing persons and mass disaster investigations. Hard tissues, such as bones and teeth, are a very common kind of sample analyzed in forensic laboratories because they are often the only biological materials remaining. However, the major limitation of using these compact samples is the extremely time-consuming and labor-intensive treatment of grinding them into powder before proceeding with the conventional DNA purification and extraction step. In this context, a DNA extraction assay called the TBone Ex kit (DNA Chip Research Inc.) was developed to digest bone chips without powdering. Here, we simultaneously analyzed bone and tooth samples that arrived at our police laboratory and belonged to 15 different forensic caseworks that occurred in Sardinia (Italy). A total of 27 samples were recovered from different scenarios and had been exposed to extreme environmental factors, including sunlight, seawater, soil, fauna, vegetation, and high temperature and humidity. The TBone Ex kit was used prior to the EZ2 DNA extraction kit on the EZ2 Connect Fx instrument (Qiagen), and high-quality autosomal and Y-chromosome STR profiles were obtained for 80% of the caseworks in an extremely short time frame. This study provides additional support for the use of the TBone Ex kit for digesting bone fragments and whole teeth as an effective alternative to pulverization protocols. We empirically demonstrated the effectiveness of the kit in processing multiple bone samples simultaneously, largely simplifying the DNA extraction procedure, and the good yield of recovered DNA for downstream genetic typing in highly compromised real forensic specimens.
In conclusion, this study turns out to be extremely useful for forensic laboratories, from which the various actors of the criminal justice system, such as potential jury members, judges, defense attorneys, and prosecutors, require immediate feedback.
Keywords: DNA, skeletal remains, bones, TBone Ex kit, extreme conditions
Procedia PDF Downloads 45
2368 Hydrogel Hybridizing Temperature-Cured Dissolvable Gelatin Microspheres as Non-Anchorage Dependent Cell Carriers for Tissue Engineering Applications
Authors: Dong-An Wang
Abstract:
All kinds of microspheres have been extensively employed as carriers for drug, gene, and therapeutic cell delivery. Most therapeutic cell delivery microspheres rely on a two-step methodology: fabrication of the microspheres and subsequent seeding of cells onto them. In this study, we have developed a novel one-step cell encapsulation technique that uses a convenient and instant water-in-oil single emulsion approach to form cell-encapsulated gelatin microspheres. This technology is adopted for hyaline cartilage tissue engineering, in which autologous chondrocytes are used as therapeutic cells. Cell viability was maintained throughout and after the microsphere formation process (75-100 µm diameters), which avoids any covalent bonding reactions or exposure to further chemicals. Further encapsulation of the cell-laden microspheres in alginate gels was performed at 4°C via a prompt process. Upon formation, the alginate constructs were immediately relocated into a CO2 incubator maintained at 37°C; at this temperature, the cell-laden gelatin microspheres dissolved within hours to yield similarly sized cavities, and the chondrocytes were therefore suspended within the cavities inside the alginate gel bulk. Hence, the gelatin cell-laden microspheres served two roles: as cell delivery vehicles that can be removed through temperature curing, and as porogens within an alginate hydrogel construct that provide living space for cell growth and tissue development as well as better permeability for mutual diffusion. These cell-laden microspheres, namely "temperature-cured dissolvable gelatin microsphere based cell carriers" (tDGMCs), were further encapsulated in a chondrocyte-laden alginate scaffold system and analyzed by WST-1, gene expression analyses, biochemical assays, histology, and immunochemistry stains.
The positive results consistently demonstrated the promise of tDGMC technology in delivering these non-anchorage-dependent cells (chondrocytes). It can be conveniently extended to the delivery of other non-anchorage-dependent cell species, including stem cells, progenitors, or iPS cells, for the regeneration of tissues in internal organs, such as engineered hepatogenesis or pancreatic regeneration.
Keywords: biomaterials, tissue engineering, microsphere, hydrogel, porogen, anchorage dependence
Procedia PDF Downloads 396
2367 The Experience with SiC MOSFET and Buck Converter Snubber Design
Authors: Petr Vaculik
Abstract:
The newest semiconductor devices on the market are MOSFET transistors based on silicon carbide (SiC). This material has exclusive features thanks to which it makes a better switch than a silicon (Si) semiconductor switch, but there are some special features that need to be understood to use the device to its full potential. The advantages and differences of SiC MOSFETs in comparison with Si IGBT transistors are described in the first part of this article. The second part describes a driver for the SiC MOSFET transistor, and the last part presents the SiC MOSFET in a buck (step-down) converter application together with the design of a simple RC snubber.
Keywords: SiC, Si, MOSFET, IGBT, SBD, RC snubber
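The abstract does not give the snubber design procedure, but a common rule-of-thumb sizing for an RC snubber across a switching node can be sketched as follows; the bench method (adding capacitance until the ringing frequency halves) and all component values are illustrative assumptions, not the paper's data:

```python
# Hedged sketch of a rule-of-thumb RC snubber sizing for a switching node.
import math

def snubber_design(f_ring, c_add, v_bus, f_sw):
    """Estimate node parasitics from ringing, then size the RC snubber.

    f_ring: original ringing frequency [Hz]
    c_add:  capacitance that lowered f_ring to f_ring/2 in a bench test [F]
    v_bus:  DC bus voltage [V];  f_sw: switching frequency [Hz]
    """
    # Halving the ringing frequency means the added C is 3x the parasitic C.
    c_par = c_add / 3.0
    l_par = 1.0 / ((2 * math.pi * f_ring) ** 2 * c_par)
    r_snub = math.sqrt(l_par / c_par)   # match the characteristic impedance
    c_snub = 3.0 * c_par                # common "3x parasitic C" rule of thumb
    p_r = c_snub * v_bus ** 2 * f_sw    # snubber resistor dissipation estimate
    return r_snub, c_snub, p_r

r, c, p = snubber_design(f_ring=80e6, c_add=330e-12, v_bus=400.0, f_sw=100e3)
print(f"R = {r:.1f} ohm, C = {c*1e12:.0f} pF, P = {p:.2f} W")
```

The dissipation estimate matters in practice: for fast SiC switching nodes, the same snubber that damps ringing at low frequency can overheat at high switching frequency.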
Procedia PDF Downloads 483
2366 Microfiber Release During Laundry Under Different Rinsing Parameters
Authors: Fulya Asena Uluç, Ehsan Tuzcuoğlu, Songül Bayraktar, Burak Koca, Alper Gürarslan
Abstract:
Microplastics are contaminants that are widely distributed in the environment, with a detrimental ecological effect. Moreover, recent research has demonstrated the presence of microplastics in human blood and organs. Microplastics in the environment can be divided into two main categories: primary and secondary. Primary microplastics are plastics released into the environment as microscopic particles. Secondary microplastics, on the other hand, are smaller particles shed through the wear of synthetic textiles and other products. Textiles are the main source of microplastic contamination in aquatic ecosystems. The laundering of synthetic textiles (34.8%) accounts for an average annual discharge of 3.2 million tons of primary microplastics into the environment. Recently, research on microfiber shedding during laundry has gained traction. However, no comprehensive study has been conducted on microfiber shedding from the standpoint of rinsing parameters. The purpose of the present study is to quantify microfiber shedding from fabric under different rinsing conditions and to determine the rinsing parameters that affect microfiber release in a laundry environment. In this regard, a parametric study is carried out to investigate the key factors affecting microfiber release from a front-load washing machine: the amount of water used during the rinsing step and the spinning speed at the end of the washing cycle. The Minitab statistical program is used to create a design of experiments (DOE) and analyze the experimental results. Tests are repeated twice, and apart from the controlled parameters, all other washing parameters are kept constant in the washing algorithm. At the end of each cycle, released microfibers are collected via a custom-made filtration system and weighed with a precision balance.
The results showed that increasing the water amount during the rinsing step drastically increased the amount of microplastic released from the washing machine. The parametric study also revealed that increasing the spinning speed increases the microfiber release from textiles.
Keywords: front load, laundry, microfiber, microfiber release, microfiber shedding, microplastic, pollution, rinsing parameters, sustainability, washing parameters, washing machine
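The experimental layout described above (two controlled rinsing factors, each run repeated twice, analyzed as a DOE) can be mirrored in a few lines; the factor levels and the synthetic response below are hypothetical stand-ins for the weighed microfiber mass:

```python
# Sketch of a two-factor full-factorial design with two replicates.
# Levels and the response function are illustrative assumptions.
from itertools import product
from statistics import mean

rinse_water = [10, 15, 20]      # liters per rinse, hypothetical levels
spin_speed = [800, 1200, 1400]  # rpm, hypothetical levels

# Full factorial with two replicates, as in the paper's repeated tests.
runs = [(w, s) for w, s in product(rinse_water, spin_speed) for _ in range(2)]

# Placeholder response: microfiber mass [mg] per run (synthetic, not measured).
results = [((w, s), 0.5 * w + 0.002 * s) for w, s in runs]

def main_effect(results, index, level):
    """Average response at one factor level (a simple main-effect estimate)."""
    return mean(y for (run, y) in results if run[index] == level)

print(len(runs))  # 3 x 3 x 2 = 18 experimental runs
print(main_effect(results, 0, 20) - main_effect(results, 0, 10))
```

Minitab's DOE analysis additionally provides interaction plots and ANOVA; the main-effect difference above is only the simplest summary of such a design.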
Procedia PDF Downloads 97
2365 The Relationship between Physical Fitness and Academic Performance among University Students
Authors: Bahar Ayberk
Abstract:
The study was conducted to determine the relationship between physical fitness and academic performance among university students. The well-known saying 'a sound mind in a sound body', which refers to the potential contribution of physical fitness to the intellectual development of individuals, seems to be endorsed. There is a growing body of literature on the impact of physical fitness on academic achievement, especially in elementary and middle-school aged children. Even though there are numerous positive effects related to being physically active, the effect of physical fitness on academic achievement is much less clear for university students. The subjects for this study were 25 students (20 female and 5 male) enrolled in the Physiotherapy and Rehabilitation Department of the Health Science Faculty at Yeditepe University. All participants filled in a questionnaire about their socio-demographic status, general health status, and physical activity status. Health-related physical fitness testing included several core components: 1) body composition evaluation (body mass index, waist-to-hip ratio), 2) cardiovascular endurance evaluation (Queen's College step test), 3) muscle strength and endurance evaluation (sit-up test, push-up test), and 4) flexibility evaluation (sit and reach test). Academic performance evaluation was based on each student's Cumulative Grade Point Average (CGPA). The prevalence of regular physical activity among the subjects was 40% (n = 10). CGPA scores tended to be higher among students engaging in regular physical activity than among those who did not (3.02 ± 0.28 vs. 2.71 ± 0.46, p = 0.076).
The results of the study also revealed a positive correlation between sit-up and push-up scores and academic performance (CGPA) (r = 0.43, p ≤ 0.05) and a negative correlation between the cardiovascular endurance parameter (Queen's College step test) and academic performance (CGPA) (r = -0.47, p ≤ 0.05). In conclusion, the findings confirmed that physical fitness level was generally associated with academic performance in the study group. Cardiovascular endurance and muscle strength and endurance were associated with students' CGPA, whereas body composition and flexibility were unrelated to CGPA.
Keywords: academic performance, health-related physical fitness, physical activity, physical fitness testing
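The reported r values are plain Pearson correlations between fitness scores and CGPA; a self-contained sketch with hypothetical data:

```python
# Pure-Python Pearson correlation, as used to relate fitness scores to CGPA.
# The sample data below are illustrative, not the study's measurements.
import math

def pearson_r(x, y):
    """Pearson product-moment correlation coefficient of two equal-length lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

situps = [20, 25, 30, 35, 40]     # hypothetical sit-up counts
cgpa = [2.5, 2.7, 2.9, 3.0, 3.3]  # hypothetical CGPAs
print(round(pearson_r(situps, cgpa), 2))
```

With n = 25 subjects, as in the study, the significance of such an r is then checked against a t distribution with n - 2 degrees of freedom.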
Procedia PDF Downloads 163
2364 Additional Method for the Purification of Lanthanide-Labeled Peptide Compounds Pre-Purified by Weak Cation Exchange Cartridge
Authors: K. Eryilmaz, G. Mercanoglu
Abstract:
Aim: Purification of the final product, the last step in the synthesis of lanthanide-labeled peptide compounds, can be accomplished by different methods. The two most commonly used are C18 solid phase extraction (SPE) and elution from a weak cation exchanger cartridge. The SPE C18 method yields a final product of high purity, while elution from the weak cation exchanger cartridge is pH dependent and ineffective at removing colloidal impurities. The aim of this work is to develop an additional purification method for lanthanide-labeled peptide compounds in cases where the desired radionuclidic and radiochemical purity of the final product cannot be achieved because of pH problems or colloidal impurities. Material and Methods: To form colloidal impurities, 3 mL of water for injection (WFI) was added to 30 mCi of 177LuCl3 solution and allowed to stand for 1 day. 177Lu-DOTATATE was synthesized using an EZAG ML-EAZY module (10 mCi/mL). After synthesis, the final product was mixed with the colloidal impurity solution (total volume: 13 mL, total activity: 40 mCi). The resulting mixture was trapped on an SPE C18 cartridge. The cartridge was washed with 10 mL of saline to remove impurities to the waste vial. The product trapped in the cartridge was eluted with 2 mL of 50% ethanol and collected into the final product vial through a 0.22 µm filter. The final product was diluted with 10 mL of saline. Radiochemical purity before and after purification was analyzed by HPLC (column: ACE C18-100A, 3 µm, 150 x 3.0 mm; mobile phase: water-acetonitrile-trifluoroacetic acid (75:25:1); flow rate: 0.6 mL/min). Results: UV and radioactivity detector results in the HPLC analysis showed that colloidal impurities were completely removed from the 177Lu-DOTATATE/colloidal impurity mixture by the purification method.
Conclusion: The improved purification method can be used as an additional step to remove impurities that may result from lanthanide-peptide syntheses in which weak cation exchange purification is used as the last step. The purity of the final product and GMP compliance (final aseptic filtration and sterile disposable system components) are its two major advantages.
Keywords: lanthanide, peptide, labeling, purification, radionuclide, radiopharmaceutical, synthesis
Procedia PDF Downloads 160
2363 A Bibliometric Analysis of Trends in Change Management Sciences
Authors: Thomas Lauer
Abstract:
The paper aims to give an overview of change management research using bibliometric methodology. Based on research papers of the last decade listed on ResearchGate, a multidimensional categorization is performed. Considering categories like topic (e.g., success factors), industry, or research methodology, the development of the discipline is traced and, in a second step, confronted with external developments in the business environment, such as climate change, Gen Z, or COVID-19, to name a few. Based on these findings, a final evaluation of the thematic fit of previous research topics is made, as well as a preview of likely future trends in change management sciences.
Keywords: change management, bibliometrics, scientific trends, research topics
Procedia PDF Downloads 62
2362 Automatic Classification Using Dynamic Fuzzy C Means Algorithm and Mathematical Morphology: Application in 3D MRI Image
Authors: Abdelkhalek Bakkari
Abstract:
Image segmentation is a critical step in image processing and pattern recognition. In this paper, we propose a new robust automatic image classification method based on a dynamic fuzzy c-means algorithm and mathematical morphology. The proposed segmentation algorithm (DFCM_MM) has been applied to MR perfusion images. The obtained results show the validity and robustness of the proposed approach.
Keywords: segmentation, classification, dynamic, fuzzy c-means, MR image
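The paper's DFCM_MM algorithm is not specified in the abstract; as a reference point, the standard fuzzy c-means update loop (without the dynamic extension or the morphology post-processing) looks like this on 1-D intensities; the data are illustrative:

```python
# Standard fuzzy c-means (FCM) sketch on 1-D intensity data, c = 2 clusters.
# This is the classic algorithm, not the paper's dynamic variant.

def fcm(data, m=2.0, iters=50):
    c = 2
    # Deterministic initialization at the data extremes (a simplification).
    centers = [min(data), max(data)]
    for _ in range(iters):
        # Membership update: u_ij = 1 / sum_k (d_ij / d_kj)^(2/(m-1))
        u = []
        for x in data:
            d = [abs(x - v) + 1e-12 for v in centers]  # avoid division by zero
            u.append([1.0 / sum((d[i] / d[k]) ** (2.0 / (m - 1.0))
                                for k in range(c))
                      for i in range(c)])
        # Center update: mean of the data weighted by u^m.
        centers = [sum((u[j][i] ** m) * x for j, x in enumerate(data)) /
                   sum(u[j][i] ** m for j in range(len(data)))
                   for i in range(c)]
    return sorted(centers)

intensities = [1.0, 1.2, 0.9, 5.0, 5.2, 4.8]  # two clear intensity clusters
print(fcm(intensities))
```

In image segmentation the same loop runs over voxel intensities (or feature vectors), and the final memberships are thresholded to label each voxel.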
Procedia PDF Downloads 478
2361 Comparing the Apparent Error Rate of Gender Specifying from Human Skeletal Remains by Using Classification and Cluster Methods
Authors: Jularat Chumnaul
Abstract:
In forensic science, corpses from homicides vary; some are complete and some incomplete, depending on the cause of death or the form of homicide. For example, some corpses are cut into pieces, some are camouflaged by dumping into a river, some are buried, and some are burned to destroy the evidence. If a corpse is incomplete, personal identification becomes difficult because some tissues and bones have been destroyed. To determine the gender of a corpse from skeletal remains, the most precise method is DNA identification. However, this method is costly and takes longer, so other identification techniques are used instead. The first widely used technique is considering the features of bones. In general, evidence from the corpse, such as pieces of bone, especially the skull and pelvis, can be used to identify gender. To use this technique, forensic scientists require observation skills in order to classify the differences between male and female bones. Although this technique is uncomplicated, saves time and cost, and allows forensic scientists to determine gender fairly accurately (apparently an accuracy rate of 90% or more), its crucial disadvantage is that only certain positions of the skeleton can be used to determine gender, such as the supraorbital ridge, nuchal crest, temporal lobe, mandible, and chin. Therefore, the skeletal remains used have to be complete. The other technique widely used for gender determination in forensic science and archeology is skeletal measurement. The advantage of this method is that it can use several positions on one piece of bone, and it can be applied even if the bones are not complete. In this study, classification and cluster analysis are applied to this technique, including Kth Nearest Neighbor classification, classification trees, Ward linkage clustering, K-means clustering, and two-step clustering.
The data contain 507 individuals and 9 skeletal (diameter) measurements, and the performance of the five methods is investigated by considering the apparent error rate (APER). The results indicate that the two-step cluster and Kth Nearest Neighbor methods seem suitable for determining gender from human skeletal remains, because both yield small apparent error rates of 0.20% and 4.14%, respectively. On the other hand, the classification tree, Ward linkage cluster, and K-means cluster methods are not appropriate, since they yield large apparent error rates of 10.65%, 10.65%, and 16.37%, respectively. However, there are other ways to evaluate the performance of classification, such as estimating the error rate with the holdout procedure or using misclassification costs, and different methods can lead to different conclusions.
Keywords: skeletal measurements, classification, cluster, apparent error rate
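The apparent error rate used to compare the five methods is the resubstitution error: classify the training data with the rule built from those same data and count the mistakes. A sketch with a k-nearest-neighbor rule on hypothetical one-feature bone measurements:

```python
# Apparent error rate (APER) of a k-nearest-neighbor classifier, computed by
# resubstitution on the training set. The measurements are hypothetical.
from collections import Counter

def knn_predict(train, x, k=3):
    """Majority label among the k nearest training points (1-D distance)."""
    nearest = sorted(train, key=lambda t: abs(t[0] - x))[:k]
    return Counter(label for _, label in nearest).most_common(1)[0][0]

def aper(train, k=3):
    """Fraction of training points misclassified by the rule built from them."""
    errors = sum(1 for x, label in train if knn_predict(train, x, k) != label)
    return errors / len(train)

# Hypothetical femoral head diameters [mm]: 'F' female, 'M' male.
data = [(40, "F"), (41, "F"), (42, "F"), (43, "M"),
        (46, "M"), (47, "M"), (48, "M"), (42.5, "F")]
print(f"APER = {aper(data):.2%}")  # 1 of 8 misclassified -> 12.50%
```

Because the same data train and test the rule, APER is optimistically biased, which is exactly the caveat the abstract raises when it mentions the holdout procedure as an alternative.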
Procedia PDF Downloads 252
2360 Theoretical Evaluation of the Preparation of Polycyclic Benzimidazole Derivatives
Authors: M. Abdoul-Hakim, A. Zeroual, H. Garmes
Abstract:
In this work, the reaction of 2-chlorobenzimidazole with two distinct 1,3-dipoles, benzonitrile N-oxide and an azomethine imine, was studied by DFT at the B3LYP/6-311+G(d,p) level to understand the effect of the solvent (MeOH). The results show that MeOH has a significant effect on the evolution of the reaction. The charge transfer interactions n(O) → σ*(C-Cl), n(N) → σ*(C-Cl), and σ(N-C) → σ*(C-Cl) stabilize the transition states in an intramolecular nucleophilic substitution (SNi) step of the imidoyl group. Finally, this study provides a theoretical basis for the design of different polycyclic benzimidazoles.
Keywords: azomethine imine, benzonitrile N-oxide, DFT, intramolecular nucleophilic substitution (SNi), polycyclic benzimidazole
Procedia PDF Downloads 123
2359 Interaction Between Task Complexity and Collaborative Learning on Virtual Patient Design: The Effects on Students' Performance, Cognitive Load, and Task Time
Authors: Fatemeh Jannesarvatan, Ghazaal Parastooei, Jimmy frerejan, Saedeh Mokhtari, Peter Van Rosmalen
Abstract:
Medical and dental education increasingly emphasizes the acquisition, integration, and coordination of complex knowledge, skills, and attitudes that can be applied in practical situations. Instructional design approaches have focused on using real-life tasks in order to facilitate complex learning in both real and simulated environments. The four-component instructional design (4C/ID) model has become a useful guideline for designing instructional materials that improve learning transfer, especially in health profession education. The objective of this study was to apply the 4C/ID model to the creation of virtual patients (VPs) that dental students can use to practice their clinical management and clinical reasoning skills. The study first explored the context and concept of complication factors and common errors for novices and how they can affect the design of a virtual patient program. The study then selected key dental information and considered the content needs of dental students. The design of the virtual patients was based on the 4C/ID model's fundamental principles, which included: designing learning tasks that reflect real patient scenarios and applying different levels of task complexity to challenge students to apply their knowledge and skills in different contexts; creating varied learning materials that support students during the VP program and are closely integrated with the learning tasks and the students' curricula; providing cognitive feedback at different levels of the program; and providing procedural information in which students follow a step-by-step process from history taking to writing a comprehensive treatment plan. Four virtual patients were designed using the 4C/ID model's principles, and an experimental design was used to test the effectiveness of the principles in achieving the intended educational outcomes.
The 4C/ID model provides an effective framework for designing engaging and successful virtual patients that support the transfer of knowledge and skills for dental students. However, there are some challenges and pitfalls that instructional designers should take into account when developing these educational tools.
Keywords: 4C/ID model, virtual patients, education, dental, instructional design
Procedia PDF Downloads 80
2358 Correlation Analysis of Reactivity in the Oxidation of Para and Meta-Substituted Benzyl Alcohols by Benzimidazolium Dichromate in Non-Aqueous Media: Kinetic and Mechanistic Aspects
Authors: Seema Kothari, Dinesh Panday
Abstract:
An observed correlation of reaction rates with changes in the nature of a substituent on one of the reactants often reveals the nature of the transition state. Selective oxidation of organic compounds under non-aqueous media is an important transformation in synthetic organic chemistry. Since inorganic chromates and dichromates are drastic oxidants and generally insoluble in most organic solvents, a number of different chromium(VI) derivatives have been synthesized. Benzimidazolium dichromate (BIDC) is one of the recently reported Cr(VI) reagents; it is neither hygroscopic nor light sensitive and is therefore quite stable. Few reports on the kinetics of oxidations by BIDC seem to be available in the literature. In the present investigation, the kinetics and mechanism of the oxidation of benzyl alcohol (BA) and a number of para- and meta-substituted benzyl alcohols by benzimidazolium dichromate (BIDC), in dimethyl sulphoxide, are reported. The reactions were followed spectrophotometrically at 364 nm by monitoring the decrease in [BIDC] for up to 85-90% of the reaction at constant temperature. The observed oxidation product is the corresponding benzaldehyde. The reactions are of first order with respect to both the alcohol and BIDC. The reactions are catalyzed by protons, with a dependence of the form kobs = a + b[H+]. The reactions thus follow both an acid-dependent and an acid-independent path. The oxidation of [1,1-2H2]benzyl alcohol exhibited a substantial kinetic isotope effect (kH/kD = 6.20 at 298 K), indicating cleavage of an α-C-H bond in the rate-determining step. An analysis of the temperature dependence of the deuterium isotope effect showed that the loss of hydrogen proceeds through a concerted cyclic process. The rate of oxidation of BA was determined in 19 organic solvents.
An analysis of the solvent effect by Swain's equation indicated that although both the anion- and cation-solvating powers of the solvent contribute to the observed solvent effect, the role of cation solvation is the major one. The rates of the para and meta compounds, at 298 K, failed to exhibit a significant correlation in terms of Hammett or Brown substituent constants. The rates were therefore analyzed in terms of dual substituent parameter (DSP) equations. The rates of oxidation of the para-substituted benzyl alcohols show an excellent correlation with Taft's σI and σR(BA) values, whereas the rates for the meta-substituted benzyl alcohols correlate excellently with σI and σR0. The polar reaction constants are negative, indicating an electron-deficient transition state. Hence, the overall mechanism is proposed to involve the formation of a chromate ester in a fast pre-equilibrium, followed by decomposition of the ester in a subsequent slow step via a cyclic, concerted, symmetrical transition state involving hydride-ion transfer, leading to the product. The first-order dependence on the alcohol may be accounted for by the small value of the formation constant of the ester intermediate. An alternative mechanism accounting for the acid catalysis involves the formation of a protonated BIDC prior to formation of the ester intermediate, which subsequently decomposes in a slow step leading to the product.
Keywords: benzimidazolium dichromate, benzyl alcohols, correlation analysis, kinetics, oxidation
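The acid dependence kobs = a + b[H+] reported above is an ordinary straight-line fit of observed rate constants against acid concentration; a minimal sketch with synthetic data (not the paper's measurements):

```python
# Least-squares fit of k_obs = a + b[H+], separating the acid-independent
# term (a) from the acid-catalyzed term (b). Data below are synthetic.

def linear_fit(x, y):
    """Ordinary least-squares slope and intercept for y = a + b*x."""
    n = len(x)
    sx, sy = sum(x), sum(y)
    sxx = sum(v * v for v in x)
    sxy = sum(u * v for u, v in zip(x, y))
    b = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    a = (sy - b * sx) / n
    return a, b

h_conc = [0.0, 0.1, 0.2, 0.4]             # [H+] in mol/L (synthetic)
k_obs = [1.0e-4, 1.5e-4, 2.0e-4, 3.0e-4]  # s^-1 (synthetic, exactly linear)
a, b = linear_fit(h_conc, k_obs)
print(f"a = {a:.2e} s^-1, b = {b:.2e} L mol^-1 s^-1")
```

The same least-squares machinery, extended to two regressors, underlies the dual substituent parameter (DSP) correlations against σI and σR described in the abstract.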
Procedia PDF Downloads 344
2357 Spectral Clustering for Manufacturing Cell Formation
Authors: Yessica Nataliani, Miin-Shen Yang
Abstract:
Cell formation (CF) is an important step in group technology. It is used in designing cellular manufacturing systems: similarities between parts in relation to machines are exploited to identify part families and machine groups. There are many CF methods in the literature, but spectral clustering has seldom been used for CF. In this paper, we propose a spectral clustering algorithm for machine-part CF. Some experimental examples are used to illustrate its efficiency. Overall, the spectral clustering algorithm can be used in CF with a wide variety of machine/part matrices.
Keywords: group technology, cell formation, spectral clustering, grouping efficiency
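The abstract does not detail the algorithm, but the usual first steps of spectral clustering on a machine-part matrix are building a machine-machine affinity and its graph Laplacian, whose leading eigenvectors are then clustered; a sketch with an illustrative incidence matrix (the Jaccard similarity used here is an assumption, not necessarily the paper's choice):

```python
# First steps of spectral clustering for machine-part cell formation:
# incidence matrix -> affinity matrix -> unnormalized graph Laplacian.
# The incidence matrix below is illustrative.

# Rows = machines, columns = parts; 1 means the machine processes the part.
incidence = [[1, 1, 0, 0],
             [1, 1, 0, 0],
             [0, 0, 1, 1],
             [0, 1, 1, 1]]

def jaccard(a, b):
    """Similarity of two machines by the parts they share."""
    inter = sum(1 for x, y in zip(a, b) if x and y)
    union = sum(1 for x, y in zip(a, b) if x or y)
    return inter / union if union else 0.0

n = len(incidence)
W = [[jaccard(incidence[i], incidence[j]) if i != j else 0.0
      for j in range(n)] for i in range(n)]   # affinity (adjacency) matrix
D = [sum(row) for row in W]                   # degrees
L = [[(D[i] if i == j else 0.0) - W[i][j]
      for j in range(n)] for i in range(n)]   # unnormalized Laplacian L = D - W

# Every row of L sums to zero, the defining property spectral methods rely on.
print([round(sum(row), 10) for row in L])
```

Clustering the entries of the Laplacian's second-smallest eigenvector (the Fiedler vector) then splits the machines into groups, and part families follow from the machine groups.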
Procedia PDF Downloads 407
2356 Development of All-in-One Solar Kit
Authors: Azhan Azhar, Mohammed Sakib, Zaurez Ahmad
Abstract:
The energy we receive from the sun is known as solar energy; it is a reliable, long-lasting, eco-friendly energy source, widely used in the 21st century. There are several techniques for harnessing solar energy, and large utility-scale projects now collect maximum energy from the sun using current technologies. Solar PV is on the rise as a means of harvesting this energy. Moving a step further, our project focuses on designing an all-in-one portable solar energy based solution. We considered minimum load conditions and evaluated the requirements of various devices in this study to resolve the power requirements of small stores, hawkers, or travelers.
Keywords: DOD (depth of discharge), pulse width modulation charge controller, renewable energy, solar PV (solar photovoltaic)
Procedia PDF Downloads 369
2355 Web Map Service for Fragmentary Rockfall Inventory
Authors: M. Amparo Nunez-Andres, Nieves Lantada
Abstract:
Rockfalls are among the most harmful geological risks. They cause both economic losses, through damage to buildings and infrastructure, and personal injuries. Therefore, in order to estimate the risk to the exposed elements, it is necessary to know the mechanism of this kind of event, from the characteristics of the rock walls to the propagation of the fragments generated by the initially detached rock mass. In the framework of the RockModels research project, several inventories of rockfalls were carried out along the northeast of the Spanish peninsula and on the island of Mallorca. These inventories contain general information about the events and, importantly, detailed information about fragmentation. Specifically, the IBSD (In Situ Block Size Distribution) is obtained by photogrammetry from a drone or TLS (Terrestrial Laser Scanner), and the RBSD (Rock Block Size Distribution) from the volumes of the fragments in the deposit, measured by hand. In order to share all this information with other scientists, engineers, members of civil protection, and stakeholders, a platform accessible from the internet and following interoperability standards is necessary. Open-source software was used throughout: PostGIS 2.1, GeoServer, and the OpenLayers library. In the first step, a spatial database was implemented to manage all the information. We used the INSPIRE data specifications for natural risks, adding specific, detailed data about the fragment size distribution. The next step was to develop a WMS with GeoServer. A preliminary phase was the creation of several views in PostGIS to show the information at different visualization scales and with different degrees of detail. In the first view, sites are identified with a point, and basic information about the rockfall event is provided.
At the next zoom level, at medium scale, the convex hull of the rockfall appears with its real shape, and the source of the event and the fragments are represented by symbols. Queries at this level offer more detail about the movement. Finally, the third level shows all elements: deposit, source, and blocks, at their real size where possible and at their real locations. The last task was the publication of all the information on a web mapping site (www.rockdb.upc.edu), with data classified by levels, using JavaScript libraries such as OpenLayers.
Keywords: geological risk, web mapping, WMS, rockfalls
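A WMS client such as the OpenLayers viewer described above ultimately issues GetMap requests against GeoServer; a sketch of constructing such a request, with a placeholder endpoint and layer name (the real service parameters are not given in the abstract):

```python
# Building a WMS 1.3.0 GetMap request URL. Endpoint, layer name, CRS, and
# bounding box are hypothetical placeholders, not the project's real values.
from urllib.parse import urlencode

def getmap_url(base, layer, bbox, size=(512, 512), crs="EPSG:25831"):
    """Assemble a GetMap URL with the parameters required by WMS 1.3.0."""
    params = {
        "SERVICE": "WMS", "VERSION": "1.3.0", "REQUEST": "GetMap",
        "LAYERS": layer, "CRS": crs,
        "BBOX": ",".join(str(v) for v in bbox),
        "WIDTH": size[0], "HEIGHT": size[1],
        "FORMAT": "image/png", "TRANSPARENT": "TRUE",
    }
    return f"{base}?{urlencode(params)}"

url = getmap_url("https://rockdb.example.org/geoserver/wms",  # placeholder
                 "rockfalls:events",                          # placeholder
                 bbox=(400000, 4600000, 420000, 4620000))
print(url)
```

Serving each PostGIS view as its own WMS layer is what lets the client switch between the point, convex-hull, and full-detail representations as the user zooms.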
Procedia PDF Downloads 160
2354 Relevant LMA Features for Human Motion Recognition
Authors: Insaf Ajili, Malik Mallem, Jean-Yves Didier
Abstract:
Motion recognition from videos is a very complex task due to the high variability of motions. This paper describes the challenges of human motion recognition, especially the motion representation step with relevant features. Our descriptor vector is inspired by the Laban Movement Analysis method. We select discriminative features using the Random Forest algorithm in order to remove redundant features and make learning algorithms operate faster and more effectively. We validate our method on the MSRC-12 and UTKinect datasets.
Keywords: discriminative LMA features, features reduction, human motion recognition, random forest
Procedia PDF Downloads 195
2353 Extraction of Cellulose Nanocrystals from Soy Pods
Authors: Maycon dos Santos, Marivane Turim Koschevic, Karina Sayuri Ueda, Marcello Lima Bertuci, Farayde Matta Fackhouri, Silvia Maria Martelli
Abstract:
The use of cellulose nanocrystals as reinforcing agents in polymer nanocomposites is promising. In this study, we tested four different mercerization methods, divided into two stages. In the method that showed the best results, the sample was treated with 5% NaOH solution for 30 minutes at 50 °C in the first stage and with 30 vol H2O2 for 2 hours at 50 °C in the second stage. A positive result for the extraction of nanocrystals from the obtained sample was achieved when the solution was treated with 60% (w/w) H2SO4 for 1 hour at 50 °C. The results showed that it is possible to extract cellulose nanocrystals (CNCs) at low temperatures.
Keywords: soy pods, cellulose nanocrystals, temperature, acid concentration
Procedia PDF Downloads 297
2352 Towards Automatic Calibration of In-Line Machine Processes
Authors: David F. Nettleton, Elodie Bugnicourt, Christian Wasiak, Alejandro Rosales
Abstract:
In this presentation, preliminary results are given for the modeling and calibration of two different industrial winding MIMO (Multiple Input Multiple Output) processes using machine learning techniques. In contrast to previous approaches, which have typically used ‘black-box’ linear statistical methods together with a definition of the mechanical behavior of the process, we use non-linear machine learning algorithms together with a ‘white-box’ rule induction technique to create a supervised model of the fitting error between the expected and real force measures. The final objective is to build a precise model of the winding process in order to control the tension of the material being wound in the first case, and the friction of the material passing through the die in the second case. Case 1, Tension Control of a Winding Process: a plastic web is unwound from a first reel, goes over a traction reel, and is rewound on a third reel. The objectives are: (i) to train a model to predict the web tension and (ii) calibration, to find the input values which result in a given tension. Case 2, Friction Force Control of a Micro-Pullwinding Process: a core+resin passes through a first die, then two winding units wind an outer layer around the core, followed by a final pass through a second die. The objectives are: (i) to train a model to predict the friction on die2; (ii) calibration, to find the input values which result in a given friction on die2. Different machine learning approaches are tested to build the models: Kernel Ridge Regression, Support Vector Regression (with a Radial Basis Function kernel), and MPART (rule induction with a continuous value as output). As a preliminary step, the MPART rule induction algorithm was used to build an explicative model of the error (the difference between the expected and real friction on die2). The modeling of the error behavior using explicative rules is used to help improve the overall process model.
Once the models are built, the inputs are calibrated by generating Gaussian random numbers for each input (taking into account its mean and standard deviation) and comparing the output to a target (desired) output until the closest fit is found. The results of empirical testing show that a high precision is obtained for the trained models and for the calibration process. The learning step is the slowest part of the process (max. 5 minutes for this data), but this can be done offline just once. The calibration step is much faster, obtaining in under one minute a precision error of less than 1x10-3 for both outputs. To summarize, in the present work two processes have been modeled and calibrated. A fast processing time and high precision have been achieved, which can be further improved by using heuristics to guide the Gaussian calibration. Error behavior has been modeled to help improve the overall process understanding. This has relevance for the quick optimal set-up of many different industrial processes which use a pull-winding type process to manufacture fibre-reinforced plastic parts. Acknowledgements to the Openmind project, which is funded by Horizon 2020 European Union funding for Research & Innovation, Grant Agreement number 680820.
Keywords: data model, machine learning, industrial winding, calibration
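A minimal sketch of the Gaussian calibration loop described above, assuming a toy stand-in for the trained process model (the real one would be the trained kernel ridge / SVR / MPART model) and illustrative input statistics:

```python
import numpy as np

rng = np.random.default_rng(42)

# Stand-in for the trained process model: maps two inputs (e.g. line speed
# and brake setting) to one output (e.g. web tension). This function is an
# assumption for illustration only.
def process_model(x):
    return 3.0 * x[..., 0] + 0.5 * x[..., 1] ** 2

# Observed mean and standard deviation of each input, and the target output.
means = np.array([2.0, 1.0])
stds = np.array([0.5, 0.3])
target = 7.5

# Draw Gaussian candidates for each input and keep the closest fit.
candidates = rng.normal(means, stds, size=(100_000, 2))
errors = np.abs(process_model(candidates) - target)
best = candidates[np.argmin(errors)]

print(best, errors.min())
```

In practice the candidates would also be clamped to physically feasible ranges, and the heuristics mentioned in the abstract would replace this blind sampling.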
Procedia PDF Downloads 241
2351 A Two-Step Framework for Unsupervised Speaker Segmentation Using BIC and Artificial Neural Network
Authors: Ahmad Alwosheel, Ahmed Alqaraawi
Abstract:
This work proposes a new speaker segmentation approach for two speakers. It is an online approach that does not require prior information about speaker models. It has two phases: in the first phase, a conventional approach such as unsupervised BIC-based segmentation is used to detect speaker changes and train a neural network, while in the second phase, the trained parameters from the neural network are used to classify the next incoming audio stream. Using this approach, accuracy comparable to similar BIC-based approaches is achieved with a significant improvement in computation time.
Keywords: artificial neural network, diarization, speaker indexing, speaker segmentation
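The first-phase BIC test described above can be sketched for one-dimensional features as follows; the penalty weight and the synthetic two-speaker stream are illustrative assumptions (real systems use multi-dimensional MFCC vectors and full covariance matrices):

```python
import numpy as np

rng = np.random.default_rng(1)

def delta_bic(x, t, lam=1.0):
    """ΔBIC for splitting the 1-D feature sequence x at index t.

    Positive values favour modelling x[:t] and x[t:] as two separate
    Gaussians, i.e. they suggest a speaker change at t.
    """
    n = len(x)
    full = n * np.log(x.var())
    parts = t * np.log(x[:t].var()) + (n - t) * np.log(x[t:].var())
    penalty = lam * 0.5 * 2 * np.log(n)  # 2 extra parameters: mean and variance
    return 0.5 * (full - parts) - penalty

# Synthetic stream: speaker A then speaker B with a shifted mean feature.
x = np.concatenate([rng.normal(0.0, 1.0, 300), rng.normal(2.5, 1.0, 300)])

# Scan candidate change points and pick the strongest one.
cands = range(50, len(x) - 50)
scores = [delta_bic(x, t) for t in cands]
change = list(cands)[int(np.argmax(scores))]
print(change, max(scores))
```

In the proposed framework, the detections produced this way label the data used to train the neural network for the second phase.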
Procedia PDF Downloads 502
2350 The Roles of Education, Policies and Technologies in the Globalization Processes of Creative Industry
Authors: Eureeka Haishang Wu
Abstract:
The creative industry has been recognized as a top priority in many nations for decades, as through globalization processes culture can be turned into economic value by the creative industry to develop economies. From a non-economic perspective, the creative industry supports national identity, enhances global exposure, and improves international relations. In order to enable the globalization processes of the creative industry, a three-step approach was proposed to align education, policies, and technologies into a transformation platform and eventually to achieve a common model of global collaboration.
Keywords: creative industry, education, policies, technologies, collaboration, globalization
Procedia PDF Downloads 343
2349 A Regulatory Analysis on Legal Problems of BitCoin
Authors: Fady Tawakol
Abstract:
BitCoin is a decentralized cryptocurrency that can be used to accomplish any e-commerce trade without the need for traditional central banks. The use of such a currency could facilitate new economic interactions and linkages. However, without effective and efficient regulations, cryptocurrency transactions are mostly used by criminals to commit crimes such as money laundering, theft, and blackmailing. Because law is one step behind technological developments, this paper discusses the importance of regulation and supervision for the BitCoin system, to provide unified regulatory solutions for our digital future in the Middle East. It provides a detailed analysis of the legal nature of BitCoin along with its regulation with respect to criminal and civil law.
Keywords: BitCoin, financial protection, crypto currency, money laundering
Procedia PDF Downloads 209
2348 Identification and Optimisation of South Africa's Basic Access Road Network
Authors: Diogo Prosdocimi, Don Ross, Matthew Townshend
Abstract:
Road authorities are mandated within limited budgets to both deliver improved access to basic services and facilitate economic growth. This responsibility is further complicated if maintenance backlogs and funding shortfalls exist, as evident in many countries including South Africa. These conditions require authorities to make difficult prioritisation decisions, with the effect that Road Asset Management Systems with a one-dimensional focus on traffic volumes may overlook the maintenance of low-volume roads that provide isolated communities with vital access to basic services. Given these challenges, this paper overlays the full South African road network with geo-referenced information for population, primary and secondary schools, and healthcare facilities to identify the network of connective roads between communities and basic service centres. This connective network is then rationalised according to the Gross Value Added and number of jobs per mesozone, administrative and functional road classifications, speed limit, and road length, location, and name to estimate the Basic Access Road Network. A two-step floating catchment area (2SFCA) method, capturing a weighted assessment of drive-time to service centres and the ratio of people within a catchment area to teachers and healthcare workers, is subsequently applied to generate a Multivariate Road Index. This Index is used to assign higher maintenance priority to roads within the Basic Access Road Network that provide more people with better access to services. The relatively limited incidence of Basic Access Roads indicates that authorities could maintain the entire estimated network without exhausting the available road budget before practical economic considerations get any purchase. 
Despite this fact, a final case study modelling exercise is performed for the Namakwa District Municipality to demonstrate the extent to which optimal relocation of schools and healthcare facilities could minimise the Basic Access Road Network and thereby release budget for investment in roads that best promote GDP growth.
Keywords: basic access roads, multivariate road index, road prioritisation, two-step floating catchment area method
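A minimal sketch of the 2SFCA computation described above, with toy drive times and an assumed 30-minute catchment. The paper additionally weights drive time within the catchment; this unweighted version only shows the two steps:

```python
# Minimal two-step floating catchment area (2SFCA) sketch with toy data.
# Drive times, populations, and staff counts are illustrative, not the
# paper's South African data; the catchment threshold is an assumption.

# drive_time[i][j]: minutes from population mesozone i to facility j.
drive_time = [
    [10, 45, 80],
    [25, 15, 60],
    [70, 30, 20],
]
population = [1000, 2000, 500]   # people per mesozone
staff = [5, 10, 4]               # teachers / health workers per facility
THRESHOLD = 30                   # catchment: reachable within 30 minutes

n_zones, n_fac = len(population), len(staff)

# Step 1: supply-to-demand ratio R_j for each facility over its catchment.
ratio = []
for j in range(n_fac):
    demand = sum(population[i] for i in range(n_zones)
                 if drive_time[i][j] <= THRESHOLD)
    ratio.append(staff[j] / demand if demand else 0.0)

# Step 2: accessibility A_i sums the ratios of all facilities a zone reaches.
access = [sum(ratio[j] for j in range(n_fac)
              if drive_time[i][j] <= THRESHOLD)
          for i in range(n_zones)]
print(access)
```

Zones with low accessibility scores for both teachers and health workers would be the ones whose connecting roads receive a higher Multivariate Road Index.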
Procedia PDF Downloads 231
2347 An Intelligent Text Independent Speaker Identification Using VQ-GMM Model Based Multiple Classifier System
Authors: Ben Soltane Cheima, Ittansa Yonas Kelbesa
Abstract:
Speaker Identification (SI) is the task of establishing the identity of an individual based on his or her voice characteristics. The SI task is typically achieved by two-stage signal processing: training and testing. The training process calculates speaker-specific feature parameters from the speech and generates speaker models accordingly. In the testing phase, speech samples from unknown speakers are compared with the models and classified. Even though the performance of speaker identification systems has improved due to recent advances in speech processing techniques, there is still a need for improvement. In this paper, a Closed-Set Text-Independent Speaker Identification System (CISI) based on a Multiple Classifier System (MCS) is proposed, using Mel Frequency Cepstrum Coefficients (MFCC) for feature extraction and a suitable combination of Vector Quantization (VQ) and the Gaussian Mixture Model (GMM), together with the Expectation Maximization (EM) algorithm, for speaker modeling. The use of a Voice Activity Detector (VAD) with a hybrid approach based on Short Time Energy (STE) and statistical modeling of background noise in the pre-processing step of feature extraction yields a better and more robust automatic speaker identification system. Investigation of the Linde-Buzo-Gray (LBG) clustering algorithm for initializing the GMM parameters in the EM step also improved the convergence rate and system performance. The system likewise uses a relative index as a confidence measure in case of contradiction between the GMM and VQ identification results.
Simulation results carried out on the voxforge.org speech database using MATLAB highlight the efficacy of the proposed method compared to earlier work.
Keywords: feature extraction, speaker modeling, feature matching, Mel frequency cepstrum coefficient (MFCC), Gaussian mixture model (GMM), vector quantization (VQ), Linde-Buzo-Gray (LBG), expectation maximization (EM), pre-processing, voice activity detection (VAD), short time energy (STE), background noise statistical modeling, closed-set text-independent speaker identification system (CISI)
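The LBG initialization mentioned above can be sketched as follows; the toy "MFCC" data and split parameters are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(7)

def lbg_codebook(X, size, eps=0.01, iters=10):
    """Linde-Buzo-Gray: grow a VQ codebook by repeated splitting + refinement.

    Starts from the global mean, doubles the codebook by perturbing each
    centroid by a factor (1 ± eps), then refines with a few Lloyd (k-means)
    iterations. The result can seed the GMM means before EM.
    """
    codebook = X.mean(axis=0, keepdims=True)
    while len(codebook) < size:
        codebook = np.vstack([codebook * (1 + eps), codebook * (1 - eps)])
        for _ in range(iters):
            # Assign each vector to its nearest centroid.
            d = np.linalg.norm(X[:, None, :] - codebook[None, :, :], axis=2)
            labels = d.argmin(axis=1)
            # Re-estimate centroids (keep the old one if a cell is empty).
            for k in range(len(codebook)):
                if np.any(labels == k):
                    codebook[k] = X[labels == k].mean(axis=0)
    return codebook

# Toy stand-in for MFCC vectors, drawn from four clusters.
centers = np.array([[0, 0], [5, 0], [0, 5], [5, 5]], dtype=float)
X = np.vstack([rng.normal(c, 0.3, size=(100, 2)) for c in centers])
codebook = lbg_codebook(X, size=4)
print(codebook)
```

Each resulting code vector can then seed one GMM component mean before running EM, which is the convergence benefit the abstract reports.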
Procedia PDF Downloads 309
2346 1D/3D Modeling of a Liquid-Liquid Two-Phase Flow in a Milli-Structured Heat Exchanger/Reactor
Authors: Antoinette Maarawi, Zoe Anxionnaz-Minvielle, Pierre Coste, Nathalie Di Miceli Raimondi, Michel Cabassud
Abstract:
Milli-structured heat exchanger/reactors have recently been widely used, especially in the chemical industry, due to their enhanced performance in heat and mass transfer compared to conventional apparatuses. In our work, the ‘DeanHex’ heat exchanger/reactor with a 2D-meandering channel is investigated both experimentally and numerically. The square cross-sectioned channel has a hydraulic diameter of 2 mm. The aim of our study is to model local physico-chemical phenomena (heat and mass transfer, axial dispersion, etc.) for a liquid-liquid two-phase flow in our lab-scale meandering channel, which represents the central part of the heat exchanger/reactor design. The numerical approach is based on a 1D model for the flow channel encapsulated in a 3D model for the surrounding solid, using COMSOL Multiphysics V5.5. The use of the 1D approach to model the milli-channel reduces the calculation time significantly compared to 3D approaches, which are generally focused on local effects. Our 1D/3D approach intends to bridge the gap between simulation at the small scale and simulation at the reactor scale at a reasonable CPU cost. The heat transfer process between the 1D milli-channel and its 3D surrounding is modeled. The feasibility of this 1D/3D coupling was verified by comparing simulation results to experimental ones originating from two previous works. Temperature profiles along the channel axis obtained by simulation fit the experimental profiles in both cases. The next step is to integrate the liquid-liquid mass transfer model and to validate it against our experimental results. The hydrodynamics of the liquid-liquid two-phase system is modeled using the ‘mixture model’ approach. The mass transfer behavior is represented by an overall volumetric mass transfer coefficient ‘kLa’ correlation obtained from our experimental results in the millimetric-size meandering channel.
The present work is a first step towards the scale-up of our ‘DeanHex’ in anticipation of the future industrialization of such equipment. Therefore, a generalized scaled-up model of the reactor comprising all the transfer processes will be built in order to predict the performance of the reactor in terms of conversion rate and energy efficiency at an industrial scale.
Keywords: liquid-liquid mass transfer, milli-structured reactor, 1D/3D model, process intensification
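The channel-side half of the 1D/3D coupling can be illustrated with a plain 1-D heat balance in which the wall temperature (supplied by the 3-D solid in the real model) is held fixed; all operating values below are illustrative assumptions, not DeanHex conditions:

```python
import numpy as np

# 1-D sketch of the channel heat balance: the fluid temperature T(x)
# relaxes toward the wall temperature along the channel, marching
# m_dot*cp*dT/dx = h*P*(T_wall - T) with an explicit scheme.

L = 1.0              # channel length, m (assumed)
n = 1000
dx = L / n
d_h = 2e-3           # hydraulic diameter, m (square 2 mm channel)
perimeter = 4 * d_h  # wetted perimeter of the square section, m
h = 5000.0           # wall heat transfer coefficient, W/m^2/K (assumed)
m_dot = 1e-3         # mass flow rate, kg/s (assumed)
cp = 4180.0          # heat capacity of water, J/kg/K
T_wall = 60.0        # wall temperature, °C (the 3-D solid would supply this)
T_in = 20.0          # inlet temperature, °C

T = np.empty(n + 1)
T[0] = T_in
for i in range(n):
    T[i + 1] = T[i] + dx * h * perimeter * (T_wall - T[i]) / (m_dot * cp)
print(T[-1])
```

In the actual 1D/3D scheme, T_wall varies along the channel and is solved simultaneously with conduction in the surrounding solid, which is what the COMSOL coupling provides.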
Procedia PDF Downloads 130
2345 Reliability-Simulation of Composite Tubular Structure under Pressure by Finite Elements Methods
Authors: Abdelkader Hocine, Abdelhakim Maizia
Abstract:
The exponential growth in the use of fiber-reinforced composite materials has prompted researchers to step up their work on the prediction of their reliability. Owing to differences between the properties of the materials used in the composite, the manufacturing processes, the load combinations, and the types of environment, the prediction of the reliability of composite materials has become a primary task. Using the Tsai-Wu and maximum stress failure criteria, this paper addresses the reliability of multilayer tubular structures under pressure, where the failure probability is estimated by the Monte Carlo method.
Keywords: composite, design, Monte Carlo, tubular structure, reliability
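The Monte Carlo step named above can be sketched for a single failure mode under the maximum stress criterion; the strength and stress statistics are illustrative assumptions, and a full analysis would evaluate the Tsai-Wu polynomial over every layer of the laminate:

```python
import numpy as np

rng = np.random.default_rng(3)

# Monte Carlo estimate of failure probability for one layer and one mode.
# Material strength and applied hoop stress are modelled as normal random
# variables with assumed (illustrative) statistics.
N = 1_000_000
strength = rng.normal(800.0, 60.0, N)  # layer strength, MPa
stress = rng.normal(600.0, 50.0, N)    # hoop stress from pressure, MPa

# Maximum stress criterion for this mode: failure when stress > strength.
failures = stress > strength
p_f = failures.mean()
print(p_f)
```

The same loop extends to the multilayer case by declaring failure when any layer violates either criterion in any sampled realization.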
Procedia PDF Downloads 464
2344 Analysis of Threats in Interoperability of Medical Devices
Authors: M. Sandhya, R. M. Madhumitha, Sharmila Sankar
Abstract:
Interoperable medical devices (IMDs) face threats due to the increased attack surface made accessible by interoperability and the corresponding infrastructure. Introducing networking and coordination functionalities fundamentally modifies the security properties of medical systems. Understanding the threats is a vital first step in ultimately crafting security solutions for such systems. The key to this problem is identifying the common types of threats and attacks against security and privacy, and providing this information as a roadmap. This paper analyses the security issues in the interoperability of devices and presents the main types of threats that have to be considered to build a secure system.
Keywords: interoperability, threats, attacks, medical devices
Procedia PDF Downloads 333
2343 Rural Livelihood under a Changing Climate Pattern in the Zio District of Togo, West Africa
Authors: Martial Amou
Abstract:
This study was carried out to assess the situation of households’ livelihood under a changing climate pattern in the Zio district of Togo, West Africa. The study examined three important aspects: (i) assessment of the households’ livelihood situation under a changing climate pattern, (ii) farmers’ perception and understanding of local climate change, and (iii) determinants of the adaptation strategies undertaken in cropping patterns in response to climate change. To this end, secondary sources of data and survey data collected from 235 farmers in four villages in the study area were used. An adapted version of DFID's Sustainable Livelihood Framework, a two-step Binary Logistic Regression Model, and descriptive statistics were used as methodological approaches. Based on the Sustainable Livelihood Approach (SLA), the various factors revolving around the livelihoods of the rural community were grouped into social, natural, physical, human, and financial capital. The study found that the households’ livelihood situation, represented by the overall livelihood index in the study area (34%), is below the standard average household livelihood security index (50%). The natural capital was found to be the poorest asset (13%), which will severely affect the sustainability of livelihoods in the long run. The results from the descriptive statistics and the first-step regression (selection model) indicated that most of the farmers in the study area have a clear understanding of climate change, even though they have no notion of greenhouse gases as the main cause behind the issue. From the second-step regression (output model), education, farming experience, access to credit, access to extension services, cropland size, membership of a social group, and distance to the nearest input market were found to be the significant determinants of the adaptation measures undertaken in cropping patterns by farmers in the study area.
Based on the results of this study, recommendations are made to farmers, policy makers, institutions, and development service providers in order to better target interventions which build, promote, or facilitate the adoption of adaptation measures with the potential to build resilience to climate change and thereby improve rural livelihoods.
Keywords: climate change, rural livelihood, cropping pattern, adaptation, Zio District
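The second-step (output-model) logit can be sketched as follows; the synthetic covariates stand in for determinants such as education and access to credit, and the plain gradient-ascent fit is a substitute for whatever statistical package the study actually used:

```python
import numpy as np

rng = np.random.default_rng(5)

# Binary logit sketch: does a household adopt an adaptation measure (1)
# or not (0)? Covariates and true coefficients are synthetic stand-ins.
n = 5000
X = np.column_stack([np.ones(n),              # intercept
                     rng.normal(size=n),      # e.g. years of education
                     rng.integers(0, 2, n)])  # e.g. access to credit (0/1)
true_beta = np.array([-0.5, 1.0, 0.8])
p = 1.0 / (1.0 + np.exp(-X @ true_beta))
y = (rng.random(n) < p).astype(float)

# Fit by gradient ascent on the average log-likelihood.
beta = np.zeros(3)
for _ in range(2000):
    mu = 1.0 / (1.0 + np.exp(-X @ beta))
    beta += 0.1 * X.T @ (y - mu) / n
print(beta)
```

Positive recovered coefficients correspond to determinants that raise the probability of adopting an adaptation measure, which is how the abstract's significant determinants would be read.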
Procedia PDF Downloads 325
2342 Durability Analysis of a Knuckle Arm Using VPG System
Authors: Geun-Yeon Kim, S. P. Praveen Kumar, Kwon-Hee Lee
Abstract:
A steering knuckle arm is the component that connects the steering system and the suspension system. Structural performances such as stiffness, strength, and durability are considered in its design process. A former study suggested a lightweight design of a knuckle arm considering the structural performances and using metamodel-based optimization. Six shape design variables were defined, and the optimum design was calculated by applying the kriging interpolation method. The finite element method was utilized to predict the structural responses. The suggested knuckle was made of the aluminum alloy Al6082, and its weight was reduced by about 60% in comparison with the base steel knuckle, satisfying the design requirements. We then investigated its manufacturability by performing forging analysis. The forging was done as a hot process, and the product was made through two-step forging. As the final step of the development process, the durability is investigated by using the flexible dynamic analysis software LS-DYNA and the pre- and post-processor eta/VPG. Generally, a carmaker does not share all vehicle information with the part manufacturer. Thus, the part manufacturer is limited in predicting the durability performance at the full-car level. The eta/VPG has libraries of suspensions, tires, and roads, which are commonly used parts, enabling full-car modeling. First, the full car is modeled by referencing the following information; Overall Length: 3,595 mm, Overall Width: 1,595 mm, CVW (Curb Vehicle Weight): 910 kg, Front Suspension: MacPherson Strut, Rear Suspension: Torsion Beam Axle, Tire: 235/65R17. Second, the road is selected as cobblestone; the road condition of cobblestone is almost 10 times more severe than that of a usual paved road. Third, dynamic finite element analysis using LS-DYNA is performed to predict the durability performance of the suggested knuckle arm.
The life of the suggested knuckle arm is calculated as 350,000 km, which satisfies the design requirement set by the part manufacturer. In this study, the overall design process of a knuckle arm is suggested, and it can be seen that the developed knuckle arm satisfies the durability requirement at the full-car level. The VPG analysis is performed successfully even though it does not give an exact prediction, since the full-car model is a rough one. Thus, this approach can be used effectively when detailed full-car data are not available.
Keywords: knuckle arm, structural optimization, Metamodel, forging, durability, VPG (Virtual Proving Ground)
Procedia PDF Downloads 419
2341 Identification of COVID-SARS Variants Based on Lactate Test Results
Authors: Zoltan Horvath, Dora Nagy
Abstract:
In this research, it was examined whether individual COVID variants cause differences in the lactate curves of cyclists, since the virus variants attacked different organs in the body during infection. During our tests, we used a traditional lactate step test, the results of which were compared with the values recorded before infection. The tests demonstrated that different virus variants produce distinct lactate curves. In this way, based on the lactate curve, it is possible to identify which variant caused the disease. This has shortened the return-to-training time, because the best post-infection return protocol can be applied to each competitor.
Keywords: COVID-Sars19, lactate, virus mutation, lactate profile
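The pre- versus post-infection comparison of step-test results can be illustrated by interpolating the power at the common 4 mmol/L lactate threshold; all lactate and power values below are illustrative, not the study's measurements:

```python
# Sketch of comparing a rider's lactate step-test curve before and after
# infection via the power at the 4 mmol/L lactate threshold.

def power_at_threshold(powers, lactates, threshold=4.0):
    """Linear interpolation of the power at a given blood lactate level."""
    for (p0, l0), (p1, l1) in zip(zip(powers, lactates),
                                  zip(powers[1:], lactates[1:])):
        if l0 <= threshold <= l1:
            return p0 + (p1 - p0) * (threshold - l0) / (l1 - l0)
    raise ValueError("threshold not crossed in this test")

steps = [150, 200, 250, 300, 350]   # watts per stage
pre   = [1.2, 1.6, 2.4, 4.1, 7.0]   # mmol/L before infection (illustrative)
post  = [1.4, 2.0, 3.3, 5.6, 9.0]   # mmol/L after infection (illustrative)

shift = power_at_threshold(steps, pre) - power_at_threshold(steps, post)
print(shift)  # positive shift: threshold power lost after infection
```

A variant-specific signature would show up not just in this single threshold shift but in the shape of the whole post-infection curve relative to the baseline.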
Procedia PDF Downloads 66