Search results for: search algorithms
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 3773

2693 Emergency Multidisciplinary Continuing Care Case Management

Authors: Mekroud Amel

Abstract:

Emergency departments are known for their heavy workload, the variety of pathologies treated, and the difficulties of managing a continuous influx of patients. This work demonstrates the role of our service in the management of patients presenting two or three mild-to-moderate organ failures involving several disciplines at once, as well as the effect of this management on the skills and efficiency of our team. We describe borderline cases spanning two, three, or more disciplines, with instability of a vital function, that were successfully managed in the emergency room; the therapeutic procedures adopted; and the consequences for the quality and level of care delivered by our team, along with the logistical and pedagogical consequences. These consequences were positive for the emergency teams and negative only in rare situations. The clinical situations concerned include the entanglement of hemodynamic distress (right, left, or global involvement, tamponade, low output with acute pulmonary edema, and/or shock) with respiratory distress (more or less profound hypoxemia, impaired haematosis related to a bacterial or viral lung infection, pleurisy, pneumothorax, or a bronchoconstrictive crisis); with neurological disorders such as recent stroke or a comatose state; and with metabolic disorders such as hyperkalaemia, renal insufficiency, severe ionic disorders, or accidents with anti-vitamin K agents, with or without septate effusion of one or more serous membranes and with or without tamponade. This is a retrospective, monocentric, descriptive study covering the period from 05.01.2022 to 10.31.2022. The purpose of our work was to search for a statistically significant link between the type of moderate-to-severe, multivisceral pathology managed in the emergency room and the efficiency of the healthcare team and the level of care offered to patients. Statistical test used: the chi-square (Chi2) test, to establish the significant link between the resolution of serious multidisciplinary cases in the emergency room and the effectiveness of the team in the management of complicated cases. The management of the clinical cases most difficult for organ specialties has given general practitioner emergency teams a broad perspective and improved their efficiency in handling the emergencies received.
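
A minimal sketch of the chi-square test named above, run with SciPy on an illustrative 2x2 contingency table (case type versus successful resolution); the counts are stand-ins, not the study's data:

```python
# Chi-square test of independence on an illustrative contingency table;
# the counts below are placeholders, not the study's data.
from scipy.stats import chi2_contingency

#                 resolved  not resolved
table = [[42, 8],            # multidisciplinary cases
         [30, 20]]           # single-discipline cases
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2={chi2:.2f}, p={p:.4f}")  # p < 0.05 -> significant link
```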

Keywords: emergency care teams, management of patients with dysfunction of more than one organ, learning curve, quality of care

Procedia PDF Downloads 80
2692 Renewable Energy and Hydrogen On-Site Generation for Drip Irrigation and Agricultural Machinery

Authors: Javier Carroquino, Nieves García-Casarejos, Pilar Gargallo, F. Javier García-Ramos

Abstract:

The energy used in agriculture is a source of global greenhouse gas emissions. The two main forms of this energy are electricity for pumping and diesel for agricultural machinery. In order to reduce these emissions, the European project LIFE REWIND addresses the supply of this demand from renewable sources. First of all, comprehensive data on energy demand and available renewable resources were obtained in several case studies. Secondly, a set of simulations and optimizations was performed in search of the best configuration and sizing, from both an economic and an emission-reduction point of view; for this purpose, software based on genetic algorithms was used. Thirdly, a prototype was designed and installed, which is being used for validation in a real case. Finally, throughout a year of operation, various technical and economic parameters are being measured for further analysis. The prototype is not connected to the utility grid, avoiding the cost and environmental impact of a grid extension. The system includes three kinds of photovoltaic fields: one is mounted on a fixed structure on the terrain, another floats on an irrigation raft, and the last is mounted on a two-axis solar tracker. Each has its own solar inverter, and the total nominal power is 44 kW. Energy storage is provided by a lead-acid battery with 120 kWh of capacity. Three stand-alone inverters support a three-phase, 400 V, 50 Hz micro-grid with the same characteristics as the utility grid. An advanced control subsystem has been built using free hardware and software. The electricity produced feeds a set of seven pumps used for purification, elevation, and pressurization of water in a drip irrigation system located in a vineyard. Since the irrigation season does not span the whole year, and the generator is slightly oversized, there is a surplus of energy. With this surplus, an electrolyser produces hydrogen on site by electrolysis of water. A fuel-cell off-road vehicle runs on that hydrogen and carries people around the vineyard; the only emission of the process is high-purity water. On the one hand, the results show the technical and economic feasibility of stand-alone renewable energy systems for seasonal pumping. In this way, the economic costs, environmental impacts, and landscape impacts of grid extensions are avoided, as are diesel gensets and their associated emissions. On the other hand, it is shown that diesel in agricultural machinery can be replaced with electricity or hydrogen of 100% renewable origin, produced on the farm itself without any external energy input. In addition, positive effects on the rural economy and employment are expected, which will be quantified through interviews.

Keywords: drip irrigation, greenhouse gases, hydrogen, renewable energy, vineyard

Procedia PDF Downloads 343
2691 3D Classification Optimization of Low-Density Airborne Light Detection and Ranging Point Cloud by Parameters Selection

Authors: Baha Eddine Aissou, Aichouche Belhadj Aissa

Abstract:

Light detection and ranging (LiDAR) is an active remote sensing technology used for several applications. Airborne LiDAR is becoming an important technology for the acquisition of highly accurate dense point clouds. Classification of an airborne laser scanning (ALS) point cloud is a very important task that remains a real challenge for many scientists. Support vector machine (SVM) is one of the most used statistical learning algorithms based on kernels. SVM is a non-parametric method, and it is recommended in cases where the data distribution cannot be well modeled by a standard parametric probability density function. Using a kernel, it performs robust non-linear classification of samples, which matters because data are rarely linearly separable. SVMs map the data into a higher-dimensional space where they become linearly separable, while the kernel trick allows all computations to be performed in the original space. This is one of the main reasons that SVMs are well suited to high-dimensional classification problems. Only a few training samples, called support vectors, are required. SVM has also shown its potential to cope with uncertainty in data caused by noise and fluctuation, and it is computationally efficient compared to several other methods. Such properties are particularly suited to remote sensing classification problems and explain their recent adoption. In this poster, SVM classification of ALS LiDAR data is proposed. Firstly, connected component analysis is applied to cluster the point cloud. Secondly, the resulting clusters are fed into the SVM classifier. The radial basis function (RBF) kernel is used due to the small number of parameters (C and γ) that need to be chosen, which decreases the computation time. In order to optimize the classification rates, parameter selection is explored: it consists of finding the parameters (C and γ) leading to the best overall accuracy using grid search and 5-fold cross-validation. The LiDAR point cloud used is provided by the German Society for Photogrammetry, Remote Sensing, and Geoinformation. The ALS data are characterized by a low density (4-6 points/m²) and cover an urban area located in residential parts of the city of Vaihingen in southern Germany. The ground class and three classes of roof superstructures are considered, i.e., a total of four classes. The training and test sets were selected randomly several times. The obtained results demonstrated that parameter selection can orient the search within a restricted interval of (C and γ) that can be further explored, but it does not systematically lead to the optimal rates. The SVM classifier with tuned hyper-parameters is compared with the classifiers most used in the literature for LiDAR data: random forest, AdaBoost, and decision tree. The comparison showed the superiority of the SVM classifier with parameter selection for LiDAR data over the other classifiers.
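
A minimal sketch of the (C, γ) grid search with 5-fold cross-validation described above, using scikit-learn; the feature matrix here is synthetic, whereas the paper's features come from connected-component clusters of the ALS point cloud:

```python
# Grid search over (C, gamma) for an RBF-kernel SVM with 5-fold CV;
# make_classification stands in for the real point-cloud features.
from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC

X, y = make_classification(n_samples=500, n_features=10, n_classes=4,
                           n_informative=6, random_state=0)
param_grid = {"C": [0.1, 1, 10, 100, 1000],
              "gamma": [1e-4, 1e-3, 1e-2, 1e-1, 1]}
search = GridSearchCV(SVC(kernel="rbf"), param_grid, cv=5, scoring="accuracy")
search.fit(X, y)
print(search.best_params_, search.best_score_)
```

As the abstract notes, the best grid cell only narrows the (C, γ) interval worth refining; it does not guarantee the globally optimal rates.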

Keywords: classification, airborne LiDAR, parameters selection, support vector machine

Procedia PDF Downloads 147
2690 The Relationship between Body Positioning and Badminton Smash Quality

Authors: Gongbing Shan, Shiming Li, Zhao Zhang, Bingjun Wan

Abstract:

Badminton originated in ancient civilizations in Europe and Asia more than 2000 years ago. Presently, it is played almost everywhere, with an estimated 220 million people playing badminton regularly, ranging from professionals to recreational players, and it is the second most played sport in the world after soccer. In Asia, the popularity of badminton and the number of people involved surpass soccer. Unfortunately, scientific research on badminton skills is hardly proportional to badminton's popularity: a search of the literature has shown that the body of biomechanical investigations is relatively small. One of the dominant skills in badminton is the forehand overhead smash, which accounts for about one fifth of attacks during games. Empirical evidence shows that one has to adjust body position in relation to the incoming shuttlecock to produce a powerful and accurate smash. Therefore, positioning is a fundamental aspect influencing smash quality, yet the literature shows a dearth of studies on it. The goals of this study were to determine the influence of positioning and training experience on smash quality, in order to discover information that could help in learning and acquiring the skill. Using a 10-camera 3D motion capture system (VICON MX, 200 frames/s) and a 15-segment full-body biomechanical model, 14 skilled and 15 novice players were measured and analyzed. Results revealed that body positioning directly influences the quality of a smash, especially the shuttlecock release angle and the clearance height (passing over the net) achieved by offensive players. The results also suggest that, for training proper positioning, one could adopt a self-selected comfortable position towards a statically hung shuttlecock and then step one foot back, a practical reference marker for learning. This perceptual marker could be applied in guiding the learning and training of beginners. As one gains experience through repetitive training, improved limb coordination would further increase smash quality. The researchers hope that the findings will help practitioners develop effective training programs for beginners.

Keywords: 3D motion analysis, biomechanical modeling, shuttlecock release speed, shuttlecock release angle, clearance height

Procedia PDF Downloads 498
2689 Convolutional Neural Networks Architecture Analysis for Image Captioning

Authors: Jun Seung Woo, Shin Dong Ho

Abstract:

Image captioning models with attention technology have developed significantly compared to previous models, but they are still unsatisfactory in recognizing images. We perform an extensive search over seven Convolutional Neural Network (CNN) architectures to analyze the behavior of different models for image captioning. We compared the seven CNN architectures, according to batch size, on a public benchmark: the MS-COCO dataset. In our experiments, DenseNet and InceptionV3 reached about 14% loss with about 160 s of training time per epoch. This was the most satisfactory result among the seven CNN architectures after training for 50 epochs on a GPU.

Keywords: deep learning, image captioning, CNN architectures, densenet, inceptionV3

Procedia PDF Downloads 133
2688 Thermal Conductivity and Diffusivity of Alternative Refrigerants as Retrofit for Freon 12

Authors: Mutalubi Aremu Akintunde, John Isa

Abstract:

Because of the negative atmospheric impact of chlorofluorocarbon (CFC) refrigerants, radical changes and measures were put in place to replace them. This has led to a search for alternative refrigerants over the past decades. This paper presents the thermal conductivity, diffusivity, and performance of two alternative refrigerants as replacements for R12, a versatile refrigerant that turned the refrigeration industry around for decades but is one of the offending refrigerants. The new refrigerants were coded RA1 (50% R600a / 50% R134a) and RA2 (70% R600a / 30% R134a). The diffusivities for RA1 and RA2 were estimated to be 2.76384 × 10⁻⁸ m²/s and 2.74386 × 10⁻⁸ m²/s respectively, while that of R12 under the same experimental conditions is 2.43772 × 10⁻⁸ m²/s. The performance of the two refrigerants in a refrigerator initially designed for R12 was very close to that of R12. Other thermodynamic parameters showed that R12 can be replaced with either RA1 or RA2.
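
For context, thermal diffusivity relates the measured conductivity to the density and specific heat of the refrigerant; the abstract reports the diffusivities directly, so the standard relation below is added only as a reminder:

```latex
% Thermal diffusivity from conductivity, density, and specific heat.
\alpha = \frac{k}{\rho \, c_p}
```

where k is the thermal conductivity, ρ the density, and c_p the specific heat at constant pressure.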

Keywords: alternative refrigerants, conductivity, diffusivity, performance, refrigerants

Procedia PDF Downloads 162
2687 Maximum Power Point Tracking Using FLC Tuned with GA

Authors: Mohamed Amine Haraoubia, Abdelaziz Hamzaoui, Najib Essounbouli

Abstract:

The pursuit of maximum power point tracking (MPPT) has led to the development of many kinds of controllers, one of which is the fuzzy logic controller, which has proven its worth. To tune this controller further, this paper discusses and analyzes the use of genetic algorithms to tune the fuzzy logic controller. It provides an introduction to both systems and tests their compatibility and performance.
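
As a rough illustration of the idea (not the authors' controller), the sketch below uses a real-coded genetic algorithm to tune the membership-function widths of a three-rule fuzzy step controller climbing a synthetic PV power curve; the curve, the rule base, and all GA settings are assumptions for demonstration:

```python
# A real-coded GA tunes the membership-function widths (genes) of a
# three-rule fuzzy step controller that climbs a synthetic PV power curve.
# The curve, rule base, and GA settings are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

def pv_power(v):
    # Synthetic PV power curve with its maximum power (100 W) at v = 18 V.
    return np.maximum(100.0 - (v - 18.0) ** 2, 0.0)

def flc_step(slope, genes):
    # genes = widths of the NEG / ZERO / POS fuzzy sets over dP/dV.
    a, b, c = np.abs(genes) + 1e-6
    mu_neg = np.clip(-slope / a, 0.0, 1.0)        # slope < 0: step left
    mu_zero = np.clip(1.0 - abs(slope) / b, 0.0, 1.0)
    mu_pos = np.clip(slope / c, 0.0, 1.0)         # slope > 0: step right
    # Sugeno-style defuzzification with singleton consequents -1, 0, +1.
    return (mu_pos - mu_neg) / (mu_neg + mu_zero + mu_pos + 1e-9)

def fitness(genes, v0=5.0, iters=60):
    v = v0
    for _ in range(iters):
        slope = (pv_power(v + 0.1) - pv_power(v - 0.1)) / 0.2
        v += flc_step(slope, genes)
    return -abs(pv_power(v) - 100.0)              # distance from the true MPP

pop = rng.uniform(0.1, 20.0, size=(30, 3))
for _ in range(40):
    scores = np.array([fitness(ind) for ind in pop])
    elite = pop[np.argsort(scores)[::-1][:10]]    # selection
    parents = elite[rng.integers(0, 10, size=(30, 2))]
    pop = parents.mean(axis=1) + rng.normal(0.0, 0.5, (30, 3))  # crossover + mutation
    pop[0] = elite[0]                             # elitism
print("best genes:", pop[0], "fitness:", fitness(pop[0]))
```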

Keywords: fuzzy logic controller, fuzzy logic, genetic algorithm, maximum power point, maximum power point tracking

Procedia PDF Downloads 373
2686 The Role of Geodiversity in Earthquake Risk Management Strategies in Haiti

Authors: Djimy Dolcin

Abstract:

Haiti faces a serious seismic threat due to its geographical location and geodynamic context, and the vulnerability of the population is aggravated by the occupation of areas highly exposed to this threat. This work therefore presents an analysis of seismic risk management in Haiti in the context of geodiversity and its potential for understanding risk. To carry out this work, a bibliographical search was conducted on the subject. In light of this state of affairs, we find that implementing information and education strategies aimed at the population, which until now has been unaware of the danger it faces, is a fundamental obligation.

Keywords: geodiversity, earthquake risk management, Haiti, earthquake risk

Procedia PDF Downloads 3
2685 A Method of the Semantic on Image Auto-Annotation

Authors: Lin Huo, Xianwei Liu, Jingxiong Zhou

Abstract:

Recently, due to the semantic gap between image visual features and human concepts, semantic image auto-annotation has become an important topic. Annotation by search is a popular method: low-level visual features are extracted from the image and mapped, by a corresponding hash method, into hash codes, which are finally transformed into binary strings and stored. We use this scheme to design and implement a method of semantic image auto-annotation. Tests based on the Corel image set show that the method is effective.
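
A minimal sketch of the annotate-by-search pipeline described above; random-projection hashing stands in for the paper's hash method and random vectors stand in for the color-correlogram features, so all names and sizes are illustrative:

```python
# Annotation by search with binary hash codes: hash the query feature,
# find Hamming-nearest database images, transfer their labels.
import numpy as np

rng = np.random.default_rng(42)
dim, bits = 64, 32
planes = rng.normal(size=(bits, dim))          # fixed random hyperplanes

def hash_code(feature):
    return (planes @ feature > 0).astype(np.uint8)  # 32-bit binary code

# Database: features of already-annotated images plus their labels.
db_feats = rng.normal(size=(1000, dim))
db_labels = [f"label_{i % 10}" for i in range(1000)]
db_codes = np.array([hash_code(f) for f in db_feats])

def annotate(query_feature, k=5):
    q = hash_code(query_feature)
    dists = (db_codes != q).sum(axis=1)        # Hamming distance
    nearest = np.argsort(dists)[:k]
    return [db_labels[i] for i in nearest]     # neighbours' labels as annotations

print(annotate(rng.normal(size=dim)))
```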

Keywords: image auto-annotation, color correlograms, Hash code, image retrieval

Procedia PDF Downloads 497
2684 Transforming Data Science Curriculum Through Design Thinking

Authors: Samar Swaid

Abstract:

Today, corporations are moving toward the adoption of design-thinking techniques to develop products and services, putting the consumer at the heart of the development process. One of the leading companies in design thinking, IDEO (Innovation, Design, Engineering Organization), defines design thinking as an approach to problem-solving that relies on a set of multi-layered skills, processes, and mindsets that help people generate novel solutions to problems. Design thinking may result in new ideas, narratives, objects, or systems. It is about redesigning systems, organizations, infrastructures, processes, and solutions in an innovative fashion based on users' feedback. Tim Brown, president and CEO of IDEO, sees design thinking as a human-centered approach that draws from the designer's toolkit to integrate people's needs, innovative technologies, and business requirements. The application of design thinking has proven to be the road to developing innovative applications, interactive systems, scientific software, and healthcare applications, and even to rethinking business operations, as in the case of Airbnb. Recently, there has been a movement to apply design thinking to machine learning and artificial intelligence to ensure creating the "wow" effect on consumers. The Association for Computing Machinery task force on data science programs states that "Data scientists should be able to implement and understand algorithms for data collection and analysis. They should understand the time and space considerations of algorithms. They should follow good design principles developing software, understanding the importance of those principles for testability and maintainability." However, this definition hides the user behind the machine who works on data preparation, algorithm selection, and model interpretation. Thus, the data science program includes design thinking to ensure meeting user demands, generating more usable machine learning tools, and developing ways of framing computational thinking. Here, we describe the fundamentals of design thinking and teaching modules for data science programs.

Keywords: data science, design thinking, AI, curriculum, transformation

Procedia PDF Downloads 81
2683 Abilitest Battery: Presentation of Tests and Psychometric Properties

Authors: Sylwia Sumińska, Łukasz Kapica, Grzegorz Szczepański

Abstract:

Introduction: Cognitive skills are a crucial part of everyday functioning. They include perception, attention, language, memory, executive functions, and higher cognitive skills. With the aging of societies, a growing percentage of people experience a decline in cognitive skills, and cognitive skills affect work performance. An appropriate diagnosis of a worker's cognitive skills reduces the risk of errors and accidents at work, which is especially important for senior workers. The study aims to prepare new cognitive tests for adults aged 20-60 and to assess the psychometric properties of the tests. The project responds to the need for reliable and accurate methods of assessing cognitive performance. Computer tests were developed to assess psychomotor performance, attention, and working memory. Method: Two hundred eighty people aged 20-60 will participate in the study, in 4 age groups. Inclusion criteria were: no subjective cognitive impairment and no history of severe head injuries, chronic diseases, or psychiatric and neurological diseases. The research will be conducted from February to June 2022. Cognitive tests: 1) measurement of psychomotor performance: reaction time, reaction time with a selective attention component; 2) measurement of sustained attention: visual search (dots), visual search (numbers); 3) measurement of working memory: remembering words, remembering letters. To assess validity and reliability, subjects will perform the Vienna Test System, i.e., "Reaction Test" (reaction time), "Signal Detection" (sustained attention), and "Corsi Block-Tapping Test" (working memory), as well as the Perception and Attention Test (TUS), the Colour Trails Test (CTT), and Digit Span, a subtest from the Wechsler Adult Intelligence Scale. Eighty people will be invited to a second session after three months to assess consistency over time. Results: As the research is ongoing, detailed results from the 280 participants will be shown at the conference, separately for each age group. The results of the correlation analysis with the Vienna Test System will be demonstrated as well.

Keywords: aging, attention, cognitive skills, cognitive tests, psychomotor performance, working memory

Procedia PDF Downloads 105
2682 Machine Learning in Agriculture: A Brief Review

Authors: Aishi Kundu, Elhan Raza

Abstract:

"Necessity is the mother of invention" - Rapid increase in the global human population has directed the agricultural domain toward machine learning. The basic need of human beings is considered to be food which can be satisfied through farming. Farming is one of the major revenue generators for the Indian economy. Agriculture is not only considered a source of employment but also fulfils humans’ basic needs. So, agriculture is considered to be the source of employment and a pillar of the economy in developing countries like India. This paper provides a brief review of the progress made in implementing Machine Learning in the agricultural sector. Accurate predictions are necessary at the right time to boost production and to aid the timely and systematic distribution of agricultural commodities to make their availability in the market faster and more effective. This paper includes a thorough analysis of various machine learning algorithms applied in different aspects of agriculture (crop management, soil management, water management, yield tracking, livestock management, etc.).Due to climate changes, crop production is affected. Machine learning can analyse the changing patterns and come up with a suitable approach to minimize loss and maximize yield. Machine Learning algorithms/ models (regression, support vector machines, bayesian models, artificial neural networks, decision trees, etc.) are used in smart agriculture to analyze and predict specific outcomes which can be vital in increasing the productivity of the Agricultural Food Industry. It is to demonstrate vividly agricultural works under machine learning to sensor data. Machine Learning is the ongoing technology benefitting farmers to improve gains in agriculture and minimize losses. This paper discusses how the irrigation and farming management systems evolve in real-time efficiently. Artificial Intelligence (AI) enabled programs to emerge with rich apprehension for the support of farmers with an immense examination of data.

Keywords: machine learning, artificial intelligence, crop management, precision farming, smart farming, pre-harvesting, harvesting, post-harvesting

Procedia PDF Downloads 105
2681 Evolutional Substitution Cipher on Chaotic Attractor

Authors: Adda Ali-Pacha, Naima Hadj-Said

Abstract:

Nowadays, information security is founded primarily on cryptographic algorithms whose confidentiality depends on the number of bits needed to define a cryptographic key. In this work, we introduce a new chaotic cryptosystem that we call an evolutional substitution cipher on a chaotic attractor; in this research paper, we take the Henon attractor. The evolutional substitution cipher on the Henon attractor is based on the principle of the monoalphabetic cipher, and it associates the plaintext with a succession of real numbers calculated from the attractor equations.
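
A minimal sketch of a substitution cipher keyed by the Henon attractor, assuming the standard map x' = 1 - ax² + y, y' = bx with a = 1.4, b = 0.3; the quantization step and the authors' exact scheme are assumptions:

```python
# Substitution cipher whose per-position substitution evolves with the
# Henon attractor orbit; parameters and quantization are assumptions.
def henon_stream(n, x=0.1, y=0.3, a=1.4, b=0.3):
    out = []
    for _ in range(n):
        x, y = 1.0 - a * x * x + y, b * x   # Henon map iteration
        out.append(x)
    return out

def keystream(n):
    # Quantize each real attractor value into a substitution shift 0..255.
    return [int(abs(v) * 1e6) % 256 for v in henon_stream(n)]

def encrypt(plaintext: bytes) -> bytes:
    ks = keystream(len(plaintext))
    return bytes((p + k) % 256 for p, k in zip(plaintext, ks))

def decrypt(ciphertext: bytes) -> bytes:
    ks = keystream(len(ciphertext))
    return bytes((c - k) % 256 for c, k in zip(ciphertext, ks))

msg = b"attack at dawn"
assert decrypt(encrypt(msg)) == msg
```

Because decryption regenerates the same keystream from the same initial conditions, the attractor's initial state (x, y) plays the role of the secret key.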

Keywords: cryptography, substitution cipher, chaos theory, Henon attractor, evolutional substitution cipher

Procedia PDF Downloads 430
2680 Principal Component Analysis Combined Machine Learning Techniques on Pharmaceutical Samples by Laser Induced Breakdown Spectroscopy

Authors: Kemal Efe Eseller, Göktuğ Yazici

Abstract:

Laser-induced breakdown spectroscopy (LIBS) is a rapid optical atomic emission spectroscopy technique used for material identification and analysis, with the advantages of in-situ analysis, elimination of intensive sample preparation, and micro-destructive testing of the material. LIBS delivers short laser pulses onto the material to create a plasma by exciting the material beyond a certain threshold. The plasma characteristics, consisting of wavelength values and intensity amplitudes, depend on the material and the experimental environment. In the present work, the spectrum profiles of medicine samples were obtained via LIBS. The datasets include two different concentrations for each of two paracetamol-based medicines, namely Aferin and Parafon. The spectral data were preprocessed by filling outliers based on quartiles, smoothing the spectra to eliminate noise, and normalizing both the wavelength and intensity axes. Statistical information was obtained, and principal component analysis (PCA) was applied to both the preprocessed and raw datasets. The machine learning models were built on two different train-test splits: 70% training with 30% test, and 80% training with 20% test. Cross-validation was preferred to protect the models against overfitting, since the sample count is small. The machine learning results on the preprocessed and raw datasets were compared for both splits. This is the first time that all supervised machine learning classification algorithms, consisting of Decision Trees, Discriminant analysis, naïve Bayes, Support Vector Machines (SVM), k-NN (k-Nearest Neighbor), Ensemble Learning, and Neural Network algorithms, have been applied to LIBS data of paracetamol-based pharmaceutical samples at different concentrations, on preprocessed and raw datasets, in order to observe the effect of preprocessing.
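
A minimal sketch of the preprocessing, PCA, and classifier-comparison chain on synthetic stand-in spectra; the spectrum sizes, class labels, and the subset of classifiers shown are illustrative assumptions:

```python
# Smooth spectra, reduce with PCA, and compare classifiers via 5-fold CV;
# random arrays stand in for the real LIBS spectra.
import numpy as np
from scipy.signal import savgol_filter
from sklearn.decomposition import PCA
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(1)
X = rng.normal(size=(80, 500))      # 80 spectra, 500 wavelength channels
y = rng.integers(0, 4, size=80)     # 2 medicines x 2 concentrations

X = savgol_filter(X, window_length=11, polyorder=3, axis=1)  # noise smoothing

models = {"SVM": SVC(), "kNN": KNeighborsClassifier(),
          "Tree": DecisionTreeClassifier(random_state=0)}
for name, clf in models.items():
    pipe = make_pipeline(StandardScaler(), PCA(n_components=10), clf)
    print(name, cross_val_score(pipe, X, y, cv=5).mean())  # 5-fold CV accuracy
```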

Keywords: machine learning, laser-induced breakdown spectroscopy, medicines, principal component analysis, preprocessing

Procedia PDF Downloads 87
2679 Use of Thrombolytics for Acute Myocardial Infarctions in Resource-Limited Settings, Globally: A Systematic Literature Review

Authors: Sara Zelman, Courtney Meyer, Hiren Patel, Lisa Philpotts, Sue Lahey, Thomas Burke

Abstract:

Background: As the global burden of disease shifts from infectious diseases to noncommunicable diseases, there is growing urgency to provide treatment for time-sensitive illnesses, such as ST-Elevation Myocardial Infarctions (STEMIs). The standard of care for STEMIs in developed countries is Percutaneous Coronary Intervention (PCI). However, this is inaccessible in resource-limited settings. Before the advent of PCI, Streptokinase (STK) and other thrombolytic drugs were first-line treatments for STEMIs. STK has been recognized as a cost-effective and safe treatment for STEMIs; however, in settings which lack access to PCI, it has not become the established second-line therapy. A systematic literature review was conducted to geographically map the use of STK for STEMIs in resource-limited settings. Methods: Our literature review group searched the databases CINAHL, Embase, Ovid, PubMed, Web of Science, and WHO's Index Medicus. The search terms included 'thrombolytics' AND 'myocardial infarction' AND 'resource-limited' and were restricted to human studies and papers written in English. A considerable number of studies came from Latin America; however, these studies were not written in English and were excluded. The initial search yielded 3,487 articles, which was reduced to 3,196 papers after titles were screened. Three medical professionals then screened abstracts, from which 291 articles were selected for full-text review and 94 papers were chosen for final inclusion. These articles were then analyzed and mapped geographically. Results: This systematic literature review revealed that STK has been used for the treatment of STEMIs in 33 resource-limited countries, with 18 of the 94 studies taking place in India. Furthermore, 13 studies occurred in Pakistan, followed by Iran (6), Sri Lanka (5), Brazil (4), China (4), and South Africa (4). Conclusion: Our systematic review revealed that STK has been used for the treatment of STEMIs in 33 resource-limited countries, with the highest utilization occurring in India. This demonstrates that even though STK has high utility for STEMI treatment in resource-limited settings, it still has not become the standard of care. Future research should investigate the barriers preventing the establishment of STK use as the second-line treatment after PCI.

Keywords: cardiovascular disease, global health, resource-limited setting, ST-Elevation Myocardial Infarction, Streptokinase

Procedia PDF Downloads 146
2678 Sequence Analysis and Molecular Cloning of PROTEOLYSIS 6 in Tomato

Authors: Nurulhikma Md Isa, Intan Elya Suka, Nur Farhana Roslan, Chew Bee Lynn

Abstract:

The evolutionarily conserved N-end rule pathway marks proteins for degradation by the Ubiquitin Proteasome System (UPS) based on the nature of their N-terminal residue. Proteins with a destabilizing N-terminal residue undergo a series of condition-dependent N-terminal modifications, resulting in their ubiquitination and degradation. Intensive research has previously been carried out in Arabidopsis. The group VII Ethylene Response Factor (ERF) transcription factors are the first N-end rule pathway substrates found in Arabidopsis, with a role in regulating oxygen sensing. ERFs also function as central hubs for the perception of gaseous signals in plants and control different aspects of plant development, including germination, stomatal aperture, hypocotyl elongation, and stress responses. However, nothing is known about the role of this pathway in fruit development and ripening. The model plant Arabidopsis cannot represent a fleshy-fruit model system; therefore, tomato is the best model plant for this study. PROTEOLYSIS6 (PRT6) is an E3 ubiquitin ligase of the N-end rule pathway. Two homologs of the PRT6 sequence were identified in the tomato genome database using the PRT6 protein sequence from the model plant Arabidopsis thaliana. A homology search against the Ensembl Plants database (tomato) showed Solyc09g010830.2 as the best hit, with the highest score of 1143, an e-value of 0.0, and 61.3% identity, compared to the second hit, Solyc10g084760.1. A further homology search was performed using the NCBI BLAST database to validate the data. The best hit was XP_010325853.1, an uncharacterized protein LOC101255129 (Solanum lycopersicum), with the highest score of 1601, an e-value of 0.0, and 48% identity. Both Solyc09g010830.2 and the uncharacterized protein LOC101255129 are located on chromosome 9. Further validation was carried out by running BLASTP between the two sequences (Solyc09g010830.2 and the uncharacterized protein LOC101255129) to investigate whether they are the same protein representing PRT6 in tomato. The two sequences showed 100% identity, indicating that they are the same gene, representing PRT6 in tomato. In addition, we used two different RNAi constructs, driven by the 35S and Polygalacturonase (PG) promoters, to study the function of PRT6 during tomato developmental stages and ripening.

Keywords: ERFs, PRT6, tomato, ubiquitin

Procedia PDF Downloads 240
2677 Financial Ethics: A Review of 2010 Flash Crash

Authors: Omer Farooq, Salman Ahmed Khan, Sadaf Khalid

Abstract:

Modern-day stock markets have become almost entirely automated. Even though this means increased profits for investors, with algorithms acting upon the slightest price change within microseconds, it has also given birth to many ethical dilemmas, in the sense that the slightest mistake can cause people to lose their entire livelihoods. This paper reviews one such event, which happened on May 06, 2010, in which $1 trillion disappeared from the Dow Jones Industrial Average. We discuss its various aspects and the ethical dilemmas that have arisen from it.

Keywords: flash crash, market crash, stock market, stock market crash

Procedia PDF Downloads 520
2676 FracXpert: Ensemble Machine Learning Approach for Localization and Classification of Bone Fractures in Cricket Athletes

Authors: Madushani Rodrigo, Banuka Athuraliya

Abstract:

In today's world of medical diagnosis and prediction, machine learning stands out as a strong tool, transforming old ways of caring for health. This study analyzes the use of machine learning in the specialized domain of sports medicine, with a focus on the timely and accurate detection of bone fractures in cricket athletes. Failure to identify bone fractures in real time can result in malunion or non-union conditions. To ensure proper treatment and enhance the bone healing process, fracture locations and types must be accurately identified. Interpreting X-ray images relies on the expertise and experience of medical professionals, and radiographic images are sometimes of low quality, leading to potential misidentifications. Therefore, a proper approach is needed to accurately localize and classify fractures in real time. The research revealed that the optimal approach must employ appropriate radiographic image processing techniques and object detection algorithms that effectively localize and accurately classify all types of fractures with high precision and in a timely manner. To overcome the challenge of misidentifying fractures, a distinct model for fracture localization and classification has been implemented. The research also incorporates radiographic image enhancement and preprocessing techniques to overcome the limitations posed by low-quality images. A classification ensemble model has been implemented using ResNet18 and VGG16. In parallel, a fracture segmentation model has been implemented using an enhanced U-Net architecture. Combining the results of these two models, the FracXpert system can localize exact fracture locations along with fracture types among the 12 available fracture patterns: avulsion, comminuted, compressed, dislocation, greenstick, hairline, impacted, intra-articular, longitudinal, oblique, pathological, and spiral. The system also generates a confidence score indicating the degree of confidence in the predicted result. The fracture segmentation model, based on the U-Net architecture, achieved a high accuracy of 99.94%, demonstrating its precision in identifying fracture locations, while the classification ensemble model built on ResNet18 and VGG16 achieved an accuracy of 81.0%, showcasing its ability to categorize the various fracture patterns that are instrumental in the fracture treatment process. In conclusion, FracXpert is a promising ML application in sports medicine, demonstrating the potential to revolutionize fracture detection. By leveraging the power of ML algorithms, this study contributes to the advancement of diagnostic capabilities in cricket athlete healthcare, ensuring timely and accurate identification of bone fractures for the best treatment outcomes.
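
A minimal sketch of a two-backbone classification ensemble in the spirit of the paper, averaging ResNet18 and VGG16 softmax outputs and reporting a confidence score; the untrained weights, input size, and wiring are stand-ins, not the authors' trained models:

```python
# Ensemble of ResNet18 and VGG16 heads over 12 fracture classes;
# softmax outputs are averaged and the max probability serves as the
# confidence score. Weights here are random placeholders.
import torch
import torch.nn.functional as F
from torchvision import models

n_classes = 12
resnet = models.resnet18(weights=None)
resnet.fc = torch.nn.Linear(resnet.fc.in_features, n_classes)
vgg = models.vgg16(weights=None)
vgg.classifier[6] = torch.nn.Linear(vgg.classifier[6].in_features, n_classes)

@torch.no_grad()
def ensemble_predict(x):
    resnet.eval(); vgg.eval()
    p = (F.softmax(resnet(x), dim=1) + F.softmax(vgg(x), dim=1)) / 2
    conf, label = p.max(dim=1)        # confidence score and predicted class
    return label, conf

x = torch.randn(1, 3, 224, 224)       # one radiograph-sized dummy input
print(ensemble_predict(x))
```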

Keywords: multiclass classification, object detection, ResNet18, U-Net, VGG16

Procedia PDF Downloads 121
2675 Development of Numerical Method for Mass Transfer across the Moving Membrane with Selective Permeability: Approximation of the Membrane Shape by Level Set Method for Numerical Integral

Authors: Suguru Miyauchi, Toshiyuki Hayase

Abstract:

Biological membranes have selective permeability, and capsules or cells enclosed by a membrane deform under osmotic flow. This mass transport phenomenon is observed everywhere in a living body. To understand mass transfer in a body, it is necessary to consider the mass transfer phenomenon across the membrane as well as the deformation of the membrane by a flow. To our knowledge, no numerical method for mass transfer across a moving membrane has been established, due to the difficulty of treating the mass flux permeating through a moving membrane with selective permeability. Existing methods for mass transfer across a membrane use an approximate delta function to communicate quantities on the interface; these methods can reproduce permeation of the solute but cannot reproduce non-permeation, and their computational accuracy decreases as the permeability coefficient of the membrane decreases. This study aims to develop a numerical method capable of treating three-dimensional problems of mass transfer across a moving flexible membrane. One of the authors previously developed a highly accurate numerical method based on the finite element method. This method captures the discontinuity on the membrane sharply by considering the jumps in concentration and concentration gradient in the finite element discretization. The formulation takes the membrane movement into account, and both permeable and non-permeable membranes can be treated. However, searching for the cross points of the membrane and fluid element boundaries, and splitting the fluid elements into sub-elements, are needed for the numerical integration; therefore, cumbersome operations are required for a three-dimensional problem. In this paper, we propose an improved method that avoids the search and split operations, and we confirm its effectiveness. The membrane shape is treated implicitly by introducing a level set function: the membrane shape within one fluid element is expressed by the shape functions of the finite element method. Numerical experiments showed that third-order shape functions reproduce the membrane shapes appropriately. The same level of accuracy as the previous method using search and split operations was achieved by using a sufficient number of sampling points for the numerical integration. The effectiveness of the method was confirmed by solving several model problems.
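
A hedged sketch of the implicit representation described above: inside each fluid element the level set function is interpolated by finite element shape functions, and the membrane is recovered as its zero contour (treating the nodal values φᵢ as signed distances is our assumption):

```latex
% Level set interpolated by FE shape functions; membrane = zero contour.
\phi_h(\mathbf{x}) \;=\; \sum_{i} N_i(\mathbf{x})\,\phi_i ,
\qquad
\Gamma_m \;=\; \{\, \mathbf{x} \mid \phi_h(\mathbf{x}) = 0 \,\}
```

with third-order shape functions N_i, per the reported numerical experiments.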

Keywords: finite element method, level set method, mass transfer, membrane permeability

Procedia PDF Downloads 250
2674 Performance Evaluation of Packet Scheduling with Channel Conditioning Aware Based on Wimax Networks

Authors: Elmabruk Laias, Abdalla M. Hanashi, Mohammed Alnas

Abstract:

Scheduling in Worldwide Interoperability for Microwave Access (WiMAX) networks became one of the most challenging issues, since it is responsible for distributing available network resources among all users; this has led to the demand for constructing and designing highly efficient scheduling algorithms in order to improve network utilization, increase network throughput, and minimize end-to-end delay. In this study, the proposed algorithm focuses on an efficient mechanism to serve non-real-time traffic in congested networks by considering channel status.

Keywords: WiMAX, quality of service (QoS), OPNET, Diff-Serv (DS)

Procedia PDF Downloads 286
2673 Repository Blockchain for Collaborative Blockchain Ecosystem

Authors: Razwan Ahmed Tanvir, Greg Speegle

Abstract:

Collaborative blockchain ecosystems allow diverse groups to cooperate on tasks while providing properties such as decentralization and transaction security. We provide a model that uses a repository blockchain to manage hard forks within a collaborative system, such that a single process (assuming it has knowledge of the requirements of each fork) can access all of the blocks within the system. The repository blockchain replaces the need for Inter-Blockchain Communication (IBC) within the ecosystem by navigating the networks. The resulting construction resembles a tree instead of a chain. A proof-of-concept implementation performs a depth-first search on the new structure.
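
A minimal sketch of the tree-shaped structure and the depth-first traversal mentioned above; the Block fields are illustrative assumptions, not the paper's data model:

```python
# Hard forks turn the chain into a tree; DFS lets a single process visit
# every block across all forks.
from dataclasses import dataclass, field

@dataclass
class Block:
    block_id: str
    payload: dict = field(default_factory=dict)
    children: list = field(default_factory=list)  # forks branch here

def dfs(block, visit):
    visit(block)                       # pre-order: parent before its forks
    for child in block.children:
        dfs(child, visit)

genesis = Block("genesis")
fork_a, fork_b = Block("fork-a"), Block("fork-b")
genesis.children = [fork_a, fork_b]
fork_a.children = [Block("a-1")]

dfs(genesis, lambda b: print(b.block_id))  # genesis, fork-a, a-1, fork-b
```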

Keywords: hard fork, shared governance, inter blockchain communication, blockchain ecosystem

Procedia PDF Downloads 17
2672 Study and Analysis of the Factors Affecting Road Safety Using Decision Tree Algorithms

Authors: Naina Mahajan, Bikram Pal Kaur

Abstract:

The purpose of traffic accident analysis is to find the possible causes of an accident. Road accidents cannot be totally prevented, but with suitable traffic engineering and management the accident rate can be reduced to a certain extent. This paper discusses the classification techniques C4.5 and ID3 using the WEKA data mining tool, applied to the NH (National Highway) dataset. The C4.5 and ID3 techniques give the best results, with high accuracy, low computation time, and a low error rate.
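
A minimal sketch of an entropy-based decision tree (the ID3/C4.5 family) evaluated with 10-fold cross-validation on stand-in data; the paper itself uses WEKA's implementations on the NH dataset, so the features here are synthetic placeholders:

```python
# Entropy-criterion decision tree (ID3/C4.5-style splits) with 10-fold CV;
# make_classification stands in for the accident records.
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=300, n_features=8, random_state=0)
tree = DecisionTreeClassifier(criterion="entropy", random_state=0)
print(cross_val_score(tree, X, y, cv=10).mean())  # accuracy estimate
```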

Keywords: C4.5, ID3, NH (National Highway), WEKA data mining tool

Procedia PDF Downloads 338
2671 Performance in Police Organizations: Approaches from the Literature Review

Authors: Felipe Haleyson Ribeiro dos Santos, Edson Ronaldo Guarido Filho

Abstract:

This article aims to review the literature on performance in police organizations. For that, the InOrdinatio method was adopted, which defines the form of selection and classification of articles. The search was carried out in databases and resulted in a total of 619 documents, which were cataloged and classified with the support of the Mendeley software. The theoretical scope intended here is to identify how performance in police organizations has been studied. After deepening the analysis and focusing on management, it was possible to classify the articles into three levels: individual, organizational, and institutional. However, to the best of our knowledge, no studies were found that addressed the performance relationship between the levels, which can be seen as a suggestion for further research.

Keywords: police management, performance, management, multi-level

Procedia PDF Downloads 109
2670 MIMO Radar-Based System for Structural Health Monitoring and Geophysical Applications

Authors: Davide D’Aria, Paolo Falcone, Luigi Maggi, Aldo Cero, Giovanni Amoroso

Abstract:

The paper presents a methodology for real-time structural health monitoring and geophysical applications. The key elements of the system are a high-performance MIMO radar sensor, an optical camera, and a dedicated set of software algorithms encompassing interferometry, tomography, and photogrammetry. The MIMO radar sensor proposed in this work provides extremely high sensitivity to displacements, making the system able to react to tiny deformations (down to tens of microns) on time scales spanning from milliseconds to hours. The MIMO feature makes the system capable of providing a set of two-dimensional images of the observed scene, each mapped on the azimuth-range directions with notable resolution in both dimensions and an outstanding repetition rate. The back-scattered energy, which is distributed in 3D space, is projected onto a 2D plane, where each pixel has as coordinates the line-of-sight distance and the cross-range azimuthal angle. At the same time, the high-performing processing unit senses the observed scene with remarkably short refresh periods (down to milliseconds), thus opening the way for combined static and dynamic structural health monitoring. Thanks to the smart TX/RX antenna array layout, the MIMO data can be processed through a tomographic approach to reconstruct the three-dimensional map of the observed scene. This 3D point cloud is then accurately mapped onto a 2D digital optical image through photogrammetric techniques, allowing easy and straightforward interpretation of the measurements. Once the three-dimensional image is reconstructed, a 'repeat-pass' interferometric approach is exploited to provide the user of the system with high-frequency three-dimensional motion/vibration estimates for each point of the reconstructed image. At this stage, the methodology leverages consolidated atmospheric correction algorithms to provide reliable displacement and vibration measurements.
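
For reference, the textbook repeat-pass relation linking the measured interferometric phase difference to line-of-sight displacement (the paper does not spell out its formula, so this is the standard form, with λ the radar wavelength):

```latex
% Line-of-sight displacement from interferometric phase (repeat-pass).
\Delta r = -\frac{\lambda}{4\pi}\,\Delta\varphi
```

This is why sub-wavelength phase sensitivity translates into micron-scale displacement sensitivity at microwave wavelengths.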

Keywords: interferometry, MIMO RADAR, SAR, tomography

Procedia PDF Downloads 195
2669 Using Corpora in Semantic Studies of English Adjectives

Authors: Oxana Lukoshus

Abstract:

The methods of corpus linguistics, a well-established field of research, are being increasingly applied in cognitive linguistics. Corpus data are especially useful for quantitative studies of grammatical and other aspects of language. The main objective of this paper is to demonstrate how present-day corpora can be applied in semantic studies in general and in semantic studies of adjectives in particular. Polysemantic adjectives have been the subject of numerous studies, but most of them have been carried out on dictionaries. Undoubtedly, dictionaries are viewed as one of the basic data sources, but only at the initial steps of a research project: the author usually starts with an analysis of the lexicographic data, after which s/he comes up with a hypothesis. In the research conducted, three polysemantic synonyms, true, loyal, and faithful, were analyzed in terms of the differences and similarities in their semantic structure. A corpus-based approach to the study of these adjectives involves the following. After the analysis of the dictionary data, the following corpora were consulted to study the distributional patterns of the words under study: the British National Corpus (BNC) and the Corpus of Contemporary American English (COCA). These corpora are continually updated and contain thousands of examples of the words under research, which makes them a useful and convenient data source. For the purposes of this study, there were no special requirements regarding genre, mode, or time of the texts included in the corpora. Out of the range of possibilities offered by corpus-analysis software (e.g., word lists, statistics of word frequencies, etc.), the most useful tool for the semantic analysis was extracting a list of co-occurrences for the given search words. Searching by lemmas, e.g. true, true to, and grouping the results by lemmas proved to be the most efficient corpus features for the adjectives under study. Following the search process, the corpora provided a list of co-occurrences, which were then analyzed and classified. Not every co-occurrence was relevant for the analysis. For example, phrases like An enormous sense of responsibility to protect the minds and hearts of the faithful from incursions by the state was perceived to be the basic duty of the church leaders or ‘True,’ said Phoebe, ‘but I'd probably get to be a Union Official immediately were left out, as in the first example the faithful is a substantivized adjective and in the second example true is used alone, with no other parts of speech. The subsequent analysis of the corpus data provided the grounds for the distribution groups of the adjectives under study, which were then investigated with the help of a semantic experiment. To sum up, the corpus-based approach has proved to be a powerful, reliable and convenient tool for obtaining data for further semantic study.

Keywords: corpora, corpus-based approach, polysemantic adjectives, semantic studies

Procedia PDF Downloads 314
2668 Learning Difficulties of Children with Disabilities

Authors: Chalise Kiran

Abstract:

The learning difficulties of children with disabilities are always a matter of concern when we talk about the educational needs and quality of education of children with disabilities. This paper is the outcome of a review of the literature on the educational needs and learning difficulties of children with disabilities. For the paper, studies on children with disabilities and their education were collected through search engines. The assembled literature was analyzed from the angle of the learning difficulties faced by children with disabilities, and the same was used as a precursor to arrive at findings on the children's learning. The analysis showed that children with disabilities face learning difficulties. The reasons for these difficulties can be attributed to factors related to authority, structure, the school environment, the behaviors of teachers and parents, and society as a whole.

Keywords: children with disabilities, learning difficulties, education, disabled children

Procedia PDF Downloads 114
2667 Sponsorship Strategy, Its Visibility, and Return: A Case Study on Brazilian Olympic Games

Authors: Elizabeth F. Rodrigues, Julia da R. Mattos, Naira Q. Leitão, Roberta T. da Cunha

Abstract:

The business strategies of many companies have two factors in common: the search for a competitive edge and its long-term maintenance. What differentiates companies' performance is their ability to set the right strategy, which depends on their capacity to analyze and apply all sorts of management support tools. In this context, the sponsorship of events stands out as an important way to increase brand awareness, especially when it is a worldwide event, such as the Rio 2016 Olympic and Paralympic Games. This paper presents the case of a carmaker that chose to invest in sponsorship as a way to reach its goals and grow in the Brazilian market.

Keywords: strategy, sponsorship, events, management

Procedia PDF Downloads 497
2666 Unsupervised Part-of-Speech Tagging for Amharic Using K-Means Clustering

Authors: Zelalem Fantahun

Abstract:

Part-of-speech tagging is the process of assigning a part-of-speech or other lexical class marker to each word in naturally occurring text. It is one of the most fundamental and basic tasks in almost all natural language processing. In natural language processing, the need to provide large amounts of manually annotated data is a knowledge acquisition bottleneck. Since Amharic is an under-resourced language, the availability of tagged corpora is the bottleneck for natural language processing, especially for POS tagging. A promising direction to tackle this problem is to provide a system that does not require manually tagged data. In unsupervised learning, the learner is not provided with classifications; unsupervised algorithms seek out similarity between pieces of data in order to determine whether they can be characterized as forming a group. This paper describes the development of an unsupervised part-of-speech tagger for Amharic using K-Means clustering, since a large amount of untagged text is produced in day-to-day activities. In the development of the tagger, the following procedures were followed. First, the unlabeled data (raw text) is divided into 10 folds, and tokenization takes place; at this level, the raw text is chunked at sentence level and then into words. The second phase is feature extraction, which includes word frequency and the syntactic and morphological features of a word. The third phase is clustering; among different clustering algorithms, K-Means was selected and implemented in this study, bringing groups of similar words together. The fourth phase is mapping, which deals with looking at each cluster carefully and assigning the most common tag to the group. This study identifies two features capable of distinguishing one part-of-speech from others, namely morphological features and positional information, and shows that it is possible to use unsupervised learning for Amharic POS tagging. To further increase the performance of the unsupervised part-of-speech tagger, features not included in this study, such as semantic information, need to be incorporated. Finally, based on the experimental results, the system achieves a maximum of 81% accuracy.
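
A minimal sketch of the clustering stage on a toy English corpus: word frequency, a crude suffix (morphological) feature, and positional information feed K-Means, and each resulting cluster would later be mapped to its most common tag; all features and sizes here are illustrative assumptions:

```python
# Cluster words by frequency, position-in-sentence, and a suffix feature;
# the mapping phase would then assign each cluster its most common tag.
from collections import Counter

import numpy as np
from sklearn.cluster import KMeans

sentences = [["the", "dog", "runs"], ["a", "cat", "sleeps"],
             ["the", "cat", "runs"]]
vocab = sorted({w for s in sentences for w in s})
freq = Counter(w for s in sentences for w in s)

def features(word):
    positions = [s.index(word) / len(s) for s in sentences if word in s]
    return [freq[word],                          # word frequency
            float(np.mean(positions)),           # positional information
            1.0 if word.endswith("s") else 0.0]  # suffix (morphology)

X = np.array([features(w) for w in vocab])
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)
for cluster in range(3):
    print(cluster, [w for w, l in zip(vocab, labels) if l == cluster])
```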

Keywords: POS tagging, Amharic, unsupervised learning, k-means

Procedia PDF Downloads 451
2665 A Novel Machine Learning Approach to Aid Agrammatism in Non-fluent Aphasia

Authors: Rohan Bhasin

Abstract:

Agrammatism in non-fluent aphasia cases can be defined as a language disorder wherein a patient can use only content words (nouns, verbs, and adjectives) for communication, and their speech is devoid of functional word types like conjunctions and articles, generating speech with extremely rudimentary grammar. Past approaches involve speech therapy of some order, with conversation analysis used to analyse pre-therapy speech patterns and qualitative changes in conversational behaviour after therapy. We describe a novel method to generate functional words (prepositions, articles) around content words (nouns, verbs, and adjectives) using a combination of natural language processing and deep learning algorithms. The approach the paper investigates is an LSTM or sequence-to-sequence (seq2seq) model, which takes in a sequence of inputs and outputs a sequence. This approach needs a significant amount of training data, with each training example being a pair such as (content words, complete sentence). We generate such data by starting with complete sentences from a text source and removing functional words to obtain just the content words. However, the approach requires a lot of training data to produce coherent output. The assumption of this approach is that the content words received in the input are preserved, i.e., they will not be altered after the functional grammar is slotted in. This is a potential limitation in cases of severe agrammatism, where such an order might not be inherently correct. The approach can be used to assist communication in cases of mild agrammatism in non-fluent aphasia. By generating these function words around the content words, we can provide meaningful sentence options to the patient for articulate conversations. Our project thus translates the use case of generating sentences from content-specific words into an assistive technology for non-fluent aphasia patients.
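
A minimal sketch of the training-pair generation step described above: function words are stripped from complete sentences to produce (content words, complete sentence) pairs; the stop-word list is a stand-in for a proper function-word lexicon:

```python
# Build (content words -> full sentence) pairs for a seq2seq model by
# removing function words from complete sentences.
FUNCTION_WORDS = {"the", "a", "an", "and", "but", "or", "to", "of",
                  "in", "on", "at", "is", "are", "was", "were"}

def make_pair(sentence: str):
    tokens = sentence.lower().split()
    content = [t for t in tokens if t not in FUNCTION_WORDS]
    return " ".join(content), sentence      # (model input, model target)

corpus = ["The dog is running in the park",
          "She went to the market and bought apples"]
for src, tgt in (make_pair(s) for s in corpus):
    print(f"{src!r} -> {tgt!r}")
```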

Keywords: aphasia, expressive aphasia, assistive algorithms, neurology, machine learning, natural language processing, language disorder, behaviour disorder, sequence to sequence, LSTM

Procedia PDF Downloads 164
2664 Price Prediction Line, Investment Signals and Limit Conditions Applied for the German Financial Market

Authors: Cristian Păuna

Abstract:

In the first decades of the 21st century, in the electronic trading environment, algorithmic capital investment became the primary tool to make a profit by speculation in financial markets. A significant number of traders and private or institutional investors participate in the capital markets every day using automated algorithms. Autonomous trading software is today a considerable part of the business intelligence system of any modern financial activity. Trading decisions and orders are made automatically by computers using different mathematical models. This paper presents one of these models, called the Price Prediction Line. A mathematical algorithm is revealed to build a reliable trend line, which is the base for limit conditions and automated investment signals and the core of a computerized investment system. The paper shows how to apply these tools to generate entry and exit investment signals and limit conditions to build a mathematical filter for investment opportunities, and it presents the methodology to integrate all of these into automated investment software. The paper also presents trading results obtained for the leading German financial market index with the presented methods, in order to analyze and compare different automated investment algorithms. It was found that a specific mathematical algorithm can be optimized and integrated into an automated trading system with good and sustained results for the leading German market. Investment results are compared in order to qualify the presented model. In conclusion, a 1:6.12 risk-to-reward ratio was obtained by applying the trigonometric method to the DAX Deutscher Aktienindex over a 24-month investment period. These results are superior to those obtained with other similar models, as this paper reveals. The general idea sustained by this paper is that the Price Prediction Line model presented is a reliable capital investment methodology that can be successfully applied to build an automated investment system with excellent results.
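
A minimal sketch of a trend line combined with limit conditions: a least-squares line over recent closes stands in for the paper's Price Prediction Line, and the entry/exit thresholds are illustrative assumptions, not the published algorithm:

```python
# Fit a trend line over recent closes, project the next price, and gate
# signals behind a limit condition that filters small deviations.
import numpy as np

def trend_line(closes, window=20):
    y = closes[-window:]
    t = np.arange(window)
    slope, intercept = np.polyfit(t, y, 1)      # least-squares line
    return slope, intercept + slope * window    # slope, next predicted price

def signal(closes, limit=0.01):
    slope, predicted = trend_line(closes)
    last = closes[-1]
    deviation = (predicted - last) / last
    if slope > 0 and deviation > limit:         # limit condition filters noise
        return "BUY"
    if slope < 0 and deviation < -limit:
        return "SELL"
    return "HOLD"

rng = np.random.default_rng(7)
closes = np.cumsum(rng.normal(0.1, 1.0, 200)) + 100.0  # synthetic index series
print(signal(closes))
```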

Keywords: algorithmic trading, automated trading systems, high-frequency trading, DAX Deutscher Aktienindex

Procedia PDF Downloads 130