Search results for: data mining techniques
25202 Advances of Image Processing in Precision Agriculture: Using Deep Learning Convolution Neural Network for Soil Nutrient Classification
Authors: Halimatu S. Abdullahi, Ray E. Sheriff, Fatima Mahieddine
Abstract:
Agriculture is essential to the continued existence of human life, as humans depend on it directly for food production. The exponential rise in population calls for a rapid increase in food production, with technology applied to reduce laborious work and maximize output. Technology can aid and improve agriculture in several ways, from pre-planning through post-harvest, by using computer vision and image processing to determine soil nutrient composition and to apply the right amount of farm inputs (fertilizers, herbicides, water) at the right time and place, as well as to detect weeds and provide early detection of pests and diseases. This is precision agriculture, which is thought to be the solution required to achieve these goals. There has been significant improvement in image processing and data processing, which had been a major challenge. A database of images is collected through remote sensing and analyzed, and a model is developed to determine the right treatment plans for different crop types and different regions. Features of vegetation images need to be extracted, classified, segmented, and finally fed into the model. Different techniques have been applied to these processes, from neural networks, support vector machines, and fuzzy logic approaches to, most recently, the deep learning approach of convolutional neural networks, which has generated excellent results for image classification. A deep convolutional neural network is used to determine the soil nutrients required in a plantation for maximum production. The experimental results on the developed model yielded an average accuracy of 99.58%.
Keywords: convolution, feature extraction, image analysis, validation, precision agriculture
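The feature-extraction step at the heart of such a convolutional network reduces to repeated 2D convolutions of image patches with learned kernels. As an illustrative sketch only (not the authors' model), the basic operation can be written in pure Python; the edge kernel and patch values below are invented for demonstration:

```python
def conv2d_valid(image, kernel):
    """'Valid' 2D cross-correlation, the core operation a CNN layer applies
    when extracting features from an image patch (lists of lists of floats)."""
    kh, kw = len(kernel), len(kernel[0])
    ih, iw = len(image), len(image[0])
    out = []
    for i in range(ih - kh + 1):
        row = []
        for j in range(iw - kw + 1):
            acc = 0.0
            for a in range(kh):
                for b in range(kw):
                    acc += image[i + a][j + b] * kernel[a][b]
            row.append(acc)
        out.append(row)
    return out

# A horizontal difference kernel responds where pixel intensity changes
edge_kernel = [[1.0, -1.0]]
patch = [[0.0, 0.0, 1.0, 1.0],
         [0.0, 0.0, 1.0, 1.0]]
response = conv2d_valid(patch, edge_kernel)
```

In a trained network, many such kernels are learned from data and stacked with nonlinearities; this sketch shows only the single linear operation they all share.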
Procedia PDF Downloads 316
25201 Geoelectric Survey for Groundwater Potential in Waziri Umaru Federal Polytechnic, Birnin Kebbi, Nigeria
Authors: Ibrahim Mohammed, Suleiman Taofiq, Muhammad Naziru Yahya
Abstract:
Geoelectrical measurements using the Schlumberger Vertical Electrical Sounding (VES) method were carried out at Waziri Umaru Federal Polytechnic, Birnin Kebbi, Nigeria, with the aim of determining the groundwater potential of the area. Twelve (12) VES data sets were collected using a Terrameter (ABEM SAS 300c) and analyzed with computer software (IPI2win), which gives an automatic interpretation of the apparent resistivity. The interpreted VES data were used to characterize three to five geoelectric layers, from which the aquifer units were delineated. Data analysis indicated that a water-bearing formation exists in the third and fourth layers, with resistivity ranges of 312 to 767 Ωm and 9.51 to 681 Ωm, respectively. The thickness of the formation ranges from 14.7 to 41.8 m, while its depth is from 8.22 to 53.7 m. Based on the interpreted data, five (5) VES stations were recommended as the most viable locations for groundwater exploration in the study area: VES A4, A5, A6, B1, and B2. The VES results for the entire area indicated that the water-bearing formation occurred at a maximum depth of 53.7 m at the time of the survey.
Keywords: aquifer, depth, groundwater, resistivity, Schlumberger
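The apparent resistivity that software such as IPI2win inverts is computed from the Schlumberger electrode geometry. A minimal sketch of that standard conversion (not the authors' code; electrode spacings below are placeholders) is:

```python
import math

def schlumberger_apparent_resistivity(ab_half, mn_half, delta_v, current):
    """Apparent resistivity (ohm-m) from one Schlumberger VES reading.
    ab_half: half current-electrode spacing AB/2 (m)
    mn_half: half potential-electrode spacing MN/2 (m)
    delta_v: measured potential difference (V); current: injected current (A)."""
    # Standard Schlumberger geometric factor K = pi*(L^2 - l^2)/(2l)
    k = math.pi * (ab_half**2 - mn_half**2) / (2.0 * mn_half)
    return k * delta_v / current

# Self-consistency check: over a homogeneous half-space of resistivity rho,
# the surface potential difference is dV = rho*I/pi * 2l/(L^2 - l^2),
# so the formula above should recover rho exactly.
rho_true = 100.0
dv = rho_true * 1.0 / math.pi * (2 * 1.0) / (10.0**2 - 1.0**2)
rho_a = schlumberger_apparent_resistivity(10.0, 1.0, dv, 1.0)
```

Layered (non-homogeneous) ground makes the apparent resistivity vary with AB/2, which is exactly the sounding curve the inversion software fits.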
Procedia PDF Downloads 166
25200 The Integration of Patient Health Record Generated from Wearable and Internet of Things Devices into Health Information Exchanges
Authors: Dalvin D. Hill, Hector M. Castro Garcia
Abstract:
A growing number of individuals use wearable devices on a daily basis. The usage and functionality of these devices vary from user to user. One popular use is to track health-related activities, which are typically stored in the device's memory or uploaded to an account in the cloud; under the current trend, the data accumulated from a wearable device are stored in a standalone location. In many of these cases, these health-related data are not considered in the holistic view of a user's health lifestyle or record. The health-related data generated from wearable and Internet of Things (IoT) devices can serve as empirical information for a medical provider, as the standalone data can add value to the holistic health record of a patient. This paper proposes a solution that incorporates the data gathered from these wearable and IoT devices into a patient's Personal Health Record (PHR) stored within the confines of a Health Information Exchange (HIE).
Keywords: electronic health record, health information exchanges, internet of things, personal health records, wearable devices, wearables
Procedia PDF Downloads 128
25199 System Identification in Presence of Outliers
Authors: Chao Yu, Qing-Guo Wang, Dan Zhang
Abstract:
The outlier detection problem for dynamic systems is formulated as a matrix decomposition problem with low-rank and sparse matrices, and further recast as a semidefinite programming (SDP) problem. A fast algorithm is presented that solves the resulting problem while preserving the solution's matrix structure, greatly reducing the computational cost compared with the standard interior-point method. The computational burden is further reduced by properly constructing subsets of the raw data without violating the low-rank property of the involved matrix. The proposed method can detect outliers exactly when there is no or little noise in the output observations. In the case of significant noise, a novel approach based on under-sampling with averaging is developed to denoise the data while retaining the saliency of the outliers; the filtered data enable successful outlier detection with the proposed method where existing filtering methods fail. Using the "clean" data recovered by the proposed method gives much better parameter estimates than using the raw data.
Keywords: outlier detection, system identification, matrix decomposition, low-rank matrix, sparsity, semidefinite programming, interior-point methods, denoising
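The SDP formulation itself is too involved for a short snippet, but the underlying principle (outliers are the observations a low-complexity model cannot explain) can be illustrated with a much simpler least-squares analogue; the data below are synthetic and the method is a stand-in, not the paper's algorithm:

```python
def flag_outlier(xs, ys):
    """Fit y = a*x + b by ordinary least squares and return the index of
    the sample with the largest absolute residual (the outlier candidate)."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    a = sxy / sxx
    b = my - a * mx
    residuals = [abs(y - (a * x + b)) for x, y in zip(xs, ys)]
    return max(range(n), key=residuals.__getitem__)

xs = list(range(10))
ys = [2.0 * x + 1.0 for x in xs]
ys[6] += 25.0              # inject a single gross outlier
idx = flag_outlier(xs, ys)
```

The matrix-decomposition approach in the paper generalizes this idea: the low-rank part plays the role of the fitted model, and the sparse part collects the large residuals.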
Procedia PDF Downloads 307
25198 Defining a Reference Architecture for Predictive Maintenance Systems: A Case Study Using the Microsoft Azure IoT-Cloud Components
Authors: Walter Bernhofer, Peter Haber, Tobias Mayer, Manfred Mayr, Markus Ziegler
Abstract:
Current preventive maintenance measures are cost-intensive and inefficient. With the sensor data available from state-of-the-art Internet of Things devices, new possibilities for automated data processing emerge. Advances in data science and machine learning enable new, so-called predictive maintenance technologies, which allow data scientists to forecast possible system failures. The goal of this approach is to cut preventive maintenance expenses by automating the detection of possible failures and to improve the efficiency and quality of maintenance measures. Additionally, this approach allows sensor data monitoring to be centralized. This paper describes the approach of three students to defining a reference architecture for a predictive maintenance solution in the Internet of Things domain, with a connected smartphone app for service technicians. The reference architecture is validated by a case study implemented with current Microsoft Azure cloud technologies. The results of the case study show that the reference architecture is valid and can be used to build a system for predictive maintenance with the cloud components of Microsoft Azure. The concepts used are technology-platform agnostic and can be reused on many different cloud platforms. The reference architecture applies to many use cases, such as gas station maintenance, elevator maintenance, and more.
Keywords: case study, internet of things, predictive maintenance, reference architecture
Procedia PDF Downloads 252
25197 Liquid Unloading of Wells with Scaled Perforation via Batch Foamers
Authors: Erwin Chan, Aravind Subramaniyan, Siti Abdullah Fatehah, Steve Lian Kuling
Abstract:
Foam-assisted lift technology is proven across the industry to provide efficient deliquification of gas wells. Such deliquification is typically achieved by delivering the foamer chemical downhole via capillary strings. In highly liquid-loaded wells where capillary strings are not readily available, foamer can be delivered via batch injection or bull-heading. The latter techniques differ from the former in that capillary strings allow liquid to be unloaded continuously, whereas foamer batches require periodic batching for the liquid to be unloaded. Although batch injection allows liquid to be unloaded in wells with a suitable water-to-gas ratio (WGR) and condensate-to-gas ratio (CGR) without well intervention for capillary string installation, the technique comes with its own set of challenges: for foamer to deliquify the well, the chemical needs to reach the perforation locations where gas bubbling is observed. In heavily scaled perforation zones in certain wells, foamer delivered in batches is unable to reach the gas bubbling zone, resulting in poor lift efficiency. This paper discusses the techniques and challenges of unloading liquid via batch injection in scaled-perforation wells X and Y, whose WGR is 6 bbl/MMscf, whose scale build-up is observed at the bottom of the perforation interval, whose water column is 400 feet, and whose "bubbling zone" is less than 100 feet. Variables such as foamer Z dosage, batching technique, and well flow control valve opening times were manipulated during the trial to achieve maximum liquid unloading and gas rates. During the field trial, the team found optimal values for the three aforementioned parameters that gave the best unloading results, with each cycle's gas and liquid rates compared against baselines at similar flowing tubing head pressures (FTHP). It was discovered that, among other factors, a good agitation technique is the primary determinant of efficient liquid unloading. An average increment of 2 MMscf/d against an average production of 4 MMscf/d at stable FTHP was recorded during the trial.
Keywords: foam, foamer, gas lift, liquid unloading, scale, batch injection
Procedia PDF Downloads 184
25196 Assessment of the Effect of Cu and Zn on the Growth of Two Chlorophytic Microalgae
Authors: Medina O. Kadiri, John E. Gabriel
Abstract:
Heavy metals are metallic elements with a relatively high density, at least five times that of water. Sources of heavy metal pollution in the environment include industrial, medical, agricultural, pharmaceutical, and domestic effluents; atmospheric sources; and mining, foundry, smelting, and other heavy metal-based operations. Although some heavy metals are required in trace quantities for biological metabolism, higher concentrations are toxic; others are distinctly toxic and have no biological function. Microalgae are the primary producers of aquatic ecosystems and therefore the foundation of the aquatic food chain. A laboratory study investigated the effects of copper and zinc on two chlorophytes, Chlorella vulgaris and Dictyosphaerium pulchellum, at concentrations of 0 mg/l, 2 mg/l, 4 mg/l, 6 mg/l, 8 mg/l, 10 mg/l, and 20 mg/l. The growth of the test microalgae was determined every two days for 14 days. The results showed that the effects of the test heavy metals were concentration-dependent. Of the two species tested, Chlorella vulgaris showed appreciable growth up to an 8 mg/l concentration of zinc. Dictyosphaerium pulchellum had only minimal growth at the various copper concentrations except 2 mg/l, at which growth was relatively higher. Growth in the control was remarkably higher than at the other concentrations. In general, the growth of both test algae was consistently inhibited by the heavy metals, with copper inhibiting growth more strongly than zinc. Chlorella vulgaris can be used for bioremediation of high concentrations of zinc, and the potential of many other microalgae for heavy metal bioremediation can be explored.
Keywords: heavy metals, green algae, microalgae, pollution
Procedia PDF Downloads 195
25195 Predictive Maintenance: Machine Condition Real-Time Monitoring and Failure Prediction
Authors: Yan Zhang
Abstract:
Predictive maintenance is a technique for predicting when an in-service machine will fail so that maintenance can be planned in advance. Analytics-driven predictive maintenance is gaining increasing attention in many industries, such as manufacturing, utilities, and aerospace, along with the emerging demand for Internet of Things (IoT) applications and the maturity of technologies that support Big Data storage and processing. This study builds an end-to-end analytics solution that includes both real-time machine condition monitoring and machine learning-based predictive analytics. The goal is to showcase a general predictive maintenance solution architecture, which suggests how the data generated by field machines can be collected, transmitted, stored, and analyzed. A publicly available aircraft engine run-to-failure dataset is used to illustrate the streaming analytics component and the batch failure prediction component. The contributions of this study span four aspects. First, we compare predictive maintenance problems as viewed by the traditional reliability-centered maintenance field and by IoT applications. In the IoT era, predictive maintenance has shifted its focus from ensuring reliable machine operation to improving production and maintenance efficiency via any maintenance-related task. It covers a variety of topics, including but not limited to failure prediction, fault forecasting, failure detection and diagnosis, and recommendation of maintenance actions after failure. Second, we review the state-of-the-art technologies that enable a machine or device to transmit data all the way to the Cloud for storage and advanced analytics. These technologies vary drastically, mainly depending on the power source and functionality of the devices. For example, a consumer machine such as an elevator uses completely different data transmission protocols from the sensor units in an environmental sensor network: the former may transfer data into the Cloud directly via WiFi, while the latter usually uses the radio communication inherent in the network, with data stored in a staging node before it can be transmitted to the Cloud when necessary. Third, we illustrate how to formulate a machine learning problem to predict machine faults and failures. By showing a step-by-step process of data labeling, feature engineering, model construction, and evaluation, we share the following experience: (1) the specific data quality issues that have a crucial impact on predictive maintenance use cases; and (2) how to train and evaluate a model when the training data contain inter-dependent records. Fourth, we review the tools available to build a data pipeline that digests the data and produces insights, including tools for data ingestion, streaming data processing, and machine learning model training, as well as the tool that coordinates and schedules the different jobs. In addition, we show the visualization tool that creates rich data visualizations for both real-time insights and prediction results. There are two key takeaways from this study: (1) it summarizes the landscape and challenges of predictive maintenance applications, and (2) it uses an aerospace example with publicly available data to illustrate each component of the proposed data pipeline and showcases how the solution can be deployed as a live demo.
Keywords: Internet of Things, machine learning, predictive maintenance, streaming data
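The data-labeling step described above, which turns run-to-failure histories into supervised learning targets, can be sketched as follows. The unit name and horizon value are illustrative assumptions, not taken from the paper:

```python
def label_run_to_failure(cycles_per_unit, horizon):
    """Given {unit_id: number_of_cycles_until_failure}, produce per-cycle
    training labels: the remaining useful life (RUL) and a binary
    'fails within `horizon` cycles' flag, framing failure prediction
    as either regression (rul) or classification (fail_soon)."""
    rows = []
    for unit, max_cycle in cycles_per_unit.items():
        for cycle in range(1, max_cycle + 1):
            rul = max_cycle - cycle
            rows.append({"unit": unit, "cycle": cycle,
                         "rul": rul, "fail_soon": int(rul <= horizon)})
    return rows

labels = label_run_to_failure({"engine_1": 5}, horizon=2)
```

Note the inter-dependence the abstract warns about: all rows from one unit share a failure event, so train/test splits should be made per unit, not per row.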
Procedia PDF Downloads 386
25194 Road Condition Monitoring Using Built-in Vehicle Technology Data, Drones, and Deep Learning
Authors: Judith Mwakalonge, Geophrey Mbatta, Saidi Siuhi, Gurcan Comert, Cuthbert Ruseruka
Abstract:
Transportation agencies worldwide continuously monitor the condition of their roads to minimize maintenance costs and maintain public safety and ride quality. Existing road condition surveys involve manual observation of roads using standard survey forms, carried out by qualified road condition surveyors or engineers either on foot or by vehicle. Automated road condition survey vehicles exist; however, they are very expensive, since they require special vehicles equipped with sensors for data collection together with data processing and computing devices. The manual methods are expensive, time-consuming, and infrequent, and can hardly provide real-time information on road conditions. This study contributes to this arena by utilizing built-in vehicle technologies, drones, and deep learning to automate road condition surveys at low cost. A single model is trained to capture flexible pavement distresses (potholes, rutting, cracking, and raveling), providing a more cost-effective and efficient road condition monitoring approach that can also deliver real-time road conditions. Additionally, data fusion is employed to enhance the road condition assessment with data from both vehicles and drones.
Keywords: road conditions, built-in vehicle technology, deep learning, drones
Procedia PDF Downloads 124
25193 Enhancing Student Learning Outcomes Using Engineering Design Process: Case Study in Physics Course
Authors: Thien Van Ngo
Abstract:
The engineering design process is a systematic approach to solving problems: identifying a problem, brainstorming solutions, prototyping and testing solutions, and evaluating the results. It can be used to teach students to solve problems in a creative and innovative way. The aim of this study was to investigate the effectiveness of using the engineering design process to enhance student learning outcomes in a physics course. A mixed-methods design was used. The sample was 150 first-year students in the Department of Mechanical Engineering Technology at Cao Thang Technical College in Vietnam in the 2022-2023 school year. The quantitative data were collected using a pretest-posttest control group design: the pretest was administered to both groups at the beginning of the study and the posttest at the end. The qualitative data were collected through semi-structured interviews with a sample of eight students from the experimental group, conducted after the posttest. The quantitative data were analyzed using independent-samples t-tests; the qualitative data were analyzed using thematic analysis. The quantitative results showed that students in the experimental group, who were taught using the engineering design process, had significantly higher posttest scores on physics problem-solving than students in the control group, who were taught using the conventional method. The qualitative data showed that students in the experimental group were more motivated and engaged in the learning process than students in the control group, and reported that they found the engineering design process a more effective way of learning physics. The findings suggest that the engineering design process can be an effective way to enhance student learning outcomes in physics courses: it engages students in the learning process and helps them develop problem-solving skills.
Keywords: engineering design process, problem-solving, learning outcome of physics, students' physics competencies, deep learning
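The independent-samples t-test used for the quantitative comparison can be sketched in a few lines. This sketch uses Welch's variant, which does not assume equal group variances (the paper does not state which variant was used), and the sample scores are invented:

```python
import math

def welch_t(sample_a, sample_b):
    """Welch's t statistic and degrees of freedom for two independent
    samples (e.g. posttest scores of experimental vs control group)."""
    na, nb = len(sample_a), len(sample_b)
    ma = sum(sample_a) / na
    mb = sum(sample_b) / nb
    # Unbiased sample variances
    va = sum((x - ma) ** 2 for x in sample_a) / (na - 1)
    vb = sum((x - mb) ** 2 for x in sample_b) / (nb - 1)
    se2 = va / na + vb / nb
    t = (ma - mb) / math.sqrt(se2)
    # Welch-Satterthwaite approximation for degrees of freedom
    df = se2 ** 2 / ((va / na) ** 2 / (na - 1) + (vb / nb) ** 2 / (nb - 1))
    return t, df

t, df = welch_t([1.0, 2.0, 3.0, 4.0], [2.0, 3.0, 4.0, 5.0])
```

The p-value would then be looked up from the t-distribution with `df` degrees of freedom, which in practice is delegated to a statistics library.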
Procedia PDF Downloads 65
25192 Using Business Intelligence Capabilities to Improve the Quality of Decision-Making: A Case Study of Mellat Bank
Authors: Jalal Haghighat Monfared, Zahra Akbari
Abstract:
Today, business executives need useful information to make better decisions. Banks, too, have been using information tools so that they can direct the decision-making process toward their desired goals by rapidly extracting information from sources with the help of business intelligence. This research investigates whether there is a relationship between the quality of decision making and the business intelligence capabilities of Mellat Bank. Each of the factors studied is divided into several components, and these components and their relationships are measured by a questionnaire. The statistical population of the study consists of all managers and experts of Mellat Bank's general departments (190 people) who use business intelligence reports. The sample size of 123 was determined randomly by statistical methods. Statistical inference was used for data analysis and hypothesis testing: first, the normality of the data was examined using the Kolmogorov-Smirnov test; next, the construct validity of both variables and their resulting indexes was verified using confirmatory factor analysis; finally, the research hypotheses were tested using structural equation modeling and Pearson's correlation coefficient. The results confirmed a positive relationship between decision quality and business intelligence capabilities at Mellat Bank. Among the various capabilities, including data quality, integration with other systems, user access, flexibility, and risk management support, the flexibility of the business intelligence system was the most strongly correlated with the dependent variable. This shows that Mellat Bank needs to pay more attention to choosing business intelligence systems with high flexibility, particularly the ability to produce custom-formatted reports. The quality of the data in the business intelligence systems showed the next strongest relationship with decision-making quality. Therefore, improving data quality, including the source of the data (internal or external), the type of data (quantitative or qualitative), the credibility of the data, and the perceptions of those who use the business intelligence system, improves the quality of decision making at Mellat Bank.
Keywords: business intelligence, business intelligence capability, decision making, decision quality
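Pearson's correlation coefficient, used above for hypothesis testing, is straightforward to compute; a minimal sketch with invented paired scores:

```python
import math

def pearson_r(xs, ys):
    """Pearson product-moment correlation between two paired variables,
    e.g. a BI-capability score and a decision-quality score per respondent.
    Returns a value in [-1, 1]."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

r_pos = pearson_r([1.0, 2.0, 3.0], [2.0, 4.0, 6.0])  # perfect positive trend
r_neg = pearson_r([1.0, 2.0, 3.0], [6.0, 4.0, 2.0])  # perfect negative trend
```

In the study this coefficient is computed between each capability component and the decision-quality measure; significance testing then accounts for the sample size of 123.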
Procedia PDF Downloads 112
25191 On the Volume of Ganglion Cell Stimulation in Visual Prostheses by Finite Element Discretization
Authors: Diego Luján Villarreal
Abstract:
Visual prostheses are designed to restore some eyesight to patients blinded by photoreceptor diseases such as retinitis pigmentosa (RP) and age-related macular degeneration (AMD). Electrode-to-cell proximity has drawn attention due to its implications for secure, single-localized stimulation, yet few techniques are available for understanding the relationship between the number of cells activated and the injected current. We propose such a technique: solving the governing equation for time-dependent electrical currents using finite element discretization to obtain the volume of stimulation.
Keywords: visual prosthetic devices, volume for stimulation, FEM discretization, 3D simulation
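As a rough analytic sanity check against which an FEM solution could be validated (not the paper's time-dependent model), the potential of a point current source in an infinite homogeneous medium gives a closed-form stimulation radius and volume; the conductivity and threshold values below are placeholders:

```python
import math

def stimulation_radius(current, sigma, v_threshold):
    """Distance (m) out to which a point current source in a homogeneous
    medium of conductivity sigma (S/m) keeps the extracellular potential
    above v_threshold (V), using V(r) = I / (4*pi*sigma*r)."""
    return current / (4.0 * math.pi * sigma * v_threshold)

def stimulation_volume(current, sigma, v_threshold):
    """Volume (m^3) of the sphere where the potential exceeds threshold."""
    r = stimulation_radius(current, sigma, v_threshold)
    return (4.0 / 3.0) * math.pi * r ** 3

# Doubling the injected current doubles the radius, hence 8x the volume
r1 = stimulation_radius(1e-6, 0.1, 0.01)
r2 = stimulation_radius(2e-6, 0.1, 0.01)
```

An FEM discretization reproduces this closed form in the homogeneous case and then extends it to realistic layered retinal geometry and time-dependent currents, which have no such analytic solution.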
Procedia PDF Downloads 73
25190 Effects of Sensory Integration Techniques in Science Education of Autistic Students
Authors: Joanna Estkowska
Abstract:
Sensory integration methods are very useful and improve the daily functioning of autistic and mentally disabled children. Autism is a neurobiological disorder that impairs one's ability to communicate with and relate to others, as well as one's sensory system. Children with autism, even highly functioning kids, can find it difficult to process language amid surrounding noise or smells; they are hypersensitive to things we can ignore, such as sights, sounds, and touch. Adolescents with highly functioning autism or Asperger syndrome can study science and math, but the social aspect is difficult for them. Natural science is an area of study that attracts many of these kids: it is a systematic field in which children can focus on a small aspect, and following set rules leads to an expected result. A sensory integration program and systematic classroom observation are quantitative methods of measuring classroom functioning and behaviors from direct observation; these methods specify both the events and behaviors to be observed and how they are to be recorded. Our students with and without autism attended lessons in the natural science classroom at the school and in the laboratory of the University of Science and Technology in Bydgoszcz. The aim of this study is to investigate the effects of sensory integration methods in teaching students with autism. The students were observed during experimental lessons in the classroom and in the laboratory; their physical characteristics, sensory dysfunction, and behavior in class were taken into consideration by comparing their similarities and differences. In the chemistry classroom, every autistic student is paired with a mentor from their school. In the laboratory, the children are expected to wear goggles, gloves, and a lab coat. The chemistry classes in the laboratory were held for four hours with a lunch break, and according to the assistants, the children were engaged the whole time. In the natural science classroom, the students are encouraged to use the interactive exhibition of chemical, physical, and mathematical models constructed by the author of this paper. The teacher's goals are to assist the child in inhibiting and modulating sensory information and to support the child in processing a response to sensory stimulation.
Keywords: autism spectrum disorder, science education, sensory integration techniques, student with special educational needs
Procedia PDF Downloads 192
25189 Using TRACE and SNAP Codes to Establish the Model of Maanshan PWR for SBO Accident
Authors: B. R. Shen, J. R. Wang, J. H. Yang, S. W. Chen, C. Shih, Y. Chiang, Y. F. Chang, Y. H. Huang
Abstract:
In this research, the TRACE code with the interface code SNAP was used to simulate and analyze a station blackout (SBO) accident in the Maanshan pressurized water reactor (PWR) nuclear power plant (NPP). There were four main steps. First, the SBO accident data of Maanshan NPP were collected. Second, the TRACE/SNAP model of Maanshan NPP was established using these data. Third, this model was used to simulate and analyze the SBO accident. Finally, the simulation and analysis of the SBO with mitigation equipment was performed. The TRACE results are consistent with the Maanshan NPP data, and according to the TRACE predictions, the mitigation equipment can maintain the safety of Maanshan during an SBO.
Keywords: pressurized water reactor (PWR), TRACE, station blackout (SBO), Maanshan
Procedia PDF Downloads 194
25188 A Comparative and Doctrinal Analysis towards the Investigation of a Right to Be Forgotten in Hong Kong
Authors: Jojo Y. C. Mo
Abstract:
Memories are good: they remind us of people, places, and experiences that we cherish. But memories cannot be changed, and there may well be memories that we do not want to remember. This is particularly true of information that causes us embarrassment and humiliation, or that is simply private; we all want to erase or delete such information. This desire to delete was recently recognized by the Court of Justice of the European Union in the 2014 case Google Spain SL, Google Inc. v Agencia Española de Protección de Datos, Mario Costeja González, in which the court ordered Google to remove links to information about the complainant that he wished to have removed. This so-called 'right to be forgotten' received serious attention, and, significantly, the European Council and the European Parliament enacted the General Data Protection Regulation (GDPR) to provide a more structured and normative framework for implementing the right to be forgotten across the EU. This development in data protection law will undoubtedly have a significant impact on companies and corporations both within the EU and outside it. Hong Kong, being one of the world's leading financial and commercial centers as well as one of the first jurisdictions in Asia to implement a comprehensive piece of data protection legislation, is therefore a jurisdiction worth examining. This article investigates: (a) whether there is a right to be forgotten under existing Hong Kong data protection legislation; and (b) if not, whether such a provision is necessary, and why. The article uses a comparative methodology based on a study of primary and secondary sources, including scholarly articles, government and law commission reports and working papers, and relevant international treaties, constitutional documents, case law, and legislation. The author primarily engages in literature and case-law review as well as comparative and doctrinal analyses. The completed article will provide privacy researchers with more concrete principles and data for further research on privacy and data protection in Hong Kong and internationally, and will provide a basis for policy makers in assessing the rationale and need for a right to be forgotten in Hong Kong.
Keywords: privacy, right to be forgotten, data protection, Hong Kong
Procedia PDF Downloads 190
25187 Damage Assessment Based on Full-Polarimetric Decompositions in the 2017 Colombia Landslide
Authors: Hyeongju Jeon, Yonghyun Kim, Yongil Kim
Abstract:
Synthetic Aperture Radar (SAR) is an effective tool for assessing disaster-induced damage because of its all-weather, day-and-night acquisition capability. In this paper, the 2017 Colombia landslide was observed using full-polarimetric ALOS/PALSAR-2 data. Polarimetric decompositions, including the Freeman-Durden decomposition and the Cloude decomposition, are utilized to analyze the changes in scattering mechanisms before and after the landslide, and these analyses are used to detect the areas damaged by it. Experimental results validate the efficiency of full-polarimetric SAR data, as the damaged areas can be well discriminated. We therefore conclude that the proposed method has great potential for damage assessment of landslides.
Keywords: Synthetic Aperture Radar (SAR), polarimetric decomposition, damage assessment, landslide
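A full Freeman-Durden or Cloude decomposition is beyond a short snippet, but the change-detection step that follows it can be sketched generically: compare any per-pixel scattering-power channel between the pre- and post-event scenes with the log-ratio operator. This is a simplified stand-in for the paper's analysis, and the pixel values and threshold are invented:

```python
import math

def log_ratio_change(pre_power, post_power, threshold):
    """Flag pixels whose backscattered power changed strongly between the
    pre- and post-event scenes using the log-ratio |log(post/pre)|,
    which is robust to multiplicative calibration differences."""
    flags = []
    for pre, post in zip(pre_power, post_power):
        flags.append(abs(math.log(post / pre)) > threshold)
    return flags

pre  = [1.0, 1.0, 1.0, 1.0]
post = [1.1, 0.9, 5.0, 1.0]   # third pixel: scattering mechanism changed
flags = log_ratio_change(pre, post, 1.0)
```

In the full-polarimetric case, the channels compared would be the decomposition's surface, double-bounce, and volume scattering powers rather than a single backscatter value.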
Procedia PDF Downloads 391
25186 Spatial Mental Imagery in Students with Visual Impairments when Learning Literal and Metaphorical Uses of Prepositions in English as a Foreign Language
Authors: Natalia Sáez, Dina Shulfman
Abstract:
There is an important research gap regarding accessible pedagogical techniques for teaching foreign languages to adults with visual impairments. English as a foreign language (EFL), in particular, is needed in many countries to expand occupational opportunities and improve living standards. Within EFL research, teaching and learning prepositions have only recently gained momentum, considering that they constitute one of the most difficult structures to learn in a foreign language and are fundamental for communicating about spatial relations in the world, both on the physical and imaginary levels. Learning to use prepositions would not only facilitate communication when referring to the surrounding tangible environment but also when conveying ideas about abstract topics (e.g., justice, love, society), for which students’ sociocultural knowledge about space could play an important role. By potentiating visually impaired students’ ability to construe mental spatial imagery, this study made efforts to explore pedagogical techniques that cater to their strengths, helping them create new worlds by welcoming and expanding their sociocultural funds of knowledge as they learn to use English prepositions. Fifteen visually impaired adults living in Chile participated in the study. Their first language was Spanish, and they were learning English at the intermediate level of proficiency in an EFL workshop at La Biblioteca Central para Ciegos (The Central Library for the Blind). Within this workshop, a series of activities and interviews were designed and implemented with the intention of uncovering students’ spatial funds of knowledge when learning literal/physical uses of three English prepositions, namely “in,” “at,” and “on”. The activities and interviews also explored whether students used their original spatial funds of knowledge when learning metaphorical uses of these prepositions and if their use of spatial imagery changed throughout the learning activities. 
Over the course of approximately half a year, it became clear that the students construed mental images of space when learning both literal/physical and metaphorical uses of these prepositions. This research could inform a new approach to inclusive language education, using pedagogical methods that are relevant and accessible to students with visual impairments.
Keywords: EFL, funds of knowledge, prepositions, spatial cognition, visually impaired students
Procedia PDF Downloads 79
25185 Using Historical Data for Stock Prediction
Authors: Sofia Stoica
Abstract:
In this paper, we use historical data to predict the stock price of a tech company. To this end, we use a dataset consisting of the stock prices over the past five years of ten major tech companies: Adobe, Amazon, Apple, Facebook, Google, Microsoft, Netflix, Oracle, Salesforce, and Tesla. We experimented with a variety of models (a linear regression model, k-nearest neighbors (KNN), and a sequential neural network) and algorithms (Multiplicative Weight Update and AdaBoost). We found that the sequential neural network performed best, with a testing error of 0.18%. Interestingly, the linear model performed second best, with a testing error of 0.73%. These results show that historical data alone is enough to obtain high accuracy, and that a simple algorithm like linear regression performs similarly to more sophisticated models while taking less time and fewer resources to implement.
Keywords: finance, machine learning, opening price, stock market
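As an illustration of the simplest model the abstract credits with near-best performance, here is a hedged sketch of one-step-ahead price prediction by simple linear regression on lagged closing prices. The price series is invented for the example and is not the paper's dataset.

```python
# Minimal sketch (not the authors' code): predict the next closing
# price from the previous one via closed-form ordinary least squares.

prices = [100.0, 101.5, 103.0, 104.5, 106.0, 107.5, 109.0]  # hypothetical closes

# Build (previous price, next price) training pairs.
xs = prices[:-1]
ys = prices[1:]
n = len(xs)

mean_x = sum(xs) / n
mean_y = sum(ys) / n

# Simple linear regression: y = intercept + slope * x.
slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
        sum((x - mean_x) ** 2 for x in xs)
intercept = mean_y - slope * mean_x

def predict(prev_price):
    return intercept + slope * prev_price

next_price = predict(prices[-1])
```

On real data, the same fit would be computed per company, with the testing error measured on a held-out portion of the five-year series.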
Procedia PDF Downloads 191
25184 Cultivation of High-value Patent from the Perspective of Knowledge Diffusion: A Case Study of the Power Semiconductor Field
Authors: Lin Qing
Abstract:
[Objective/Significance] The cultivation of high-value patents is both the focus and the difficulty of patent work, and it is of great significance to building a country strong in intellectual property rights. This work should not only pay attention to existing patent applications but also start before application, exploring the high-value technical solutions that form the core of high-value patents. [Methods/Processes] Following the principle of scientific and technological knowledge diffusion, this study examines top academic conference papers and the patent applications they cite, taking the power semiconductor field as an example. Factual data are used to show the feasibility and rationality of mining technical solutions from high-quality research results to foster high-value patents, the actual benefits of these achievements to the industry are stated, and patent protection suggestions are given for Chinese applicants in light of the state of the field. [Results/Conclusion] The research shows that the quality of the patent applications cited by ISPSD papers is significantly higher than the field average, and that the ability of Chinese applicants to protect related achievements with patents needs to be improved. This study provides a practical and highly targeted reference for patent administrators and researchers, and also makes a positive exploration toward putting the spirit of 'breaking the five rules' into practice.
Keywords: high-value patent cultivation, technical solutions, knowledge diffusion, top academic conference papers, intellectual property information analysis
Procedia PDF Downloads 130
25183 Supervised Learning for Cyber Threat Intelligence
Authors: Jihen Bennaceur, Wissem Zouaghi, Ali Mabrouk
Abstract:
The major aim of cyber threat intelligence (CTI) is to provide sophisticated knowledge about cybersecurity threats to ensure internal and external safeguards against modern cyberattacks. Inaccurate, incomplete, outdated, and low-value threat intelligence is the main problem. Therefore, data analysis based on AI algorithms is one of the emergent solutions for overcoming threat-information-sharing issues. In this paper, we propose a supervised machine learning-based algorithm to improve threat-information sharing by providing a sophisticated classification of cyber threats and data. Extensive simulations investigate accuracy, precision, recall, F1-score, and support to validate the designed algorithm and to compare it with several supervised machine learning algorithms.
Keywords: threat information sharing, supervised learning, data classification, performance evaluation
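The evaluation metrics the abstract lists (precision, recall, F1-score) can be made concrete with a toy supervised classifier. The feature vectors and labels below are invented; the paper's actual algorithm and threat data are not reproduced here.

```python
# Hedged sketch: a nearest-centroid classifier over made-up threat
# feature vectors (e.g., normalized packet rate, failed-login rate),
# followed by the evaluation metrics the paper reports.

train = [
    ((0.1, 0.2), 0), ((0.2, 0.1), 0), ((0.15, 0.15), 0),   # benign
    ((0.9, 0.8), 1), ((0.8, 0.9), 1), ((0.85, 0.95), 1),   # malicious
]

# One centroid per class.
centroids = {}
for label in (0, 1):
    pts = [x for x, y in train if y == label]
    centroids[label] = tuple(sum(c) / len(pts) for c in zip(*pts))

def classify(x):
    def dist2(a, b):
        return sum((u - v) ** 2 for u, v in zip(a, b))
    return min(centroids, key=lambda lbl: dist2(x, centroids[lbl]))

test_set = [((0.05, 0.1), 0), ((0.9, 0.9), 1), ((0.2, 0.25), 0), ((0.7, 0.8), 1)]
preds = [classify(x) for x, _ in test_set]
truth = [y for _, y in test_set]

# Precision, recall, and F1 for the "malicious" class.
tp = sum(p == 1 and t == 1 for p, t in zip(preds, truth))
fp = sum(p == 1 and t == 0 for p, t in zip(preds, truth))
fn = sum(p == 0 and t == 1 for p, t in zip(preds, truth))
precision = tp / (tp + fp)
recall = tp / (tp + fn)
f1 = 2 * precision * recall / (precision + recall)
```

Support, the remaining metric, is simply the number of true instances per class in the test set.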
Procedia PDF Downloads 150
25182 Methodologies, Findings, Discussion, and Limitations in Global, Multi-Lingual Research: We Are All Alone - Chinese Internet Drama
Authors: Patricia Portugal Marques de Carvalho Lourenco
Abstract:
A three-phase methodological multi-lingual path was designed, constructed, and carried out using the 2020 Chinese internet drama series We Are All Alone as a case study. Phase one, the backbone of the research, comprised secondary data analysis, providing the structure on which the next two phases would be built. It incorporated a Google Scholar and a Baidu Index analysis, the Star Network Influence Index, and the top two drama reviews on Mydramalist.com, along with an article written about the drama and scrutiny of related Chinese blogs and websites. Phase two was field research carried out across Latin Europe, and phase three was social media focused, taking into account that perceptions are memory-conditioned and based on the recall of past ideas. Overall, the research has shown the poor cultural presence of Chinese entertainment in Latin Europe and demonstrated the absence of Chinese content in French, Italian, Portuguese, and Spanish business-to-consumer retailers; a reflection of its low significance in Latin European markets and of the short life cycle of entertainment products in general: bubble-gum, disposable goods without a mid- to long-term effect on consumers' lives. The process of conducting comprehensive international research was complex and time-consuming, given the researcher's linguistic limitations, limited knowledge of Chinese culture, and issues of cultural equivalence. Despite the steps taken to minimize them, theoretical limitations concerning Latin Europe and China still arose. Data accuracy was disputable; sampling and data collection/analysis methods were heterogeneous; and ascertaining the data requirements and the method of analysis needed to achieve construct equivalence was challenging and laborious to operationalize.
Secondary data were also not always readily available in Mandarin; yet, in spite of the array of limitations, the research was done and results were produced.
Keywords: research methodologies, international research, primary data, secondary data, research limitations, online dramas, China, Latin Europe
Procedia PDF Downloads 68
25181 Ischemic Stroke Detection in Computed Tomography Examinations
Authors: Allan F. F. Alves, Fernando A. Bacchim Neto, Guilherme Giacomini, Marcela de Oliveira, Ana L. M. Pavan, Maria E. D. Rosa, Diana R. Pina
Abstract:
Stroke is a worldwide concern; in Brazil alone, it accounts for 10% of all registered deaths. There are two stroke types, ischemic (87%) and hemorrhagic (13%). Early diagnosis is essential to avoid irreversible cerebral damage. Non-enhanced computed tomography (NECT) is one of the main diagnostic techniques used, due to its wide availability and rapid diagnosis. Detection depends on the size and severity of the lesions and on the time elapsed between the first symptoms and the examination. The Alberta Stroke Program Early CT Score (ASPECTS) is a subjective method that increases the detection rate. The aim of this work was to implement an image segmentation system to enhance ischemic stroke and to quantify the area of ischemic and hemorrhagic stroke lesions in CT scans. We evaluated 10 patients with NECT examinations diagnosed with ischemic stroke. Analyses were performed on two axial slices, one at the level of the thalamus and basal ganglia and one adjacent to the top edge of the ganglionic structures, with a window width between 80 and 100 Hounsfield units. We used different image processing techniques such as morphological filters, the discrete wavelet transform, and Fuzzy C-means clustering. Subjective analyses were performed by a neuroradiologist according to the ASPECTS scale to quantify ischemic areas in the middle cerebral artery region. These subjective results were compared with objective analyses performed by the computational algorithm. Preliminary results indicate that the morphological filters do improve the ischemic areas for subjective evaluation. The comparison between the area of the ischemic region contoured by the neuroradiologist and the area defined by the computational algorithm showed no deviations greater than 12% in any of the 10 examinations, although the areas contoured by the neuroradiologist tended to be smaller than those obtained by the algorithm.
These results show the importance of computer-aided diagnosis software to assist neuroradiology decisions, especially in critical situations such as the choice of treatment for ischemic stroke.
Keywords: ischemic stroke, image processing, CT scans, Fuzzy C-means
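Of the techniques listed, Fuzzy C-means clustering is the most self-contained to sketch. The following is an illustrative 1-D implementation (fuzzifier m = 2) on hypothetical Hounsfield-unit intensities; it is not the authors' pipeline.

```python
# Fuzzy C-means sketch on made-up 1-D pixel intensities: two clusters,
# alternating membership and center updates until convergence.

data = [20.0, 22.0, 24.0, 26.0, 80.0, 82.0, 84.0, 86.0]  # hypothetical HU values
centers = [0.0, 100.0]   # initial guesses for the two cluster centers
m = 2.0                  # fuzzifier

for _ in range(50):
    # Membership update: u_ik = 1 / sum_j (d_ik / d_jk)^(2/(m-1)).
    u = []
    for x in data:
        dists = [abs(x - c) or 1e-12 for c in centers]  # guard zero distance
        row = [1.0 / sum((d_i / d_j) ** (2.0 / (m - 1.0)) for d_j in dists)
               for d_i in dists]
        u.append(row)
    # Center update: c_k = sum_i u_ik^m x_i / sum_i u_ik^m.
    centers = [
        sum(u[i][k] ** m * data[i] for i in range(len(data))) /
        sum(u[i][k] ** m for i in range(len(data)))
        for k in range(len(centers))
    ]
```

In a real scan, the same update rules would run over the windowed pixel intensities of each slice, with the low-attenuation cluster flagged as candidate ischemic tissue.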
Procedia PDF Downloads 366
25180 Assessment of Hepatosteatosis Among Diabetic and Nondiabetic Patients Using Biochemical Parameters and Noninvasive Imaging Techniques
Authors: Tugba Sevinc Gamsiz, Emine Koroglu, Ozcan Keskin
Abstract:
Aim: Nonalcoholic fatty liver disease (NAFLD) is considered the most common chronic liver disease in the general population. The higher mortality and morbidity among NAFLD patients and the lack of symptoms make early detection and management important. In our study, we aimed to evaluate the relationship between noninvasive imaging and biochemical markers in diabetic and nondiabetic patients diagnosed with NAFLD. Materials and Methods: The study was conducted from September 2017 to December 2017 on adults admitted to the Internal Medicine and Gastroenterology outpatient clinics with hepatic steatosis reported on ultrasound or transient elastography within the previous six months, excluding patients with other liver diseases or alcohol abuse. The data were collected and analyzed retrospectively. The Number Cruncher Statistical System (NCSS) 2007 program was used for statistical analysis. Results: 116 patients were included in this study. Diabetic patients had significantly higher Controlled Attenuation Parameter (CAP), Liver Stiffness Measurement (LSM), and fibrosis values than nondiabetics. Hypertension, hepatomegaly, high BMI, hypertriglyceridemia, hyperglycemia, high A1c, and hyperuricemia were also found to be risk factors for NAFLD progression to fibrosis. Advanced fibrosis (F3, F4) was present in 18.6% of all our patients: 35.8% of diabetic and 5.7% of nondiabetic patients diagnosed with hepatic steatosis. Conclusion: Transient elastography is now used in daily clinical practice as an accurate noninvasive tool for the follow-up of patients with fatty liver. Early diagnosis of the stage of liver fibrosis improves the monitoring and management of patients, especially those meeting metabolic syndrome criteria.
Keywords: diabetes, elastography, fatty liver, fibrosis, metabolic syndrome
Procedia PDF Downloads 152
25179 Optimizing the Efficiency of Measuring Instruments in Ouagadougou-Burkina Faso
Authors: Moses Emetere, Marvel Akinyemi, S. E. Sanni
Abstract:
At the moment, the AERONET and AMMA databases show a large volume of data loss. With only about 47% of the data set available to scientists, it is evident that an accurate nowcast or forecast cannot be guaranteed. The calibration constants of most radiosondes or weather stations are not compatible with the atmospheric conditions of the West African climate. A dispersion model was developed to incorporate salient mathematical representations such as a unified number, derived to describe the turbulence of aerosol transport in the frictional layer of the lower atmosphere. Fourteen years of data from the Multi-angle Imaging SpectroRadiometer (MISR) were tested using the dispersion model. A yearly estimation of the atmospheric constants over Ouagadougou was obtained with the model at about 87.5% accuracy. It further revealed that the average atmospheric constants for Ouagadougou, Burkina Faso are a_1 = 0.626 and a_2 = 0.7999, and the tuning constants are n_1 = 0.09835 and n_2 = 0.266. The yearly atmospheric constants also affirmed that the lower atmosphere of Ouagadougou is very dynamic. Hence, it is recommended that radiosonde and weather station manufacturers constantly review the atmospheric constants over a geographical location, to enable roughly eighty percent data retrieval.
Keywords: aerosols retention, aerosols loading, statistics, analytical technique
Procedia PDF Downloads 315
25178 Automatic Near-Infrared Image Colorization Using Synthetic Images
Authors: Yoganathan Karthik, Guhanathan Poravi
Abstract:
Colorizing near-infrared (NIR) images poses unique challenges due to the absence of color information and the nuances in light absorption. In this paper, we present an approach to NIR image colorization utilizing a synthetic dataset generated from visible light images. Our method addresses two major challenges encountered in NIR image colorization: accurately colorizing objects with color variations and avoiding over/under saturation in dimly lit scenes. To tackle these challenges, we propose a Generative Adversarial Network (GAN)-based framework that learns to map NIR images to their corresponding colorized versions. The synthetic dataset ensures diverse color representations, enabling the model to effectively handle objects with varying hues and shades. Furthermore, the GAN architecture facilitates the generation of realistic colorizations while preserving the integrity of dimly lit scenes, thus mitigating issues related to over/under saturation. Experimental results on benchmark NIR image datasets demonstrate the efficacy of our approach in producing high-quality colorizations with improved color accuracy and naturalness. Quantitative evaluations and comparative studies validate the superiority of our method over existing techniques, showcasing its robustness and generalization capability across diverse NIR image scenarios. Our research not only contributes to advancing NIR image colorization but also underscores the importance of synthetic datasets and GANs in addressing domain-specific challenges in image processing tasks. The proposed framework holds promise for various applications in remote sensing, medical imaging, and surveillance where accurate color representation of NIR imagery is crucial for analysis and interpretation.
Keywords: computer vision, near-infrared images, automatic image colorization, generative adversarial networks, synthetic data
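The adversarial training signal behind such a GAN framework can be illustrated numerically. The logits below are made-up discriminator scores, not outputs of a trained colorization model; the two quantities are the standard discriminator loss and the non-saturating generator loss.

```python
# Hedged illustration of the GAN objective: the discriminator is
# rewarded for scoring real color images high and generated
# colorizations low; the generator is rewarded for fooling it.
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical discriminator logits for real images and generator outputs.
real_logits = [2.0, 1.5, 2.5]
fake_logits = [-1.0, -0.5, -1.5]

# Discriminator loss: -mean[log D(x) + log(1 - D(G(z)))].
d_loss = -sum(math.log(sigmoid(r)) + math.log(1.0 - sigmoid(f))
              for r, f in zip(real_logits, fake_logits)) / len(real_logits)

# Non-saturating generator loss: -mean[log D(G(z))].
g_loss = -sum(math.log(sigmoid(f)) for f in fake_logits) / len(fake_logits)
```

With fakes confidently rejected, the generator loss is large, which is the gradient signal that pushes its colorizations toward the real-image distribution.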
Procedia PDF Downloads 44
25177 Shape Management Method of Large Structure Based on Octree Space Partitioning
Authors: Gichun Cha, Changgil Lee, Seunghee Park
Abstract:
The objective of this study is to construct a shape management method contributing to the safety of large structures. In Korea, research on shape management is scarce because the technique has only recently been attempted. Terrestrial Laser Scanning (TLS) is used for measurements of large structures. TLS provides an efficient way to actively acquire accurate point clouds of object surfaces or environments. The point clouds provide a basis for rapid modeling in industrial automation, architecture, construction, and the maintenance of civil infrastructure. TLS produces a huge amount of point-cloud data, and registration, extraction, and visualization require the processing of a massive amount of scan data. The octree can be applied to the shape management of large structures because the scan data is reduced in size while its attributes are maintained. Octree space partitioning generates voxels of 3D space, and each voxel is recursively subdivided into eight sub-voxels. The point cloud of the scan data was converted to voxels and sampled. The experimental site is located at Sungkyunkwan University, the scanned structure is a steel-frame bridge, and the TLS used was a Leica ScanStation C10/C5. The scan data was condensed by 92%, and the octree model was constructed at a resolution of 2 millimeters. This study presents octree space partitioning for handling point clouds, creating a basis for the shape management of large structures such as double-deck tunnels, buildings, and bridges. The research is expected to improve the efficiency of structural health monitoring and maintenance. This work is financially supported by the 'U-City Master and Doctor Course Grant Program' and the National Research Foundation of Korea (NRF) grant funded by the Korea government (MSIP) (NRF-2015R1D1A1A01059291).
Keywords: 3D scan data, octree space partitioning, shape management, structural health monitoring, terrestrial laser scanning
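The voxel sampling step described above (point cloud converted to voxels and sampled) can be sketched with a uniform grid at the leaf resolution of the octree. The coordinates and the resulting reduction are illustrative, not the bridge scan.

```python
# Sketch of voxel downsampling: bucket points into 2 mm cells and keep
# one representative (the centroid) per occupied cell. Coordinates in
# meters are made up for the example.

points = [
    (0.0011, 0.0012, 0.0009),   # these two fall in the same voxel
    (0.0015, 0.0013, 0.0008),
    (0.0103, 0.0115, 0.0127),
    (0.0501, 0.0493, 0.0511),   # and so do these two
    (0.0505, 0.0497, 0.0515),
]
voxel = 0.002  # 2 millimeters, the resolution reported in the abstract

buckets = {}
for p in points:
    key = tuple(int(c // voxel) for c in p)   # integer voxel index per axis
    buckets.setdefault(key, []).append(p)

# One centroid per occupied voxel.
sampled = [
    tuple(sum(c) / len(pts) for c in zip(*pts))
    for pts in buckets.values()
]
reduction = 1 - len(sampled) / len(points)
```

A full octree would subdivide each occupied voxel recursively into eight children; the uniform bucketing above corresponds to its leaf level at the reported 2 mm resolution.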
Procedia PDF Downloads 297
25176 Exploiting Kinetic and Kinematic Data to Plot Cyclograms for Managing the Rehabilitation Process of BKAs by Applying Neural Networks
Authors: L. Parisi
Abstract:
Kinematic data correlate vector quantities in space with scalar parameters in time to assess the degree of symmetry between the intact limb and the amputated limb with respect to a normal model derived from the gait of control-group participants. Furthermore, these data allow a doctor to preliminarily evaluate the usefulness of a given rehabilitation therapy. Kinetic curves allow the analysis of ground reaction forces (GRFs) to assess the appropriateness of human motion. Electromyography (EMG) allows the analysis of the fundamental lower-limb force contributions to quantify the level of gait asymmetry. However, the use of this technological tool is expensive and requires the patient's hospitalization. This research work suggests overcoming the above limitations by applying artificial neural networks.
Keywords: kinetics, kinematics, cyclograms, neural networks, transtibial amputation
Procedia PDF Downloads 443
25175 Urbanization and Built Environment: Impacts of Squatter Slums on Degeneration of Urban Built Environment, a Case Study of Karachi
Authors: Mansoor Imam, Amber Afshan, Sumbul Mujeeb, Kamran Gill
Abstract:
An investigative approach has been taken to study the quality of living prevailing in the squatter slums of Karachi city that is influencing urbanization trends and the environmental degeneration of the built environment. The paper identifies the issues and aspects that have directly and indirectly impacted this degeneration, owing to inadequate basic infrastructural amenities, substandard housing, overcrowding, poor ventilation in homes and workplaces, noncompliance with building bye-laws and regulations, etc. Primarily, secondary data have been critically examined and analyzed, including but not limited to census data, demographic and socioeconomic data, official documents, and other relevant secondary data obtained from the existing literature and GIS. It is observed that poor, substandard housing and living quality have serious adverse impacts on the environment and the health of city residents. Hence, strategies for improving the quality of the built environment for sustainable living are mandated. It is, therefore, imperative to check and prevent further degradation and to promote harmonious living and sustainable urbanization.
Keywords: squatter slums, urbanization, degenerations, living quality, built environment
Procedia PDF Downloads 393
25174 Assessment of the Contribution of Geographic Information System Technology in Non Revenue Water: Case Study Dar Es Salaam Water and Sewerage Authority Kawe - Mzimuni Street
Authors: Victor Pesco Kassa
Abstract:
This research deals with the assessment of the contribution of GIS technology to NRW management. The research was conducted in Dar es Salaam, at Kawe Mzimuni Street. Data were collected from an existing source, DAWASA headquarters, and interpreted using ArcGIS software. The collected data reveal good coverage of DAWASA's water network at Mzimuni Street, where most residents are connected to DAWASA's service. The data also revealed that, by using GIS, DAWASA's customer geodatabase has been improved. Through GIS, a customer location map can be prepared for site surveying, showing the different types of customers connected to DAWASA's water service. This is a valuable contribution of GIS technology to addressing and managing the problem of NRW in DAWASA. Finally, the study recommends that the same study be conducted in other DAWASA zones, such as Temeke, Boko, and Bagamoyo, not only at Kawe Mzimuni Street. Through this study, it is observed that ArcGIS software offers powerful tools for managing and processing information geographically in water and sanitation authorities such as DAWASA.
Keywords: DAWASA, NRW, Esri, EURA, ArcGIS
Procedia PDF Downloads 83
25173 Robustified Asymmetric Logistic Regression Model for Global Fish Stock Assessment
Authors: Osamu Komori, Shinto Eguchi, Hiroshi Okamura, Momoko Ichinokawa
Abstract:
Long time-series data on population assessments are essential for global ecosystem assessment because the temporal change of biomass in such a database properly reflects the status of the global ecosystem. However, the available assessment data usually have limited sample sizes, and the ratio of populations with low abundance of biomass (collapsed) to those with high abundance (non-collapsed) is highly imbalanced. To allow for the imbalance and uncertainty involved in the ecological data, we propose a binary regression model with mixed effects for inferring ecosystem status through an asymmetric logistic model. In the estimation equation, we observe that the weights for the non-collapsed populations are relatively reduced, which in turn puts more importance on the small number of observations of collapsed populations. Moreover, we extend the asymmetric logistic regression model using a propensity score to allow for the sample biases observed in the labeled and unlabeled datasets. This robustified the estimation procedure and improved the model fit.
Keywords: double robust estimation, ecological binary data, mixed effect logistic regression model, propensity score
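The weighting idea in the estimation equation (reducing the contribution of the abundant non-collapsed populations) can be illustrated with plain class-weighted logistic regression. This is a deliberate simplification, not the paper's asymmetric link or mixed-effects model, and the biomass covariate and collapse labels are invented.

```python
# Hedged toy version: gradient descent on a class-weighted logistic
# log-loss, where the majority (non-collapsed, y = 0) class is
# down-weighted so the few collapsed populations dominate the fit.
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# x: a single standardized biomass covariate; y = 1 marks collapse.
xs = [-2.0, -1.5, -1.2, -1.0, 0.8, 1.0, 1.3, 1.6, 1.9, 2.2, 2.5, 2.8]
ys = [1, 1, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0]

n_pos = sum(ys)
n_neg = len(ys) - n_pos
# Balanced class weights: the 9 non-collapsed observations each count
# less than the 3 collapsed ones.
w = [len(ys) / (2 * n_pos) if y == 1 else len(ys) / (2 * n_neg) for y in ys]

b0, b1 = 0.0, 0.0   # intercept and slope
lr = 0.1
for _ in range(2000):
    g0 = g1 = 0.0
    for x, y, wi in zip(xs, ys, w):
        err = sigmoid(b0 + b1 * x) - y   # weighted log-loss gradient term
        g0 += wi * err
        g1 += wi * err * x
    b0 -= lr * g0 / len(xs)
    b1 -= lr * g1 / len(xs)
```

The fitted slope is negative (lower biomass means higher collapse probability), and the balanced weights keep the decision boundary between the two groups rather than letting the majority class dominate it.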
Procedia PDF Downloads 267