Search results for: real-time data
23228 ANOVA-Based Feature Selection and Machine Learning System for IoT Anomaly Detection
Authors: Muhammad Ali
Abstract:
Cyber-attack and anomaly detection on Internet of Things (IoT) infrastructure is an emerging concern in the domain of data-driven intrusion detection. Rapidly increasing IoT risk is now making headlines around the world. Denial of service, malicious control, data type probing, malicious operation, DDoS, scan, spying, and wrong setup are attacks and anomalies that can cause an IoT system to fail. Everyone talks about cyber security, connectivity, smart devices, and real-time data extraction. IoT devices expose a wide variety of new cyber security attack vectors in network traffic. For further IoT development, and mainly for smart IoT applications, there is a need for intelligent processing and analysis of data; our approach is to secure such systems. We train and compare several machine learning models that accurately predict attacks and anomalies on IoT systems, using ANOVA-based feature selection so that fewer features are needed to evaluate network traffic and help protect IoT devices. The machine learning (ML) algorithms used here are KNN, SVM, NB, decision tree (DT), and random forest (RF), chosen for satisfactory test accuracy with fast detection. The evaluation uses ML metrics including precision, recall, F1 score, FPR, NPV, geometric mean (GM), MCC, and AUC-ROC. The Random Forest algorithm achieved the best results, with an accuracy of 99.98% and the shortest prediction time.
Keywords: machine learning, analysis of variance, Internet of Things, network security, intrusion detection
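The ANOVA-based feature selection step described above can be sketched as follows. This is a minimal illustration of scoring features with the one-way ANOVA F-statistic, not the authors' implementation, and the toy feature values are invented:

```python
# Minimal sketch of ANOVA-based feature ranking: each feature is scored with
# the one-way ANOVA F-statistic across attack classes, and only the top-k
# features are kept before training a classifier. Toy data are invented.

def anova_f(groups):
    """One-way ANOVA F-statistic for one feature, given one sample list per class."""
    k = len(groups)                       # number of classes
    n = sum(len(g) for g in groups)       # total observations
    grand = sum(sum(g) for g in groups) / n
    # Between-group sum of squares (k - 1 degrees of freedom)
    ssb = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    # Within-group sum of squares (n - k degrees of freedom)
    ssw = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    return (ssb / (k - 1)) / (ssw / (n - k))

def rank_features(X, y, k):
    """Return the indices of the k features with the highest F-statistic."""
    classes = sorted(set(y))
    scores = []
    for j in range(len(X[0])):
        groups = [[row[j] for row, label in zip(X, y) if label == c]
                  for c in classes]
        scores.append((anova_f(groups), j))
    return [j for _, j in sorted(scores, reverse=True)[:k]]

# Feature 0 separates the two classes cleanly; feature 1 is uninformative.
X = [[0.1, 5.0], [0.2, 4.9], [2.1, 5.1], [2.2, 5.0]]
y = [0, 0, 1, 1]
print(rank_features(X, y, 1))  # the discriminative feature is ranked first
```

In practice a library implementation (e.g. an F-test based selector) would replace this hand-rolled scoring, but the ranking principle is the same.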
Procedia PDF Downloads 125
23227 Design of Bidirectional Wavelength Division Multiplexing Passive Optical Network in Optisystem Environment
Authors: Ashiq Hussain, Mahwash Hussain, Zeenat Parveen
Abstract:
Nowadays, the demand for broadband services has increased, and researchers are trying to find solutions to provide a large amount of service. There is a shortage of bandwidth because of video, voice, and data downloads. One solution to overcome this shortage of bandwidth is to provide the communication system with passive optical components. We have increased the data rate in this system. From experimental results, we have concluded that the quality factor increases when passive optical networks are added.
Keywords: WDM-PON, optical fiber, BER, Q-factor, eye diagram
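The quality factor read from an eye diagram maps directly to bit error rate via a standard textbook relation; the following sketch illustrates that background relation and is not code from the study itself:

```python
import math

# Standard relation between the Q-factor read from an eye diagram and the
# bit error rate (BER) of an optical link: BER = 0.5 * erfc(Q / sqrt(2)).
# Textbook background only, not the Optisystem simulation from the paper.

def ber_from_q(q):
    """Bit error rate for a given linear Q-factor."""
    return 0.5 * math.erfc(q / math.sqrt(2))

print(ber_from_q(6))  # Q = 6 corresponds to a BER near 1e-9
```

This is why raising the Q-factor, as reported in the abstract, directly improves link quality.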
Procedia PDF Downloads 509
23226 Analyzing the Social, Cultural and Economic Impacts of Indigenous Tourism on the Indigenous Communities: Case Study of the Nubian Community in Egypt
Authors: M. Makary
Abstract:
Indigenous tourism is nowadays one of the fastest-growing segments of the tourism industry. Nevertheless, it does not yet receive attention on the agenda of public tourism policies in Egypt; however, there are various tourism initiatives in indigenous areas throughout the country, mainly in the Nubia region of Upper Egypt, where most of Egypt's indigenous Nubians are concentrated. Considering that indigenous tourism can lead to both positive and negative impacts on indigenous communities, the main aim of this study is to analyze the socio-cultural and economic impacts of indigenous tourism on indigenous communities in Egypt, taking the Nubians as a case study. Qualitative and quantitative approaches to data collection were designed and applied in conducting this study. Semi-structured interviews, focus groups, and observations are the main primary data collection techniques used, while the secondary data were sourced from articles, statistics, dissertations, and websites. The research concludes that indigenous tourism offers a strong motivation to preserve the identity of indigenous communities and to foster their economic development. However, it also has negative impacts on their society.
Keywords: indigenous tourism, sustainable tourism, indigenous communities, Nubians
Procedia PDF Downloads 245
23225 Risk Analysis of Flood Physical Vulnerability in Residential Areas of Mathare Nairobi, Kenya
Authors: James Kinyua Gitonga, Toshio Fujimi
Abstract:
Vulnerability assessment and analysis is essential to determining the degree of damage and loss resulting from natural disasters. Urban flooding causes major economic loss and casualties in the Mathare residential area of Nairobi, Kenya. High population density caused by rural-urban migration, unemployment, and unplanned urban development are among the factors that increase flood vulnerability in the Mathare area. This study aims to analyse flood risk physical vulnerabilities in Mathare based on scientific data: rainfall data, Mathare River discharge rate data, water runoff data, field survey data, and a questionnaire survey based on sampling of the study area were used to develop the risk curves. Three structural types of building were identified in the study area, and vulnerability and risk curves were produced for each type by plotting the relationship between flood depth and damage. The results indicate that the structural type with mud walls and a mud floor is the most vulnerable to flooding, while the structural type with stone walls and a concrete floor is the least vulnerable. The vulnerability of building contents is mainly determined by the number of floors: households with two floors are least vulnerable, and households with one floor are most vulnerable. Therefore, more than 80% of the residential buildings, including the property inside them, are highly vulnerable to floods and consequently exposed to high risk. When estimating potential casualties and injuries, we found that structural type was a major determinant: the mud/adobe structural type had casualties of 83.7% of the people living in those houses, while the masonry structural type had casualties of 10.71%.
This research concludes that flood awareness, warnings, and observance of building codes will help reduce damage to all structural types of building, reduce deaths, and reduce damage to building contents.
Keywords: flood loss, Mathare Nairobi, risk curve analysis, vulnerability
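A risk curve of the kind described, relating flood depth to a damage ratio for one structural type, can be evaluated by piecewise-linear interpolation. The curve points below are hypothetical illustrations, not the study's survey data:

```python
# Hypothetical depth-damage (vulnerability) curve for one structural type,
# evaluated by piecewise-linear interpolation. The (depth_m, damage_ratio)
# points are illustrative only, not values from the Mathare survey.

def damage_ratio(depth, curve):
    """Interpolate the damage ratio for a flood depth from a sorted curve."""
    if depth <= curve[0][0]:
        return curve[0][1]
    if depth >= curve[-1][0]:
        return curve[-1][1]
    for (d0, r0), (d1, r1) in zip(curve, curve[1:]):
        if d0 <= depth <= d1:
            return r0 + (r1 - r0) * (depth - d0) / (d1 - d0)

# Invented curve for a mud-wall, mud-floor dwelling: full loss by 2 m depth.
mud_wall_curve = [(0.0, 0.0), (0.5, 0.4), (1.0, 0.8), (2.0, 1.0)]
print(damage_ratio(0.75, mud_wall_curve))  # halfway between 0.4 and 0.8
```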
Procedia PDF Downloads 239
23224 The Happiness Pulse: A Measure of Individual Wellbeing at a City Scale, Development and Validation
Authors: Rosemary Hiscock, Clive Sabel, David Manley, Sam Wren-Lewis
Abstract:
As part of the Happy City Index Project, Happy City have developed a survey instrument to measure experienced wellbeing: how people are feeling and functioning in their everyday lives. The survey instrument, called the Happiness Pulse, was developed in partnership with the New Economics Foundation (NEF) with the dual aim of collecting citywide wellbeing data and engaging individuals and communities in the measurement and promotion of their own wellbeing. The survey domains and items were selected through a review of the academic literature and a stakeholder engagement process, including local policymakers, community organisations and individuals. The Happiness Pulse was included in the Bristol pilot of the Happy City Index (n=722). The experienced wellbeing items were subjected to factor analysis. A reduced set of items, to be included in a revised scale for future data collection, was again entered into a factor analysis, and the revised factors were tested for reliability and validity. Among the items retained for the revised scale, three factors emerged: Be, Do and Connect. The Be factor had good reliability and good convergent and criterion validity. The Do factor had good discriminant validity. The Connect factor had adequate reliability and good discriminant and criterion validity. Some age, gender and socioeconomic differentiation was found. The properties of a new scale to measure experienced wellbeing, intended for use by municipal authorities, are described. Happiness Pulse data can be combined with local data on wellbeing conditions to determine what matters for people's wellbeing across a city and why.
Keywords: city wellbeing, community wellbeing, engaging individuals and communities, measuring wellbeing and happiness
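Reliability testing of survey factors such as Be, Do and Connect is typically done with Cronbach's alpha. The sketch below shows that standard statistic under invented item scores; the abstract does not state which reliability coefficient was used, so this is an assumed method:

```python
# Minimal Cronbach's alpha, a common reliability statistic for survey scales.
# The method is a standard assumption here, and the scores are invented,
# not Happiness Pulse data.

def cronbach_alpha(items):
    """items: one score list per survey item, aligned across respondents."""
    k = len(items)

    def variance(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    totals = [sum(scores) for scores in zip(*items)]  # per-respondent totals
    return k / (k - 1) * (1 - sum(variance(i) for i in items) / variance(totals))

# Two perfectly consistent items give alpha = 1.0.
print(cronbach_alpha([[1, 2, 3, 4], [1, 2, 3, 4]]))
```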
Procedia PDF Downloads 261
23223 Evaluation of Deformation for Deep Excavations in the Greater Vancouver Area Through Case Studies
Authors: Boris Kolev, Matt Kokan, Mohammad Deriszadeh, Farshid Bateni
Abstract:
Due to the increasing demand for real estate and the need for efficient land utilization in Greater Vancouver, developers have been increasingly considering the construction of high-rise structures with multiple below-grade parking levels. The temporary excavations required to allow for the construction of underground levels have recently reached up to 40 meters in depth. One of the challenges with deep excavations is the prediction of wall displacements and ground settlements, due to their effect on the integrity of city utilities, infrastructure, and adjacent buildings. A large database of survey monitoring data has been collected for deep excavations in various soil conditions and shoring systems. The majority of the data collected is for tie-back anchor and shotcrete lagging systems. The data were categorized and analyzed, and the results were evaluated to find a relationship between the most dominant parameters controlling the displacement, such as depth of excavation, soil properties, and tie-back anchor loading and arrangement. For a select number of deep excavations, finite element modeling was used for analysis. The lateral displacements from the simulation results were compared to the recorded survey monitoring data. The study concludes with a discussion and comparison of the available empirical and numerical modeling methodologies for evaluating lateral displacements in deep excavations.
Keywords: deep excavations, lateral displacements, numerical modeling, shoring walls, tieback anchors
Procedia PDF Downloads 182
23222 Predicting Stem Borer Density in Maize Using RapidEye Data and Generalized Linear Models
Authors: Elfatih M. Abdel-Rahman, Tobias Landmann, Richard Kyalo, George Ong’amo, Bruno Le Ru
Abstract:
Maize (Zea mays L.) is a major staple food crop in Africa, particularly in the eastern region of the continent. The maize growing area in Africa spans over 25 million ha, and 84% of rural households in Africa cultivate maize, mainly as a means to generate food and income. Average maize yields in Sub-Saharan Africa are 1.4 t/ha, compared to a global average of 2.5–3.9 t/ha, due to biotic and abiotic constraints. Amongst the biotic production constraints in Africa, stem borers are the most injurious. In East Africa, yield losses due to stem borers are currently estimated at between 12% and 40% of total production. The objective of the present study was therefore to predict stem borer larvae density in maize fields using RapidEye reflectance data and generalized linear models (GLMs). RapidEye images were captured for a test site in Kenya (Machakos) in January and in February 2015. Stem borer larva counts were modeled using GLMs assuming Poisson (Po) and negative binomial (NB) error distributions with a logarithmic link. Root mean square error (RMSE) and ratio of prediction to deviation (RPD) statistics were employed to assess model performance using a leave-one-out cross-validation approach. Results showed that NB models outperformed Po ones at all study sites. RMSE and RPD ranged between 0.95 and 2.70, and between 2.39 and 6.81, respectively. Overall, all models performed similarly with the January and the February image data. We conclude that RapidEye reflectance data can be used to estimate stem borer larvae density. The developed models could improve decision making regarding the control of maize stem borers using various integrated pest management (IPM) protocols.
Keywords: maize, stem borers, density, RapidEye, GLM
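The RMSE and RPD assessment statistics can be computed from the leave-one-out predictions as below. This sketch assumes RPD is the standard deviation of the observations divided by the RMSE (a common definition); the larvae counts are invented:

```python
import math

# Sketch of the RMSE and RPD model-assessment statistics. RPD is assumed
# here to be the sample standard deviation of the observed counts divided
# by the RMSE of the predictions; the counts below are invented.

def rmse(observed, predicted):
    """Root mean square error of predictions against observations."""
    return math.sqrt(sum((o - p) ** 2 for o, p in zip(observed, predicted))
                     / len(observed))

def rpd(observed, predicted):
    """Ratio of (sample) standard deviation of observations to RMSE."""
    mean = sum(observed) / len(observed)
    sd = math.sqrt(sum((o - mean) ** 2 for o in observed)
                   / (len(observed) - 1))
    return sd / rmse(observed, predicted)

obs = [1, 2, 3, 4]            # observed larvae counts per plot (invented)
pred = [1.5, 2.5, 2.5, 3.5]   # leave-one-out model predictions (invented)
print(round(rmse(obs, pred), 3), round(rpd(obs, pred), 3))
```

A higher RPD means the model explains more of the spread in the observations, which is why the NB models' RPD range of 2.39 to 6.81 indicates useful predictive skill.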
Procedia PDF Downloads 497
23221 Social Construction of Gender: Comparison of Gender Stereotypes among Bureaucrats and Non- Bureaucrats
Authors: Arshad Ali
Abstract:
This study aims to highlight comparative patterns in the social construction of gender among bureaucrats and non-bureaucrats. For the purpose of this study, a purposive sample of 8 respondents, including both male and female bureaucrats and non-bureaucrats, was collected from Gujranwala and Lahore. The measures for collecting data included an indigenous demographic information sheet and an interview protocol covering gender roles, the social construction of gender, and managerial performance. The collected data were analyzed using NVivo version 11, and the analysis reveals diverse perceptions of male and female stereotyping among bureaucrats and non-bureaucrats, as different kinds of social environments lead to the modification of stereotypes. The research contributes to gender studies, specifically in the context of Pakistani society. Very few studies are available, and empirical data about gender construction is scanty, so the study provides an impetus for future research. It is suggested that future research explore the phenomenon at a larger scale, including more respondents and other dimensions, keeping in view socio-economic factors and government policies regarding the elimination of gender discrimination in Pakistan.
Keywords: social construction, gender, bureaucrats, gender perception
Procedia PDF Downloads 75
23220 Secure E-Voting Using Blockchain Technology
Authors: Barkha Ramteke, Sonali Ridhorkar
Abstract:
An election is an important event in all countries. Traditional voting has several drawbacks, including the time and effort required for tallying and counting results, and the cost of the papers, arrangements, and everything else required to complete a voting process. Many countries are now considering online e-voting systems, but traditional e-voting systems suffer from a lack of trust: it is not known whether a vote has been counted correctly or tampered with. A lack of transparency means that voters have no assurance that their votes will be counted as cast. As blockchain technology grows in popularity, electronic voting systems are increasingly using it as an underlying storage mechanism to make the voting process more transparent and to assure data immutability. The transparency, on the other hand, may reveal critical information about candidates, because all system users have the same entitlement to the data. Furthermore, because of blockchain's pseudo-anonymity, voters' privacy may be compromised, and third parties involved in the voting process, such as registration institutions, may be able to tamper with data. To overcome these difficulties, we apply Ethereum smart contracts to blockchain-based voting systems.
Keywords: blockchain, AMV chain, electronic voting, decentralized
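The immutability that blockchain brings to vote storage can be illustrated with a toy hash-chained ledger. This is a conceptual Python sketch only, not the Ethereum smart-contract code the paper applies:

```python
import hashlib
import json

# Toy append-only ledger: each block stores the hash of the previous block,
# so altering any recorded vote breaks the chain. Conceptual only; a real
# system would use Ethereum smart contracts as described in the abstract.

class VoteLedger:
    def __init__(self):
        self.blocks = [{"prev": "0" * 64, "vote": None}]  # genesis block

    @staticmethod
    def _digest(block):
        return hashlib.sha256(
            json.dumps(block, sort_keys=True).encode()).hexdigest()

    def cast(self, vote):
        self.blocks.append({"prev": self._digest(self.blocks[-1]),
                            "vote": vote})

    def verify(self):
        """True only if every block still matches its successor's link hash."""
        return all(self.blocks[i]["prev"] == self._digest(self.blocks[i - 1])
                   for i in range(1, len(self.blocks)))

ledger = VoteLedger()
ledger.cast("candidate A")
ledger.cast("candidate B")
print(ledger.verify())                     # chain is intact
ledger.blocks[1]["vote"] = "candidate B"   # tampering attempt
print(ledger.verify())                     # tampering is detected
```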
Procedia PDF Downloads 138
23219 The Evaluation and Performance of SSRU Employee’s that Influence the Attitude towards Work, Job Satisfaction and Organization Commitment
Authors: Bella Llego
Abstract:
The purpose of this study was to explain and empirically test the influence of attitude towards work, job satisfaction, and organizational commitment on the evaluation and performance of SSRU employees. The data used in this study were primary data collected through an Organizational Commitment Questionnaire with a 1-5 Likert scale. The respondents were 200 managerial and non-managerial staff of SSRU. The data were analyzed using descriptive statistics (mean and standard deviation), and hypotheses were tested using multiple regression. The results show that attitude towards work has a positive but not significant effect on job satisfaction and on employee evaluation and performance. In contrast, organizational commitment has a positive and significant influence on job satisfaction and employee performance at SSRU. This means that every improvement in organizational commitment has a positive effect on job satisfaction and on employee evaluation and performance at SSRU.
Keywords: attitude towards work, employee evaluation and performance, job satisfaction, organization commitment
Procedia PDF Downloads 454
23218 Data Rate Based Grouping Scheme for Cooperative Communications in Wireless LANs
Authors: Sunmyeng Kim
Abstract:
IEEE 802.11a/b/g standards provide multiple transmission rates, which can be changed dynamically according to the channel condition. Cooperative communications were introduced to improve the overall performance of wireless LANs with the help of relay nodes with higher transmission rates. Cooperative communications are based on the fact that transmission is much faster when sending data packets to a destination node through a relay node with a higher transmission rate, rather than sending data directly to the destination node at a low transmission rate. To apply cooperative communications in wireless LANs, several MAC protocols have been proposed. Some of them can result in collisions among relay nodes in a dense network. In order to solve this problem, we propose a new protocol in which relay nodes are grouped based on their transmission rates, and only relay nodes in the highest-rate group attempt channel access. Performance evaluation is conducted using simulation and shows that the proposed protocol significantly outperforms the previous protocol in terms of throughput and collision probability.
Keywords: cooperative communications, MAC protocol, relay node, WLAN
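The core observation, that two fast hops can beat one slow direct link, is easy to sketch. The rates below are illustrative 802.11b values (1 and 11 Mb/s), not figures from the paper's simulation:

```python
# Sketch of the cooperative-relaying trade-off: sending a unit-size packet
# over two fast hops (source -> relay -> destination) versus one slow direct
# link. Rates are in Mb/s; the figures are illustrative 802.11b rates.

def airtime_direct(rate):
    """Airtime for a unit-size packet sent directly."""
    return 1.0 / rate

def airtime_relayed(src_relay_rate, relay_dst_rate):
    """Airtime for a unit-size packet sent over two hops via a relay."""
    return 1.0 / src_relay_rate + 1.0 / relay_dst_rate

def relay_helps(direct_rate, src_relay_rate, relay_dst_rate):
    """True if relaying transmits the packet in less total airtime."""
    return (airtime_relayed(src_relay_rate, relay_dst_rate)
            < airtime_direct(direct_rate))

print(relay_helps(1, 11, 11))   # 2/11 s beats 1 s: relaying wins
print(relay_helps(11, 11, 11))  # relaying cannot beat an equal direct rate
```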
Procedia PDF Downloads 466
23217 Investigating the Relationship Between Corporate Governance and Financial Performance Considering the Moderating Role of Opinion and Internal Control Weakness
Authors: Fatemeh Norouzi
Abstract:
Today, financial performance has become one of the important issues in accounting and auditing; companies and their managers pay attention to it and, for this reason, to the variables that influence it. One of the factors that can affect financial performance is corporate governance, which is examined in this research, although auditing-related factors can also moderate this relationship. This research was therefore conducted with the aim of investigating the relationship between corporate governance and financial performance, considering the moderating role of opinion and internal control weakness. The research is applied in terms of purpose and, in terms of method, follows an ex post facto descriptive design in which stock market data were analyzed. Data were collected from the website of the Iraqi Stock Exchange; the statistical population of this research is all companies admitted to the Iraqi Stock Exchange. The statistical sample covers 2014 to 2021 and includes 34 companies. Four different models were considered for the eight research hypotheses. The analysis was done using EXCEL and STATA 15 software, applying a collinearity test, an integration test, determination of fixed effects, and correlation matrix results. The research results showed that the first four hypotheses were rejected and the second four hypotheses were confirmed.
Keywords: size of the board of directors, duality of the CEO, financial performance, internal control weakness
Procedia PDF Downloads 88
23216 Problems of Boolean Reasoning Based Biclustering Parallelization
Authors: Marcin Michalak
Abstract:
Biclustering is a method of two-dimensional data analysis. In recent years it has become possible to express this problem in terms of Boolean reasoning, for processing continuous, discrete, and binary data. The mathematical background of the approach, namely the proven ability to induce exact and inclusion-maximal biclusters fulfilling assumed criteria, is a strong advantage of the method. Unfortunately, the core of the method has quite high computational complexity. In this paper the basics of the Boolean reasoning approach to biclustering are presented, and in that context the problems of parallelizing the computation are raised.
Keywords: Boolean reasoning, biclustering, parallelization, prime implicant
Procedia PDF Downloads 125
23215 Assessment of Quality of Drinking Water in Residential Houses of Kuwait by Using GIS Method
Authors: Huda Aljabi
Abstract:
The existence of heavy metals such as cadmium, arsenic, lead, and mercury in drinking water can be a threat to public health, so the concentrations of these heavy metals in drinking water are of particular importance. The National Primary Drinking Water Regulations have set limits for the concentrations of these elements in drinking water because of their toxicity. Furthermore, bromate formed during the disinfection of drinking water by ozonation can also be a health hazard. The paper proposed here will concentrate on the compilation of all available data and information on the presence of trace metals and bromate in the drinking water of residential houses distributed over different areas of Kuwait. New data will also be collected through sampling of drinking water at some of the residential houses in different areas of Kuwait and analysis of the samples for trace metals and bromate. The collected data will be presented on maps showing the distribution of these metals and bromate in the drinking water of Kuwait. Correlation among different chemical parameters will also be investigated using the GRAPHER software. This will help both the Ministry of Electricity and Water (MEW) and the Ministry of Health (MOH) in taking corrective measures and in planning infrastructure activities for the future.
Keywords: bromate, ozonation, GIS, heavy metals
Procedia PDF Downloads 176
23214 Comparison of Slope Data between Google Earth and the Digital Terrain Model, for Registration in Car
Authors: André Felipe Gimenez, Flávia Alessandra Ribeiro da Silva, Roberto Saverio Souza Costa
Abstract:
Currently, rural producers have been facing problems regarding environmental regularization, which is precisely why the CAR (Rural Environmental Registry) was created. The CAR is an electronic registry for rural properties that records legal reserve areas, permanent preservation areas, areas of limited use, stable areas, and forests and remnants of native vegetation for all rural properties in Brazil. The objective of this work was to evaluate and compare altimetry and slope data from Google Earth with a digital terrain model (MDT) generated by aerial photogrammetry, on three steeply sloping plots, for the purpose of declaration in the CAR. This work is justified in such areas because rural landowners have doubts about the reliability of the free software Google Earth for diagnosing inclinations greater than 25 degrees, the limit set by federal law 12651/2012. In addition, the literature lacks this type of study for the purpose of CAR declaration. The results showed that when comparing the drone altimetry data with the Google Earth image data in areas of high slope (above 40% declivity), Google Earth underestimated the real terrain slope values. It is therefore concluded that Google Earth is not reliable for diagnosing areas with an inclination greater than 25 degrees (46% declivity) for the purpose of declaration in the CAR, making a local topographic survey essential.
Keywords: MDT, drone, RPA, SiCar, photogrammetry
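The 25-degree threshold from federal law 12651/2012 converts to percent declivity with a tangent; a one-line check of the 46% figure quoted above:

```python
import math

# Converts a slope in degrees to percent declivity (rise over run * 100),
# confirming that the legal 25-degree limit is roughly the 46% quoted above.

def slope_percent(degrees):
    return math.tan(math.radians(degrees)) * 100

print(round(slope_percent(25), 1))  # about 46.6 percent
```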
Procedia PDF Downloads 131
23213 Understanding and Political Participation in Constitutional Monarchy of Dusit District Residents
Authors: Sudaporn Arundee
Abstract:
The purposes of this research were (1) to study understanding of and participation in the constitutional monarchy and (2) to study the level of participation. This paper drew upon data collected from 395 Dusit residents using a questionnaire, with simple random sampling utilized to collect the data. The findings revealed that 94 percent of respondents had a very good understanding of the constitutional monarchy, with a mean of 4.8. However, the respondents overall had a very low level of participation, with a mean score of 1.69 and a standard deviation of .719.
Keywords: political participation, constitutional monarchy, management and social sciences
Procedia PDF Downloads 251
23212 Low Cost LiDAR-GNSS-UAV Technology Development for PT Garam’s Three Dimensional Stockpile Modeling Needs
Authors: Mohkammad Nur Cahyadi, Imam Wahyu Farid, Ronny Mardianto, Agung Budi Cahyono, Eko Yuli Handoko, Daud Wahyu Imani, Arizal Bawazir, Luki Adi Triawan
Abstract:
Unmanned aerial vehicle (UAV) technology has cost-efficiency and data-retrieval-time advantages. Technologies such as UAV, GNSS, and LiDAR can be combined into a single system in which each covers the others' deficiencies. This integrated system aims to increase the accuracy of calculating the volume of the land salt stockpiles of PT. Garam (Salt Company). UAV applications are used to obtain geometric data and capture textures that characterize the structure of objects. This study uses the Taror 650 Iron Man drone with four propellers, which can fly for 15 minutes. The acquired imagery is processed in software utilizing photogrammetry principles and Structure from Motion (SfM) point cloud technology. LiDAR enables data acquisition for the creation of point clouds, three-dimensional models, digital surface models, contours, and orthomosaics with high accuracy. A drawback of LiDAR is that its coordinate positions have only local references. Therefore, the researchers use GNSS, LiDAR, and drone multi-sensor technology to map the salt stockpiles on open land and in warehouses, which PT. Garam carries out twice a year; the previous process used terrestrial methods and manual calculations with sacks. LiDAR needs to be combined with a UAV to overcome data acquisition limitations, because on its own it only passes along the right and left sides of the object, particularly when applied to a salt stockpile. The UAV is flown to assist data acquisition with wide coverage, with the help of the integrated 200-gram LiDAR system, so that an optimal flying angle can be maintained during the flight. Using LiDAR for low-cost mapping surveys will make it easier for surveyors and academics to obtain fairly accurate data at a more economical price: as a survey tool, LiDAR is a low-priced instrument, around 999 USD, that can produce detailed data.
Therefore, to minimize operational costs, surveyors can use low-cost LiDAR, GNSS, and UAV at a price of around 638 USD. The data generated by this sensor take the form of a three-dimensional visualization of an object's shape. This study aims to combine low-cost GPS measurements with low-cost LiDAR, processed using free software. The low-cost GPS generates latitude and longitude coordinates, and the resulting X, Y, and Z values help georeference the detected objects. The research also produces LiDAR data that capture objects, including the height of the entire environment at the location. The results are calibrated with pitch, roll, and yaw to obtain the vertical height of the existing contours. The experiment was conducted on the roof of a building with a radius of approximately 30 meters.
Keywords: LiDAR, unmanned aerial vehicle, low-cost GNSS, contour
Procedia PDF Downloads 95
23211 A Proposal for a Secure and Interoperable Data Framework for Energy Digitalization
Authors: Hebberly Ahatlan
Abstract:
The process of digitizing energy systems involves transforming traditional energy infrastructure into interconnected, data-driven systems that enhance efficiency, sustainability, and responsiveness. As smart grids become increasingly integral to the efficient distribution and management of electricity from both fossil and renewable energy sources, the energy industry faces strategic challenges associated with digitalization and interoperability, particularly in the context of modern energy business models such as virtual power plants (VPPs). The critical challenge in modern smart grids is to seamlessly integrate diverse technologies and systems, including virtualization, grid computing, and service-oriented architecture (SOA), across the entire energy ecosystem. Achieving this requires addressing issues like semantic interoperability, IT/OT convergence, and digital asset scalability, all while ensuring security and risk management. This paper proposes a four-layer digitalization framework to tackle these challenges, encompassing persistent data protection, trusted key management, secure messaging, and authentication of IoT resources. Data assets generated through this framework enable AI systems to derive insights for improving smart grid operations, security, and revenue generation. Furthermore, this paper also proposes a Trusted Energy Interoperability Alliance as a universal guiding standard in the development of this digitalization framework to support more dynamic and interoperable energy markets.
Keywords: digitalization, IT/OT convergence, semantic interoperability, VPP, energy blockchain
Procedia PDF Downloads 183
23210 Dynamic Compensation for Environmental Temperature Variation in the Coolant Refrigeration Cycle as a Means of Increasing Machine-Tool Precision
Authors: Robbie C. Murchison, Ibrahim Küçükdemiral, Andrew Cowell
Abstract:
Thermal effects are the largest source of dimensional error in precision machining, and a major proportion is caused by ambient temperature variation. The use of coolant is a primary means of mitigating these effects, but there has been limited work on coolant temperature control. This research critically explored whether CNC-machine coolant refrigeration systems adapted to actively compensate for ambient temperature variation could increase machining accuracy. Accuracy data were collected from operators' checklists for a CNC 5-axis mill and statistically reduced to bias and precision metrics, one observation per day over a sample period of 27 days. Temperature data were collected using three USB dataloggers placed in ambient air, the chiller inflow, and the chiller outflow. The accuracy and temperature data were analysed using Pearson correlation, and the thermodynamics of the system were then described using system identification in MATLAB. It was found that 75% of thermal error is reflected in the hot coolant temperature, but that this is negligibly dependent on ambient temperature. The effect of the coolant refrigeration process on hot coolant outflow temperature was also found to be negligible. Therefore, the evidence indicated that it would not be beneficial to adapt coolant chillers to compensate for ambient temperature variation. However, it is concluded that hot coolant outflow temperature is a robust and accessible source of thermal error data which could be used for prevention strategy evaluation or as the basis of other thermal error strategies.
Keywords: CNC manufacturing, machine-tool, precision machining, thermal error
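The Pearson correlation step used to relate the temperature logs to the accuracy metrics can be sketched as follows. The temperature readings here are invented for illustration, not the collected datalogger data:

```python
import math

# Minimal Pearson correlation coefficient, as used to relate the ambient,
# chiller-inflow and chiller-outflow temperature logs to the machining
# accuracy metrics. The readings below are invented for illustration.

def pearson_r(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

ambient = [18.0, 19.5, 21.0, 22.5]       # ambient air, degrees C (invented)
coolant_out = [24.1, 24.2, 24.1, 24.3]   # hot coolant outflow, degrees C
print(round(pearson_r(ambient, ambient), 3))      # self-correlation is 1.0
print(round(pearson_r(ambient, coolant_out), 3))  # weaker association
```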
Procedia PDF Downloads 89
23209 The First Transcriptome Assembly of Marama Bean: An African Orphan Crop
Authors: Ethel E. Phiri, Lionel Hartzenberg, Percy Chimwamuromba, Emmanuel Nepolo, Jens Kossmann, James R. Lloyd
Abstract:
Orphan crops are under-researched and under-utilized food plant species that have not been categorized as major food crops but have the potential to be economically and agronomically significant. They have been documented to tolerate extreme environmental conditions; however, limited research has been conducted to uncover their potential as food crop species. The New Partnership for Africa's Development (NEPAD) has classified the Marama bean, Tylosema esculentum, as an orphan crop. The plant is one of the 101 African orphan crops that are to have their genomes sequenced, assembled, and annotated in the foreseeable future. The Marama bean is a perennial leguminous plant that primarily grows in poor, arid soils in southern Africa. The plants produce large tubers that can weigh as much as 200 kg. While the foliage provides fodder, the tuber is carbohydrate-rich and is a staple food source for rural communities in Namibia, and the edible seeds are protein- and oil-rich. Marama bean plants respond rapidly to increased temperatures and severe water scarcity without extreme consequences. Advances in molecular biology and biotechnology have made it possible to effectively transfer technologies from model and major crops to orphan crops. In this research, the aim was to assemble the first transcriptomic analysis of Marama bean RNA-sequence data. Many model plant species have had their genomes sequenced and their transcriptomes assembled, so the availability of transcriptome data for a non-model crop plant species will allow for gene identification and comparisons between various species. The data were sequenced using the Illumina HiSeq 2500 sequencing platform, and data analysis is underway. In essence, this research will eventually evaluate the potential use of the Marama bean as a crop species to improve its value in agronomy.
Keywords: 101 African orphan crops, RNA-Seq, Tylosema esculentum, underutilised crop plants
Procedia PDF Downloads 360
23208 Sociocultural Foundations of Psychological Well-Being among Ethiopian Adults
Authors: Kassahun Tilahun
Abstract:
Most of the studies available on adult psychological well-being have centered on Western countries. However, psychological well-being does not have the same meaning across the world: Euro-American and African conceptions and experiences of psychological well-being differ systematically. As a result, questions such as how people living in developing African countries like Ethiopia report their psychological well-being, and what the context-specific determinants of their psychological well-being are, need definitive answers. This study, therefore, aimed at developing a new theory that would address these sociocultural issues of psychological well-being. Data were obtained through interviews and an open-ended questionnaire. A total of 438 adults, working in governmental and non-governmental organizations situated in Addis Ababa, participated in the study. An appropriate qualitative method of data analysis, thematic content analysis, was employed. The thematic analysis involved a type of abductive analysis, driven both by theoretical interest and by the nature of the data. Reliability and credibility issues were addressed appropriately. The findings identified five major categories of themes viewed as essential in determining the conceptions and experiences of psychological well-being of Ethiopian adults: sociocultural harmony, social cohesion, security, competence and accomplishment, and the self. The rationale for including these themes is discussed in detail, and appropriate positive psychology interventions are proposed. Researchers are also encouraged to expand this qualitative research and, in turn, develop a suitable instrument tapping the psychological well-being of adults with different sociocultural orientations.
Keywords: sociocultural, psychological well-being, Ethiopia, adults
Procedia PDF Downloads 546
23207 Thick Data Techniques for Identifying Abnormality in Video Frames for Wireless Capsule Endoscopy
Authors: Jinan Fiaidhi, Sabah Mohammed, Petros Zezos
Abstract:
Capsule endoscopy (CE) is an established noninvasive diagnostic modality for investigating small bowel disease. CE has a pivotal role in assessing patients with suspected bleeding or in identifying evidence of active Crohn's disease in the small bowel. However, CE produces lengthy videos of at least eighty thousand frames, recorded at 2 frames per second. Gastroenterologists cannot dedicate 8 to 15 hours to reading the CE video frames to arrive at a diagnosis, so analyzing CE videos with modern artificial intelligence techniques has become a necessity. However, machine learning, including deep learning, has failed to report robust results because of the lack of large samples to train its neural nets. In this paper, we describe a thick data approach that learns from a few anchor images. We use sound datasets like KVASIR and CrohnIPI to filter candidate frames that include interesting anomalies in any CE video. We identify candidate frames based on feature extraction, deriving representative measures of the anomaly, such as its size and its color contrast with the image background, and later feed these features to a decision tree that can classify the candidate frames as showing a condition such as Crohn's disease. Our thick data approach detected Crohn's disease, based on the presence of ulcerated areas in the candidate frames, with an accuracy of 89.9% on KVASIR and 83.3% on CrohnIPI. We are continuing our research to fine-tune our approach by adding more thick data methods to enhance diagnostic accuracy.
Keywords: thick data analytics, capsule endoscopy, Crohn’s disease, siamese neural network, decision tree
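The feature-to-classifier step described in the abstract can be sketched in a few lines. The two features (anomaly size and color contrast) follow the abstract, but the values, ranges, and class labels below are synthetic placeholders, not KVASIR or CrohnIPI data:

```python
# Sketch of the described pipeline: hand-crafted frame features fed to a
# decision tree. Feature values are synthetic, not capsule endoscopy data.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)

# Each row: [anomaly_area_fraction, color_contrast_vs_background]
normal = rng.uniform([0.00, 0.0], [0.02, 0.2], size=(50, 2))
ulcer = rng.uniform([0.05, 0.3], [0.30, 1.0], size=(50, 2))
X = np.vstack([normal, ulcer])
y = np.array([0] * 50 + [1] * 50)  # 0 = normal frame, 1 = candidate anomaly

clf = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
print(clf.predict([[0.15, 0.8]])[0])  # large, high-contrast region -> 1
```

On such a small feature vector the tree remains interpretable, which fits the thick-data goal of learning from few anchor images rather than training a deep network.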
Procedia PDF Downloads 156
23206 Landsat Data from Pre Crop Season to Estimate the Area to Be Planted with Summer Crops
Authors: Valdir Moura, Raniele dos Anjos de Souza, Fernando Gomes de Souza, Jose Vagner da Silva, Jerry Adriani Johann
Abstract:
The area of land to be planted with annual crops and its stratification by municipality are important variables in crop forecasting. Nowadays in Brazil, this information is obtained by the Brazilian Institute of Geography and Statistics (IBGE) and published in the report Assessment of the Agricultural Production. Due to the high cloud cover in the main crop growing season (October to March), it is difficult to acquire good orbital images, so one alternative is to work with remote sensing data from dates before the growing season. This work presents the use of multitemporal Landsat data gathered in July and September (before the summer growing season) to estimate the area of land to be planted with summer crops in an area of São Paulo State, Brazil. Geographic Information Systems (GIS) and digital image processing techniques were applied to the available data. Supervised and unsupervised classifications were used for data in digital number and reflectance formats and for the multitemporal Normalized Difference Vegetation Index (NDVI) images. The objective was to discriminate the tracts with a higher probability of being planted with summer crops. Classification accuracies were evaluated using a sampling system developed specifically for this study region, and the estimated areas were corrected using the error matrix derived from these evaluations. The classification techniques reached an excellent agreement level according to the kappa index. The proportion of crops stratified by municipality was derived from field work during the growing season, and these proportion coefficients were applied to the area of land to be planted with summer crops (derived from the Landsat data). Thus, it was possible to derive the area of each summer crop by municipality. The discrepancies between official statistics and our results were attributed to the sampling and stratification procedures. Nevertheless, this methodology can be improved to provide good crop area estimates using remote sensing data, despite the cloud cover during the growing season.
Keywords: area intended for summer culture, estimated area planted, agriculture, Landsat, planting schedule
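The NDVI used for the multitemporal images is a standard band ratio, NDVI = (NIR − Red) / (NIR + Red). A minimal per-pixel sketch follows; the reflectance values are synthetic, not the São Paulo scenes, and the band assignment (e.g., TM band 4 = NIR, band 3 = red) is an assumption about the sensor used:

```python
# NDVI per pixel from red and near-infrared reflectance arrays.
import numpy as np

def ndvi(nir, red):
    """NDVI = (NIR - Red) / (NIR + Red); zero where both bands are empty."""
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    denom = nir + red
    out = np.zeros_like(denom)
    np.divide(nir - red, denom, out=out, where=denom != 0)  # guard /0
    return out

red = np.array([0.40, 0.10])  # bare soil reflects more red than vegetation
nir = np.array([0.50, 0.50])
print(np.round(ndvi(nir, red), 3))  # -> [0.111 0.667]
```

Vegetated pixels push NDVI toward 1 while bare soil stays near 0, which is what allows pre-season images to flag tracts likely to be planted.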
Procedia PDF Downloads 150
23205 Modified Naive Bayes-Based Prediction Modeling for Crop Yield Prediction
Authors: Kefaya Qaddoum
Abstract:
Most greenhouse growers desire a predictable amount of yield in order to accurately meet market requirements. The purpose of this paper is to model a simple but often satisfactory supervised classification method. The original naive Bayes classifier has a serious weakness: it retains redundant predictors. In this paper, a regularization technique is used to obtain a computationally efficient classifier based on naive Bayes. The suggested construction uses an L1 penalty that is capable of clearing out redundant predictors, and a modification of the LARS algorithm is devised to solve the resulting problem, making the method applicable to a wide range of data. In the experimental section, a study was conducted to examine the effect of redundant and irrelevant predictors, and the method was tested on a WSG data set of tomato yields, in which there are many more predictors than data points; the pressing need to predict weekly yield motivates this approach. Finally, the modified approach is compared with several naive Bayes variants and other classification algorithms (SVM and kNN) and is shown to be fairly good.
Keywords: tomato yield prediction, naive Bayes, redundancy, WSG
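The paper's own construction modifies LARS inside naive Bayes; as a loose illustration of the underlying idea only (an L1 penalty prunes redundant predictors before a naive Bayes fit), the sketch below uses off-the-shelf components and synthetic data, not the authors' algorithm or the WSG set:

```python
# Illustration of L1-based predictor pruning ahead of naive Bayes.
import numpy as np
from sklearn.linear_model import Lasso
from sklearn.naive_bayes import GaussianNB

rng = np.random.default_rng(1)
n = 200
informative = rng.normal(size=(n, 2))            # two genuine predictors
redundant = informative[:, :1] + rng.normal(scale=0.01, size=(n, 1))
noise = rng.normal(size=(n, 5))                  # irrelevant predictors
X = np.hstack([informative, redundant, noise])   # 8 columns in total
y = (informative[:, 0] + informative[:, 1] > 0).astype(int)

# L1 fit: coefficients of redundant/irrelevant columns shrink to zero.
lasso = Lasso(alpha=0.05).fit(X, y)
keep = np.flatnonzero(np.abs(lasso.coef_) > 1e-6)

clf = GaussianNB().fit(X[:, keep], y)
print(len(keep), "of", X.shape[1], "predictors kept; accuracy",
      round(clf.score(X[:, keep], y), 2))
```

The point of the exercise is that the pruned model loses none of the fit while discarding the near-duplicate and noise columns, which is the weakness of plain naive Bayes the paper targets.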
Procedia PDF Downloads 237
23204 Effective Training System for Riding Posture Using Depth and Inertial Sensors
Authors: Sangseung Kang, Kyekyung Kim, Suyoung Chi
Abstract:
A good posture is the most important factor in riding. In this paper, we present an effective posture correction system for a riding simulator environment that provides position error detection and customized training functions. The proposed system detects and analyzes the rider's posture using depth data and inertial sensing data. Our experiments show that including these functions helps users improve their riding posture.
Keywords: posture correction, posture training, riding posture, riding simulator
Procedia PDF Downloads 476
23203 An Adaptive Oversampling Technique for Imbalanced Datasets
Authors: Shaukat Ali Shahee, Usha Ananthakumar
Abstract:
A data set exhibits the class imbalance problem when one class has very few examples compared to the other; this is referred to as between-class imbalance. Traditional classifiers fail to classify minority class examples correctly due to their bias towards the majority class. Apart from between-class imbalance, within-class imbalance, where classes are composed of different numbers of sub-clusters that in turn contain different numbers of examples, also deteriorates the performance of the classifier. Many methods have previously been proposed for handling the imbalanced dataset problem; they can be classified into four categories: data preprocessing, algorithmic methods, cost-based methods, and classifier ensembles. Data preprocessing techniques have shown great potential, as they attempt to improve the data distribution rather than the classifier. A data preprocessing technique handles class imbalance either by increasing the minority class examples or by decreasing the majority class examples. Decreasing the majority class examples leads to loss of information, and when the minority class is absolutely rare, removing majority class examples is generally not recommended. Existing methods for handling class imbalance do not address between-class and within-class imbalance simultaneously. In this paper, we propose a method that handles both simultaneously for the binary classification problem. Removing between-class and within-class imbalance simultaneously eliminates the bias of the classifier towards bigger sub-clusters by minimizing the domination of bigger sub-clusters in the total error. The proposed method uses model-based clustering to find the sub-clusters or sub-concepts present in the dataset, and the number of examples oversampled in each sub-cluster is determined by the complexity of that sub-cluster. The method also takes into consideration the scatter of the data in the feature space and adaptively copes with unseen test data using the Lowner-John ellipsoid, increasing the accuracy of the classifier. In this study, a neural network is used as the classifier, since its total error is minimized directly, and removing between-class and within-class imbalance simultaneously helps it give equal weight to all sub-clusters irrespective of class. The proposed method is validated on 9 publicly available data sets and compared with three existing oversampling techniques that rely on the spatial location of minority class examples in the Euclidean feature space. The experimental results show the proposed method to be statistically significantly superior to the other methods in terms of various accuracy measures. Thus the proposed method can serve as a good alternative for handling problem domains such as credit scoring, customer churn prediction, and financial distress prediction that typically involve imbalanced data sets.
Keywords: classification, imbalanced dataset, Lowner-John ellipsoid, model based clustering, oversampling
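The clustering-then-oversampling core of the method can be sketched as follows. The two-sub-cluster minority class and the per-cluster target are synthetic assumptions, and the sketch omits the paper's complexity weighting and Lowner-John ellipsoid steps:

```python
# Model-based clustering of the minority class, then per-sub-cluster
# oversampling so each sub-cluster is equally represented.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(2)
# Minority class made of two well-separated sub-clusters of unequal size.
big = rng.normal(loc=[0, 0], scale=0.3, size=(40, 2))
small = rng.normal(loc=[5, 5], scale=0.3, size=(10, 2))
X_min = np.vstack([big, small])

gmm = GaussianMixture(n_components=2, random_state=0).fit(X_min)
labels = gmm.predict(X_min)

target = 50  # desired examples per sub-cluster after oversampling
resampled = []
for k in range(2):
    members = X_min[labels == k]
    idx = rng.integers(0, len(members), size=target)  # with replacement
    resampled.append(members[idx])
X_balanced = np.vstack(resampled)
print(X_balanced.shape)  # -> (100, 2)
```

Balancing across sub-clusters, not just across classes, is what keeps a small sub-concept from being drowned out in the classifier's total error.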
Procedia PDF Downloads 418
23202 Assessment of Cardioprotective Effect of Deferiprone on Doxorubicin-Induced Cardiac Toxicity in a Rat Model
Authors: Sadaf Kalhori
Abstract:
Introduction: Doxorubicin (DOX)-induced cardiotoxicity is widely known as the most severe complication of anthracycline-based chemotherapy in patients with cancer. It is unknown whether deferiprone (DFP) could reduce the severity of DOX-induced cardiotoxicity by inhibiting free radical reactions. This study was therefore performed to assess the protective effect of deferiprone against DOX-induced cardiotoxicity in a rat model. Methods: The rats were divided into five groups. Group 1 was the control group; Group 2 received DOX (2 mg/kg/day, every other day for 12 days); and Groups 3 to 5 received DOX as in Group 2 plus DFP at 75, 100, and 150 mg/kg/day, respectively, for 19 days. DFP was started 5 days prior to the first DOX injection and continued until two days after the last DOX injection. Electrocardiographic and hemodynamic studies, along with histopathological examination, were conducted. In addition, serum samples were taken, and total cholesterol, malondialdehyde, triglycerides, albumin, AST, ALT, total protein, lactate dehydrogenase, total antioxidant capacity, and creatine kinase were assessed. Results: Our results showed a normal structure of the endocardium, myocardium, and pericardium in the control group. Pathologic features such as edema, hyperemia, bleeding, endocarditis, myocarditis, pericarditis, hyaline degeneration, cardiomyocyte necrosis, myofilament degeneration, and nuclear chromatin changes were assessed in all groups. In the DOX group, all pathologic features were seen, with a mean grade of 2±1.25. In the DFP groups receiving 75 and 100 mg, the mean grades were 1.41±0.31 and 1±0.23, respectively. In the DFP group receiving 150 mg, the pathologic features showed milder changes than in the other groups, with a mean grade of 0.45±0.19. Most pathologic features in the DFP groups differed significantly from the DOX group (p < 0.001). Discussion: The results showed that DFP treatment significantly mitigated DOX-induced heart damage, structural changes in the myocardium, and impairment of ventricular function. Our data confirm that DFP is protective against cardiovascular disorders induced by DOX. Clinical studies are needed to examine these findings in humans.
Keywords: cardiomyopathy, deferiprone, doxorubicin, rat
Procedia PDF Downloads 142
23201 Assessment of Rainfall Erosivity, Comparison among Methods: Case of Kakheti, Georgia
Authors: Mariam Tsitsagi, Ana Berdzenishvili
Abstract:
Changing rainfall intensity is one of the main indicators of climate change, and it greatly influences agriculture as one of the main factors causing soil erosion. Splash and sheet erosion are among the most prevalent forms and the most harmful for agriculture: invisible to the eye at first, the process gradually develops into stream-cutting erosion. Our study assesses the rainfall erosivity potential of the Kakheti region using modern research methods. The region, the major provider of wheat and wine in the country, is located in the eastern part of Georgia and is characterized by quite a variety of natural conditions; the climate is dry subtropical. Assessing the exact rainfall erosivity potential requires several years of rainfall data recorded at short intervals. Unfortunately, of the 250 meteorological stations operating during the Soviet period, only 55 remain active, 5 of them in the Kakheti region. Rainfall intensity data for this region exist from 1936 onward, and rainfall erosivity potential was assessed in some older papers, but since 1990 no such data have been collected, even though rainfall intensity is a necessary parameter for determining erosivity potential. At the same time, researchers and local communities believe that rainfall intensity has been changing and that the number of hail days has been increasing. Finding a method that determines rainfall erosivity potential in the Kakheti region as accurately as possible is therefore very important. The study period was divided into three sections: 1936-1963, 1963-1990, and 1990-2015. For the first two sections, rainfall erosivity potential was determined from the scientific literature and old meteorological station data; it is known that in eastern Georgia, at the boundary between the steppe and forest zones, rainfall erosivity in 1963-1990 was 20-75% higher than in 1936-1963. For the third section (1990-2015), no rainfall intensity data are available. A variety of studies discuss alternative ways of calculating rainfall erosivity potential when such data are lacking, e.g., from daily rainfall data, or from average annual rainfall and the elevation of the area. It should be noted that these methods give totally different results under different climatic conditions, and in some cases huge errors. Three of the most common methods were selected for our research. Each was tested on the first two sections of the study period; based on the outcomes, the method best suited to the regional climatic conditions was selected, and rainfall erosivity potential for the third section was then determined with it. Outcome data, such as attribute tables and graphs, were linked to the GIS database of Kakheti, and appropriate thematic maps were created. The results allowed us to analyze changes in rainfall erosivity potential from 1936 to the present and to project future trends. We have successfully implemented a method that can also be used for other regions of Georgia.
Keywords: erosivity potential, Georgia, GIS, Kakheti, rainfall
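One widely used index of the kind the abstract alludes to (estimating erosivity from aggregate rainfall when intensity records are missing) is the Modified Fournier Index, MFI = Σ pᵢ² / P, with pᵢ the monthly and P the annual precipitation in mm. The abstract does not say which method was ultimately selected, so this is only an example of the class of methods compared, on illustrative (not Kakheti) values:

```python
# Modified Fournier Index: rainfall concentration proxy for erosivity.
def modified_fournier_index(monthly_mm):
    """MFI = sum of squared monthly totals divided by the annual total."""
    annual = sum(monthly_mm)
    return sum(p * p for p in monthly_mm) / annual

# Evenly spread vs. concentrated rainfall, same 600 mm annual total:
even = [50.0] * 12
concentrated = [10.0] * 10 + [250.0, 250.0]
print(modified_fournier_index(even))          # -> 50.0
print(modified_fournier_index(concentrated))  # -> 210.0, far more erosive
```

The comparison shows why such proxies can diverge sharply across climates: two stations with identical annual rainfall but different seasonal concentration receive very different erosivity estimates.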
Procedia PDF Downloads 224
23200 Pragmatic Development of Chinese Sentence Final Particles via Computer-Mediated Communication
Authors: Qiong Li
Abstract:
This study investigated under which conditions computer-mediated communication (CMC) can promote pragmatic development. The focal features were four Chinese sentence final particles (SFPs): a, ya, ba, and ne. They occur frequently in Chinese and function as mitigators that soften the tone of speech. However, L2 acquisition of SFPs is difficult, suggesting the need for additional exposure to, or explicit instruction on, Chinese SFPs. This study follows this line and explores two research questions: (1) Is CMC combined with data-driven instruction more effective than CMC alone in promoting L2 Chinese learners’ SFP use? (2) How does L2 Chinese learners’ SFP use change over time, compared to the production of native Chinese speakers? The study involved 19 intermediate-level learners of Chinese enrolled at a private American university. They were randomly assigned to two groups: (1) the control group (N = 10), which was exposed to SFPs through CMC alone, and (2) the treatment group (N = 9), which was exposed to SFPs via CMC and data-driven instruction. Learners interacted with native speakers on given topics through text-based CMC on Skype. Both groups went through six 30-minute CMC sessions on a weekly basis, with a one-week interval after the first two CMC sessions and a two-week interval after the second two CMC sessions (nine weeks in total). The treatment group additionally received data-driven instruction after the first two sessions. Data analysis focused on three indices: token frequency, type frequency, and acceptability of SFP use. Token frequency was operationalized as the raw occurrence of SFPs per clause, type frequency as the range of SFPs used, and acceptability was rated by two native speakers using a rating rubric. The results showed that the treatment group made noticeable progress over time on all three indices, and their production of SFPs approximated the native-like level. In contrast, the control group improved only slightly on token frequency, and only certain SFPs (a and ya) reached native-like use. Potential explanations for the group differences are discussed with respect to the properties of Chinese SFPs and the roles of CMC and data-driven instruction: although CMC provided the learners with opportunities to notice and observe SFP use, SFPs, as features with low saliency, were not easily noticed in the input; the data-driven instruction in the treatment group directed the learners’ attention to these particles, which facilitated their development.
Keywords: computer-mediated communication, data-driven instruction, pragmatic development, second language Chinese, sentence final particles
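The two frequency indices can be computed mechanically from a clause-segmented transcript. The toy pinyin transcript (standing in for 啊, 呀, 吧, 呢) and the comma-based clause segmentation below are illustrative assumptions, not the study's coding scheme:

```python
# Token frequency (SFP occurrences per clause) and type frequency
# (number of distinct SFPs used), per the abstract's definitions.
import re

SFPS = ("a", "ya", "ba", "ne")  # pinyin stand-ins for the four particles

def sfp_indices(clauses):
    tokens = sum(1 for c in clauses for w in c.split() if w in SFPS)
    types = {w for c in clauses for w in c.split() if w in SFPS}
    return tokens / len(clauses), len(types)

transcript = "ni chi fan le ma ， hao chi ba ， wo hen xihuan ne ， zou a"
clauses = [c.strip() for c in re.split("，", transcript) if c.strip()]
token_freq, type_freq = sfp_indices(clauses)
print(token_freq, type_freq)  # -> 0.75 3
```

Here 3 SFP tokens over 4 clauses give a token frequency of 0.75, and three distinct particles (ba, ne, a) give a type frequency of 3; the acceptability index, being a human rating, has no such mechanical counterpart.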
Procedia PDF Downloads 418
23199 Forecasting Cancers Cases in Algeria Using Double Exponential Smoothing Method
Authors: Messis A., Adjebli A., Ayeche R., Talbi M., Tighilet K., Louardiane M.
Abstract:
Cancer is the second leading cause of death worldwide, and the prevalence and incidence of cancers are rising with population aging and growth. This study aims to model and predict the evolution of breast, colorectal, lung, bladder, and prostate cancers over the period 2014-2019. Data were analyzed using time series analysis with the double exponential smoothing method to forecast the future pattern. Minitab statistical software version 17 was used to describe and fit the appropriate models. Between 2014 and 2019, the raw number of newly registered cancer cases showed an increasing trend over time. Our forecast model is validated by its good predictions for 2020; data for 2021 and 2022 were not yet available. The time series analysis showed that double exponential smoothing is an efficient tool for modeling future data on the raw number of new cancer cases.
Keywords: cancer, time series, prediction, double exponential smoothing
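Double exponential smoothing (Holt's linear method), which the abstract applies in Minitab, maintains a level term lₜ = α·yₜ + (1−α)·(lₜ₋₁ + bₜ₋₁) and a trend term bₜ = β·(lₜ − lₜ₋₁) + (1−β)·bₜ₋₁, forecasting h steps ahead as lₜ + h·bₜ. A minimal sketch with synthetic yearly counts and arbitrary smoothing constants, not the Algerian registry data:

```python
# Holt's double exponential smoothing: level + trend, h-step forecast.
def holt_forecast(y, alpha=0.5, beta=0.5, horizon=1):
    level, trend = y[0], y[1] - y[0]  # a common initialization choice
    for obs in y[1:]:
        prev_level = level
        level = alpha * obs + (1 - alpha) * (level + trend)
        trend = beta * (level - prev_level) + (1 - beta) * trend
    return level + horizon * trend

cases = [100, 110, 121, 133, 146]  # synthetic yearly new-case counts
print(round(holt_forecast(cases, horizon=1), 1))  # -> 156.0
```

Because the trend term is carried forward, the method extrapolates a rising series upward, which is why it suits registry counts that increase steadily over the study period.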
Procedia PDF Downloads 89