Search results for: data mining techniques
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 29145

28035 Art Street as a Way for Reflective Thinking in the Field of Adult and Primary Education: Examples of Educational Techniques

Authors: Georgia H. Mega

Abstract:

Art street, a category of artwork displayed in public spaces, has been recognized as a potential tool for promoting reflective thinking in both adult and primary education. Educational techniques that encourage critical and creative thinking, as well as deeper reflection, have been developed and applied in educational curricula. This paper aims to explore the potential of art street in cultivating learners' reflective awareness toward multiculturalism. The main objective of this case study is to investigate the possibilities that art street offers in terms of developing learners' critical reflection, regardless of their age. The study compares two art street works from Greece and Norway, focusing on their common theme of multiculturalism. The study adopts a qualitative methodology, specifically a case study approach. This approach allows for an in-depth analysis of the two selected art street works and their impact on learners' reflective thinking. The study demonstrates that art street can effectively cultivate learners' reflective awareness of multiculturalism. The selected works of art, despite being created by different artists and displayed in different cities, share similar content and convey messages that facilitate reflective dialogue on cultural osmosis. Both adult and primary education approaches utilize the same art street works to achieve reflective awareness. This paper contributes to the existing literature on reflective learning processes by highlighting the potential of art street as a means for encouraging reflective thinking. It builds upon the theoretical frameworks of adult education theorists such as Freire and Mezirow, as well as those of primary education theorists such as Perkins and Project Zero. Data for this study were collected through observation and analysis of two art street works, one from Greece and one from Norway. These works were selected based on their common theme of multiculturalism. 
Analysis Procedures: The collected data were analyzed using qualitative analysis techniques. The researchers examined the content and messages conveyed by the selected art street works and explored their impact on learners' reflective thinking. The central question addressed in this study is whether art street can develop learners' critical reflection toward multiculturalism, regardless of their age. The findings of this study support the notion that art street can effectively cultivate learners' reflective awareness toward multiculturalism. The selected art street works, despite their differences in origin and location, share common themes that encourage reflective dialogue. The use of art street in both adult and primary education approaches showcases its potential as a tool for promoting reflective learning processes. Overall, this paper contributes to the understanding of art street as a means for reflective thinking in the field of adult and primary education.

Keywords: art street, educational techniques, multiculturalism, observation of artworks, reflective awareness

Procedia PDF Downloads 63
28034 Big Data Analytics and Public Policy: A Study in Rural India

Authors: Vasantha Gouri Prathapagiri

Abstract:

Innovations in the ICT sector facilitate a better quality of life for citizens across the globe. Countries that adopt new ICT techniques, such as big data analytics, find it easier to fulfil the needs of their citizens. Big data is characterised by its volume, variety, and velocity. Analytics involves processing it in a cost-effective way in order to draw conclusions for useful application. Big data also draws on machine learning and artificial intelligence, leading to accuracy in data presentation that is useful for public policy making. Hence, using data analytics in public policy making is a sound way to march towards the all-round development of any country. Data-driven insights can help a government take important strategic decisions with regard to the socio-economic development of the country. Developed nations like the UK and the USA are already far ahead on the path of digitization with the support of big data analytics. India is a huge country and is currently on the path of massive digitization, being realised through the Digital India Mission. Internet connections per household are on the rise every year. This translates into a massive data set that has the potential to transform the public services delivery system into an effective service mechanism for Indian citizens. In fact, when compared to developed nations, this capacity is underutilized in India. This is particularly true for the administrative system in rural areas. The present paper focuses on the need for big data analytics adoption in Indian rural administration and its contribution towards development of the country at a faster pace. The results of the research point to the need for increasing awareness and serious capacity building of the government personnel working for rural development with regard to big data analytics and its utility for the development of the country. Multiple public policies are framed and implemented for rural development, yet the results are not as effective as they should be. Big data has a major role to play in this context, as it can assist in improving both policy making and implementation, aiming at the all-round development of the country.

Keywords: Digital India Mission, public service delivery system, public policy, Indian administration

Procedia PDF Downloads 148
28033 Robust Image Design Based Steganographic System

Authors: Sadiq J. Abou-Loukh, Hanan M. Habbi

Abstract:

This paper presents a steganography system that hides the transmitted information without arousing suspicion and illustrates how the level of secrecy can be increased by using cryptography techniques. The proposed system first encrypts the image file with a one-time-pad key and then encrypts the message to be hidden, so that encryption is performed before image embedding. A new image file is then created from the original image by using the four-triangles operation, and the new image is processed by one of two image processing techniques. The two proposed processing techniques are thresholding and differential predictive coding (DPC). Afterwards, encryption and decryption keys are generated by a functional key generator; each generated key is used one time only. The encrypted text is hidden in the places that are not used for image processing and key generation. The system has a high embedding rate (0.1875 characters/pixel) for true-color images (24-bit depth).
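
The one-time-pad step described above is simple to illustrate. The sketch below (our own minimal example, not the paper's full four-triangles pipeline) encrypts a message by XORing it with a random key of equal length; applying the same XOR again recovers the plaintext, and the key is discarded after a single use:

```python
import secrets

def otp_xor(data: bytes, key: bytes) -> bytes:
    # One-time pad: XOR each data byte with the corresponding key byte.
    # XOR is its own inverse, so the same function encrypts and decrypts.
    assert len(key) >= len(data)
    return bytes(d ^ k for d, k in zip(data, key))

message = b"hidden text"
key = secrets.token_bytes(len(message))  # fresh random key, used one time only
cipher = otp_xor(message, key)
assert otp_xor(cipher, key) == message   # decryption recovers the message
```

Only after this encryption step would the ciphertext be embedded in the pixels left untouched by the thresholding or DPC processing.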

Keywords: encryption, thresholding, differential predictive coding, four triangles operation

Procedia PDF Downloads 480
28032 Government (Big) Data Ecosystem: Definition, Classification of Actors, and Their Roles

Authors: Syed Iftikhar Hussain Shah, Vasilis Peristeras, Ioannis Magnisalis

Abstract:

Organizations, including governments, generate (big) data that are high in volume, velocity, and veracity, and come from a variety of sources. Public administrations are using (big) data, implementing base registries, and enforcing data sharing across the entire government to deliver (big) data related integrated services, provide insights to users, and support good governance. Government (big) data ecosystem actors represent distinct entities that provide data, consume data, manipulate data to offer paid services, and extend data services, such as data storage and hosting, to other actors. In this research work, we perform a systematic literature review. The key objectives of this paper are to propose a robust definition of the government (big) data ecosystem and a classification of government (big) data ecosystem actors and their roles. We present a graphical view of actors, roles, and their relationships in the government (big) data ecosystem, and we discuss our research findings. We found few published research articles about the government (big) data ecosystem, including its definition and the classification of actors and their roles. Therefore, we borrowed ideas for the government (big) data ecosystem from numerous areas in the literature, including scientific research data, humanitarian data, open government data, and industry data.

Keywords: big data, big data ecosystem, classification of big data actors, big data actors roles, definition of government (big) data ecosystem, data-driven government, eGovernment, gaps in data ecosystems, government (big) data, public administration, systematic literature review

Procedia PDF Downloads 149
28031 Hybrid Fuzzy Weighted K-Nearest Neighbor to Predict Hospital Readmission for Diabetic Patients

Authors: Soha A. Bahanshal, Byung G. Kim

Abstract:

Identification of patients at high risk for hospital readmission is of crucial importance for quality health care and cost reduction. Predicting hospital readmissions among diabetic patients has been of great interest to many researchers and health decision makers. We build a model to predict hospital readmission for diabetic patients within 30 days of discharge. The core of the model is a modified k-nearest-neighbor algorithm called Hybrid Fuzzy Weighted k-Nearest Neighbor. The prediction is performed on a dataset of more than 70,000 patients with 50 attributes. We applied data preprocessing using different techniques in order to handle data imbalance and to fuzzify the data to suit the prediction algorithm. The model so far achieves a classification accuracy of 80%, compared to other models that only use k-nearest neighbor.
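
The flavor of the core classifier can be sketched with a distance-weighted fuzzy membership rule in the spirit of Keller's fuzzy kNN (a simplified illustration with toy data, not the authors' exact hybrid model or their patient dataset):

```python
import numpy as np

def fuzzy_weighted_knn(X_train, y_train, x, k=3, m=2.0):
    # Rank training points by distance to the query x.
    d = np.linalg.norm(X_train - x, axis=1)
    nearest = np.argsort(d)[:k]
    # Fuzzy weighting: closer neighbors get larger weights (exponent 2/(m-1)).
    w = 1.0 / np.maximum(d[nearest], 1e-12) ** (2.0 / (m - 1.0))
    classes = np.unique(y_train)
    scores = np.array([w[y_train[nearest] == c].sum() for c in classes])
    memberships = scores / scores.sum()        # class memberships sum to 1
    return classes[np.argmax(memberships)], memberships

# toy data: class 0 = not readmitted, class 1 = readmitted within 30 days
X = np.array([[0.0, 0.0], [0.0, 1.0], [5.0, 5.0], [5.0, 6.0]])
y = np.array([0, 0, 1, 1])
label, mem = fuzzy_weighted_knn(X, y, np.array([0.0, 0.5]))
```

The returned membership vector, rather than a hard vote, is what makes the fuzzy variant useful for risk scoring: a patient close to the decision boundary receives a membership near 0.5 instead of an overconfident label.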

Keywords: machine learning, prediction, classification, hybrid fuzzy weighted k-nearest neighbor, diabetic hospital readmission

Procedia PDF Downloads 175
28030 An Investigation of Direct and Indirect Geo-Referencing Techniques on the Accuracy of Points in Photogrammetry

Authors: F. Yildiz, S. Y. Oturanc

Abstract:

Advances in photogrammetric technology have replaced analog cameras with digital aerial cameras coupled to on-board GPS/IMU systems. In such a system, the position of the camera is determined by the GPS, while the camera rotations are determined by the IMU. Digital aerial cameras have been used for photogrammetry applications all around the world over the last ten years. They allow photogrammetric work to use time effectively, to reduce costs to a minimum, and to be fast and accurate. Geo-referencing techniques, which are the cornerstone of GPS/INS systems, bring flexibility to the photogrammetric triangulation of images required for adjustment (interior and exterior orientation). The geo-referencing process also helps to reduce the number of ground control points needed in photogrammetric applications. In this study, the effect of direct and indirect geo-referencing techniques on the accuracy of points was investigated in the production of photogrammetric mapping.

Keywords: photogrammetry, GPS/IMU systems, geo-referencing, digital aerial camera

Procedia PDF Downloads 399
28029 Chassis Level Control Using Proportional-Integral-Derivative Control, Fuzzy Logic and Deep Learning

Authors: Atakan Aral Ormancı, Tuğçe Arslantaş, Murat Özcü

Abstract:

This study presents the design and implementation of an experimental chassis-level system for various control applications. Specifically, the height level of the chassis is controlled using proportional-integral-derivative (PID), fuzzy logic, and deep learning control methods. Real-time data obtained from height and pressure sensors installed in a 6x2 truck chassis, in combination with pulse-width modulation signal values, are utilized during the tests. A prototype pneumatic system of a 6x2 truck is added to the setup, which enables the Smart Pneumatic Actuators to function as if they were in a real-world setting. To obtain real-time signal data from the height sensors, an Arduino Nano is utilized, while a Raspberry Pi processes the data using Matlab/Simulink and provides the correct output signals to control the Smart Pneumatic Actuator in the truck chassis. The objective of this research is to optimize the time it takes for the chassis to level down and up under various loads. To achieve this, PID control, fuzzy logic control, and deep learning techniques are applied to the system. The results show that the deep learning method is superior in optimizing time for a non-linear system. Fuzzy logic control with a triangular membership function as the rule base achieves better outcomes than PID control. Traditional PID control improves the time it takes to level the chassis down and up compared to an uncontrolled system. The findings highlight the superiority of deep learning techniques in optimizing the time for a non-linear system, and the potential of fuzzy logic control. The proposed approach and the experimental results provide a valuable contribution to the field of control, automation, and systems engineering.
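
The baseline PID case can be sketched with a generic discrete controller acting on a toy first-order plant (the gains and the integrator plant below are illustrative choices of ours; the paper's pneumatic truck model is far richer):

```python
class PID:
    """Minimal discrete PID controller."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_err = 0.0

    def step(self, setpoint, measured):
        err = setpoint - measured
        self.integral += err * self.dt
        deriv = (err - self.prev_err) / self.dt
        self.prev_err = err
        return self.kp * err + self.ki * self.integral + self.kd * deriv

# toy plant: chassis height rate of change is proportional to the command
height, dt = 0.0, 0.01
pid = PID(kp=2.0, ki=0.5, kd=0.1, dt=dt)
for _ in range(2000):                     # simulate 20 seconds
    u = pid.step(setpoint=1.0, measured=height)
    height += u * dt                      # integrator plant
assert abs(height - 1.0) < 0.01           # chassis settles at the target level
```

The fuzzy logic and deep learning controllers in the study replace the `step` computation; the surrounding sensor-read/actuate loop stays the same.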

Keywords: automotive, chassis level control, control systems, pneumatic system control

Procedia PDF Downloads 65
28028 Selection of Appropriate Classification Technique for Lithological Mapping of Gali Jagir Area, Pakistan

Authors: Khunsa Fatima, Umar K. Khattak, Allah Bakhsh Kausar

Abstract:

Satellite image interpretation and analysis assist geologists by providing valuable information about the geology and minerals of an area to be surveyed. A test site in Fatejang, district Attock, has been studied using Landsat ETM+ and ASTER satellite images for lithological mapping. Five supervised image classification techniques, namely maximum likelihood, parallelepiped, minimum distance to mean, Mahalanobis distance, and spectral angle mapper, were applied to both satellite images to find the most suitable classification technique for lithological mapping in the study area. The results of these five image classification techniques were compared with the geological map produced by the Geological Survey of Pakistan. The maximum likelihood classification applied to the ASTER satellite image has the highest correlation, 0.66, with the geological map. Field observations and XRD spectra of field samples also verified the results. A lithological map was then prepared based on the maximum likelihood classification of the ASTER satellite image.
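
Of the five classifiers, the spectral angle mapper is the easiest to state compactly: each pixel is assigned to the class whose reference spectrum makes the smallest angle with the pixel's spectrum. A minimal sketch (the band values and class spectra below are invented for illustration, not taken from the study):

```python
import numpy as np

def spectral_angle(pixel, reference):
    # Angle (radians) between a pixel spectrum and a reference spectrum;
    # a smaller angle means a better spectral match.
    cos = np.dot(pixel, reference) / (np.linalg.norm(pixel) * np.linalg.norm(reference))
    return np.arccos(np.clip(cos, -1.0, 1.0))

def classify_sam(pixel, class_spectra):
    angles = {name: spectral_angle(pixel, ref) for name, ref in class_spectra.items()}
    return min(angles, key=angles.get)

# hypothetical 4-band reference spectra for two rock classes
refs = {"limestone": np.array([0.2, 0.4, 0.6, 0.8]),
        "shale":     np.array([0.8, 0.6, 0.4, 0.2])}
label = classify_sam(np.array([0.25, 0.45, 0.55, 0.85]), refs)
```

Because the angle ignores vector magnitude, SAM is insensitive to overall illumination differences, which is one reason it is a common choice alongside statistical classifiers such as maximum likelihood.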

Keywords: ASTER, Landsat-ETM+, satellite, image classification

Procedia PDF Downloads 381
28027 Case Study on Exploration of Pediatric Cardiopulmonary Resuscitation among Involved Team Members in Pediatric Intensive Care Unit Institut Jantung Negara

Authors: Farah Syazwani Hilmy Zaki

Abstract:

Background: Compared to adult cardiopulmonary resuscitation (CPR), high-quality research and evidence on pediatric CPR remain relatively scarce. This knowledge gap hinders the development of optimal guidelines and best practices for resuscitating children. Objectives: To explore current CPR practices in the pediatric intensive care unit (PICU) of Institut Jantung Negara (IJN), Malaysia. Method: The research employed a qualitative approach, utilising a case study research design. The data collection process involved in-depth interviews and a review of the Resuscitation Feedback Form. Purposive sampling was used to select two cases comprising 14 participants: one cardiologist, one anaesthetist, and twelve nurses. The data collected were transcribed and entered into NVivo software to facilitate theme development. Subsequently, thematic analysis was conducted to analyse the data. Findings: The study yielded key findings regarding the enhancement of PICU CPR practices, categorised into four themes: routine procedures, resuscitation techniques, team dynamics, and individual contributions. The establishment of a cohesive team is crucial in facilitating effective resuscitation. According to participants, a lack of confidence, skills, and knowledge presents significant obstacles to effective PICU CPR. Conclusion: The findings of this study indicate that the participants are satisfied with current PICU CPR practices. However, the research also highlights the need for enhancements in various areas, including routine procedures, resuscitation techniques, and team and individual factors. Furthermore, it was suggested that additional training on the resuscitation process be conducted to enhance the preparedness of the medical team.

Keywords: cardiopulmonary resuscitation, feedback, nurses, pediatric intensive care unit

Procedia PDF Downloads 63
28026 Rural Women’s Skill Acquisition in the Processing of Locust Bean in Ipokia Local Government Area of Ogun State, Nigeria

Authors: A. A. Adekunle, A. M. Omoare, W. O. Oyediran

Abstract:

This study was carried out to assess rural women's skill acquisition in the processing of locust bean in Ipokia Local Government Area of Ogun State, Nigeria. A simple random sampling technique was used to select 90 women locust bean processors for this study. Data were analyzed with descriptive statistics and the Pearson Product Moment Correlation. The results showed that the mean age of respondents was 40.72 years. Most (70.00%) of the respondents were married. The mean processing experience was 8.63 years. 93.30% of the respondents relied on information from fellow locust bean processors and friends. None (100%) of the respondents had acquired improved processing skills through trainings or workshops. It can be concluded that the rural women's acquisition of modernized processing skills was generally low. It is hereby recommended that the rural women processors be trained by extension service providers through a series of workshops and seminars on improved processing techniques.

Keywords: locust bean, processing, skill acquisition, rural women

Procedia PDF Downloads 452
28025 Thick Data Analytics for Learning Cataract Severity: A Triplet Loss Siamese Neural Network Model

Authors: Jinan Fiaidhi, Sabah Mohammed

Abstract:

Diagnosing cataract severity is an important factor in deciding whether to undertake surgery. It is usually conducted by an ophthalmologist or through a variety of fundus photographs that need to be examined by the ophthalmologist. This paper carries out an investigation using a Siamese neural net that can be trained with small anchor samples to score cataract severity. The model used in this paper is based on a triplet loss function that takes the ophthalmologist's best experience in rating positive and negative anchors on a specific cataract scaling system. This approach, which encodes the heuristics of the ophthalmologist, is generally called the thick data approach, a kind of machine learning that learns from a few shots. Clinical Relevance: The lens of the eye is mostly made up of water and proteins. A cataract occurs when these proteins in the eye lens start to clump together and block light, causing impaired vision. This research aims at employing thick data machine learning techniques to rate the severity of the cataract using a Siamese neural network.
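
The triplet loss at the heart of the model can be written in a few lines (a numpy sketch of the standard formulation; the embeddings here are invented placeholders for the outputs of the shared Siamese branches):

```python
import numpy as np

def triplet_loss(anchor, positive, negative, margin=1.0):
    # Pull the anchor toward the positive (same severity grade) and push it
    # away from the negative, until the gap exceeds `margin`.
    d_pos = np.sum((anchor - positive) ** 2)   # squared distance to positive
    d_neg = np.sum((anchor - negative) ** 2)   # squared distance to negative
    return max(d_pos - d_neg + margin, 0.0)

# toy 2-D embeddings, e.g. produced by the shared network branches
a = np.array([0.0, 0.0])
p = np.array([0.1, 0.0])
n = np.array([2.0, 0.0])
loss = triplet_loss(a, p, n)   # d_pos=0.01, d_neg=4.0, so the loss is 0
```

A zero loss means the triplet is already "easy"; training focuses on triplets where the negative still sits closer than margin allows.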

Keywords: thick data analytics, siamese neural network, triplet-loss model, few shot learning

Procedia PDF Downloads 99
28024 Electrostatic and Dielectric Measurements for Hair Building Fibers from DC to Microwave Frequencies

Authors: K. Y. You, Y. L. Then

Abstract:

In recent years, hair building fiber has become popular as an effective aid for people who suffer from hair loss or sparse hair, since it can rapidly create a natural look of simulated hair. On the market, there are many hair fiber brands designed to form an intense bond with hair strands and make the hair appear more voluminous instantly. However, these products each have their own set of properties. Thus, in this report, some measurement techniques are proposed to characterise those products. Up to five different brands of hair fiber are tested. The electrostatic and dielectric properties of the hair fibers are macroscopically tested using purpose-designed DC and high-frequency microwave techniques. In addition, the hair fibers are microscopically analysed by magnifying the structures of the fiber using a scanning electron microscope (SEM). From the SEM photos, the uniformity of shape and the broken rate of the hair fibers in the different bulk samples can be compared.

Keywords: hair fiber, electrostatic, dielectric properties, broken rate, microwave techniques

Procedia PDF Downloads 305
28023 Evaluation of SDS (Software Defined Storage) Controller (CoprHD) for Various Storage Demands

Authors: Shreya Bokare, Sanjay Pawar, Shika Nema

Abstract:

Growth in cloud applications is generating a tremendous amount of data, increasing the load on traditional storage management systems. Software Defined Storage (SDS) is a new storage management concept becoming popular for handling this large amount of data. CoprHD is an open-source SDS controller, available for experimentation and development in the storage industry. In this paper, the storage management techniques provided by CoprHD to manage heterogeneous storage platforms are tested and analyzed. Various storage management parameters, such as time to provision, storage capacity measurement, and heterogeneity, are experimentally evaluated, along with theoretical expressions, to demonstrate the completeness of the CoprHD controller for storage management.

Keywords: software defined storage, SDS, CoprHD, open source, SMI-S simulator, CLARiiON, Symmetrix

Procedia PDF Downloads 301
28022 Separating Permanent and Induced Magnetic Signature: A Simple Approach

Authors: O. J. G. Somsen, G. P. M. Wagemakers

Abstract:

Magnetic signature detection provides sensitive detection of metal objects, especially in the natural environment. Our group is developing a tabletop setup for measuring the magnetic signatures of various small and model objects. A particular issue is the separation of permanent and induced magnetization. While the latter depends only on the composition and shape of the object, the former also depends on the magnetization history. With common deperming techniques, a significant permanent signature may still remain, which confuses measurements of the induced component. We investigate a basic technique for separating the two. Measurements were done by moving the object along an aluminum rail while the three field components are recorded by a detector attached near the center. This is done first with the rail parallel to the Earth's magnetic field and then in the anti-parallel orientation. The reversal changes the sign of the induced, but not the permanent, magnetization, so that the two can be separated. Our preliminary results on a small iron block show excellent reproducibility. A considerable permanent magnetization was indeed present, resulting in a complex asymmetric signature. After separation, a much more symmetric induced signature was obtained that can be studied in detail and compared with theoretical calculations.
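
The separation step is plain arithmetic on the two recorded signatures: since the reversal flips the sign of the induced component only, the sum and difference of the two measurements recover the two parts. A sketch with invented field values (arbitrary units):

```python
import numpy as np

# Field recorded with the rail parallel and anti-parallel to Earth's field:
#   parallel      = permanent + induced
#   anti_parallel = permanent - induced
parallel      = np.array([3.0, -1.0, 0.5])   # three field components
anti_parallel = np.array([1.0, -3.0, 0.1])

permanent = 0.5 * (parallel + anti_parallel)   # history-dependent part
induced   = 0.5 * (parallel - anti_parallel)   # composition/shape part
```

In practice this is applied point by point along the rail, yielding a full separated signature rather than a single vector.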

Keywords: magnetic signature, data analysis, magnetization, deperming techniques

Procedia PDF Downloads 444
28021 Intrusion Detection Techniques in NaaS in the Cloud: A Review

Authors: Rashid Mahmood

Abstract:

Network as a service (NaaS) has been widely used over the last few years in many applications, including mission-critical ones. In NaaS, prevention alone is not adequate where security is concerned, so detection methods should be added to address the security issues in NaaS. Authentication and encryption were considered the first solutions to the NaaS security problem, but they are no longer sufficient as NaaS use increases. In this paper, we present the concept of intrusion detection, survey some of the major intrusion detection techniques in NaaS, and compare them across several important criteria.

Keywords: IDS, cloud, NaaS, detection

Procedia PDF Downloads 306
28020 Use of Technology to Improve Students’ Attitude in Learning Mathematics of Non-Mathematics Undergraduate Students

Authors: Asia Majeed

Abstract:

The learning of mathematics in science, engineering, and social science programs can be enhanced through practical problem-solving techniques. Instructors can design their lessons with strategies that address students' educational needs and improve their accomplishments in mathematics classrooms. The use of technology in in-class problem-solving and application sessions can foster a deep understanding of mathematics among students. As mathematicians, we believe in subject-specific and content-driven teaching methods. Through technology, the relationship between physical problems and their mathematical models can be analyzed. This paper is about the selective use of technology in mathematics classrooms and is helpful to other mathematics instructors who wish to improve their traditional teaching techniques and students' attitude in learning mathematics. This corpus of techniques can be used in teaching large mathematics classes in science, technology, engineering, and social science.

Keywords: attitude in learning mathematics, mathematics, non-mathematics undergraduate students, technology

Procedia PDF Downloads 203
28019 Signal Processing Techniques for Adaptive Beamforming with Robustness

Authors: Ju-Hong Lee, Ching-Wei Liao

Abstract:

Adaptive beamforming using an antenna array of sensors is useful for adaptively detecting and preserving the presence of the desired signal while suppressing the interference and the background noise. For conventional adaptive array beamforming, we require prior information on either the impinging direction or the waveform of the desired signal to adapt the weights. The adaptive weights of an antenna array beamformer under a steered-beam constraint are calculated by minimizing the output power of the beamformer subject to the constraint that forces the beamformer to make a constant response in the steering direction. Hence, the performance of the beamformer is very sensitive to the accuracy of the steering operation. In the literature, it is well known that the performance of an adaptive beamformer will be deteriorated by any steering angle error encountered in many practical applications, e.g., wireless communication systems with massive antennas deployed at the base station and user equipment. Hence, developing effective signal processing techniques to deal with the problem of steering angle error in array beamforming systems has become an important research topic. In this paper, we present an effective signal processing technique for constructing an adaptive beamformer that is robust against steering angle error. The proposed array beamformer adaptively estimates the actual direction of the desired signal by using the presumed steering vector and the received array data snapshots. Based on the presumed steering vector and a preset angle range for steering mismatch tolerance, we first create a matrix related to the direction vector of the signal sources. Two projection matrices are generated from this matrix. The projection matrix associated with the desired signal information and the received array data are utilized to iteratively estimate the actual direction vector of the desired signal.
The estimated direction vector of the desired signal is then used for appropriately finding the quiescent weight vector. The other projection matrix is set to be the signal blocking matrix required for performing adaptive beamforming. Accordingly, the proposed beamformer consists of adaptive quiescent weights and partially adaptive weights. Several computer simulation examples are provided for evaluating and comparing the proposed technique with the existing robust techniques.
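
The steered-beam constraint described above (minimum output power subject to unit response in the look direction) has the familiar closed form w = R⁻¹a / (aᴴR⁻¹a). The sketch below implements that baseline for a uniform linear array; it is our own illustration of the constrained minimization, and does not reproduce the paper's projection-based robust direction estimation:

```python
import numpy as np

def mvdr_weights(R, a):
    # Minimum-variance weights under the constraint w^H a = 1.
    Ri_a = np.linalg.solve(R, a)
    return Ri_a / (a.conj() @ Ri_a)

def steering(theta, n):
    # Steering vector of an n-element uniform linear array,
    # half-wavelength element spacing, direction theta (radians).
    k = np.arange(n)
    return np.exp(1j * np.pi * k * np.sin(theta))

n = 8
a = steering(0.0, n)                        # desired signal at broadside
u = steering(0.5, n)                        # interferer off to the side
R = np.eye(n) + 10 * np.outer(u, u.conj())  # noise + strong interferer
w = mvdr_weights(R, a)
assert abs(w.conj() @ a - 1.0) < 1e-9       # unit response in look direction
```

When the presumed steering vector `a` is wrong, this baseline suppresses the desired signal itself, which is exactly the sensitivity the paper's iterative direction estimation is designed to remove.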

Keywords: adaptive beamforming, robustness, signal blocking, steering angle error

Procedia PDF Downloads 114
28018 Research on a Historical Architectural Heritage of the Village: Zriba El Olia

Authors: Yosra Ben Salah, Wang Li Jun, Salem Bellil

Abstract:

The village of Hammem Zriba is a lost little paradise in the middle of a beautiful landscape that captures the eyes of every visitor. The village alone is a rich expression of different elements, urban, architectural, technical, and vernacular, as well as of sociological, spiritual, and religious behaviors. This heritage is in a degraded condition and is threatened with disappearing soon; thus, action has to be taken as soon as possible to preserve this heritage and to record, analyze, and learn from its traditional ways of construction. The strategy of this study is to examine architecture within Berber society over a period of time and as influenced by a particular location, and its relationship to social and cultural aspects; the research focuses on the historical, environmental, social, and cultural aspects influencing architecture. The paper is mainly constructed around three successive viewpoints, a historical view, a cultural view, and an architectural view, the last covering both the urban and domestic scales. This research relies on the integration of both theoretical and empirical investigations. On the theoretical level, a documentary analysis of secondary data is used. Documentary analysis means content analysis of the relevant documents, including books, journals, magazines, archival data, and field surveys and observations. On the empirical level, an analysis of the traditional ways of planning and house building will be carried out. Three techniques will be employed to collect primary data: systematic analysis of the architectural drawings, quantitative analysis of the housing statistics, and direct observation. Through this research, the technical, architectural, and urban achievements of the Berber people, which form part of their general history and architectural history, are emphasized. As a second point, the potential for sustainability present in this traditional urban planning and housing is examined in order to formulate guidelines for modern urban and housing development.

Keywords: culture, history, traditional architecture, values

Procedia PDF Downloads 149
28017 Comparison between Photogrammetric and Structure from Motion Techniques in Processing Unmanned Aerial Vehicles Imageries

Authors: Ahmed Elaksher

Abstract:

Over the last few years, significant progresses have been made and new approaches have been proposed for efficient collection of 3D spatial data from Unmanned aerial vehicles (UAVs) with reduced costs compared to imagery from satellite or manned aircraft. In these systems, a low-cost GPS unit provides the position, velocity of the vehicle, a low-quality inertial measurement unit (IMU) determines its orientation, and off-the-shelf cameras capture the images. Structure from Motion (SfM) and photogrammetry are the main tools for 3D surface reconstruction from images collected by these systems. Unlike traditional techniques, SfM allows the computation of calibration parameters using point correspondences across images without performing a rigorous laboratory or field calibration process and it is more flexible in that it does not require consistent image overlap or same rotation angles between successive photos. These benefits make SfM ideal for UAVs aerial mapping. In this paper, a direct comparison between SfM Digital Elevation Models (DEM) and those generated through traditional photogrammetric techniques was performed. Data was collected by a 3DR IRIS+ Quadcopter with a Canon PowerShot S100 digital camera. Twenty ground control points were randomly distributed on the ground and surveyed with a total station in a local coordinate system. Images were collected from an altitude of 30 meters with a ground resolution of nine mm/pixel. Data was processed with PhotoScan, VisualSFM, Imagine Photogrammetry, and a photogrammetric algorithm developed by the author. The algorithm starts with performing a laboratory camera calibration then the acquired imagery undergoes an orientation procedure to determine the cameras’ positions and orientations. After the orientation is attained, correlation based image matching is conducted to automatically generate three-dimensional surface models followed by a refining step using sub-pixel image information for high matching accuracy. 
Tests with different numbers and configurations of control points were conducted. Camera calibration parameters estimated by the commercial software and those obtained with laboratory procedures were comparable. Exposure station positions agreed to within a few centimeters, and differences among orientation angles were insignificant, within less than three seconds of arc. DEM differencing was performed between the generated DEMs, and vertical shifts of a few centimeters were found.
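The DEM differencing step described above can be sketched in a few lines: subtract the two elevation grids cell by cell and summarize the vertical offset. This is a minimal illustration with invented toy grids, not the authors' processing chain; real DEMs would be read from raster files with a geospatial library.

```python
# Minimal sketch of DEM differencing: subtract two elevation grids
# cell-by-cell and summarize the vertical shift between them.

def dem_difference(dem_a, dem_b):
    """Return the per-cell difference grid dem_a - dem_b."""
    return [[a - b for a, b in zip(row_a, row_b)]
            for row_a, row_b in zip(dem_a, dem_b)]

def mean_vertical_shift(diff):
    """Mean over all cells: a systematic vertical offset between the DEMs."""
    cells = [c for row in diff for c in row]
    return sum(cells) / len(cells)

# Toy 3x3 grids (meters): the SfM DEM sits ~0.05 m above the photogrammetric one.
sfm_dem   = [[10.05, 10.06, 10.04],
             [10.15, 10.14, 10.16],
             [10.25, 10.24, 10.26]]
photo_dem = [[10.00, 10.00, 10.00],
             [10.10, 10.10, 10.10],
             [10.20, 10.20, 10.20]]

diff = dem_difference(sfm_dem, photo_dem)
shift = mean_vertical_shift(diff)
```

Here the mean shift recovers the few-centimeter vertical offset between the two surfaces, mirroring the comparison reported in the abstract.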

Keywords: UAV, photogrammetry, SfM, DEM

Procedia PDF Downloads 277
28016 Regional Disparities in the Level of Education in West Bengal

Authors: Nafisa Banu

Abstract:

The present study is an attempt to analyze the regional disparities in the level of education in West Bengal. It is based on secondary data obtained from the Census of India. The study is divided into four sections: the first presents the introduction, objectives, and a brief description of the study area; the second discusses the methodology and database; while the third and fourth comprise the empirical results and interpretation, and the conclusion, respectively. To measure the level of educational development, 8 indicators were selected, and Z-score and composite score techniques were applied. The study finds large variations in educational level across the study area, attributable to historical, economic, and socio-cultural factors.
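The Z-score and composite score technique mentioned above can be sketched as follows: each indicator is standardized across regions, and a region's composite score is the mean of its z-scores. The district names and indicator values below are invented for illustration and are not the study's data.

```python
# Hedged sketch of the Z-score / composite-score method: standardize each
# indicator across districts, then average the z-scores per district.
from statistics import mean, pstdev

def z_scores(values):
    m, s = mean(values), pstdev(values)
    return [(v - m) / s for v in values]

def composite_scores(indicator_table):
    """indicator_table: dict mapping district -> list of indicator values."""
    districts = list(indicator_table)
    columns = list(zip(*indicator_table.values()))   # one column per indicator
    z_cols = [z_scores(col) for col in columns]
    z_rows = list(zip(*z_cols))                      # back to per-district rows
    return {d: mean(row) for d, row in zip(districts, z_rows)}

# Three hypothetical districts, two indicators (e.g. literacy rate, enrolment).
data = {"District A": [75.0, 60.0],
        "District B": [65.0, 50.0],
        "District C": [55.0, 40.0]}
scores = composite_scores(data)
```

Ranking districts by the resulting composite score is what allows disparities in educational development to be mapped and compared.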

Keywords: education, regional disparity, literacy rate, Z-score, composite score

Procedia PDF Downloads 342
28015 An Approach to Building a Recommendation Engine for Travel Applications Using Genetic Algorithms and Neural Networks

Authors: Adrian Ionita, Ana-Maria Ghimes

Abstract:

A lack of features and thoughtful design, together with little effort to promote integrated booking, are some of the reasons why most online travel platforms merely automate old booking processes, limiting themselves to integrating a small number of services without addressing the user experience. This paper is a practical study of how to improve travel applications by building user profiles through data mining based on neural networks and genetic algorithms. Choices made by users and their ‘friends’ in the ‘social’ network context can serve as input data for a recommendation engine. The purpose of using these algorithms and this design is to improve the user experience and deliver more features to users. The paper aims to highlight a broad range of improvements that could be applied to travel applications in terms of design and service integration, while the main scientific contribution remains the technical implementation of the neural network solution. The choice of technologies is also motivated by the fact that some online booking providers have publicly disclosed their use of neural-network-related designs. These companies use similar Big-Data technologies to provide recommendations for hotels, restaurants, and cinemas with a neural network based recommendation engine that builds a user ‘DNA profile’. This implementation of the ‘profile’, a collection of neural networks trained on previous user choices, can improve the usability and design of any type of application.
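To make the genetic-algorithm side of such a design concrete, here is an illustrative sketch (not the authors' implementation): a genetic algorithm evolves the feature weights of a simple scoring function so that items the user previously liked score higher than those they skipped. A neural network could replace the linear scorer; all feature names and data here are hypothetical.

```python
# Toy genetic algorithm tuning a recommendation scorer against user feedback.
import random
random.seed(42)

# Invented item features: (price fit, rating, distance penalty).
liked   = [[0.9, 0.8, 0.1], [0.8, 0.9, 0.2]]
skipped = [[0.2, 0.3, 0.9], [0.1, 0.4, 0.8]]

def score(weights, item):
    return sum(w * f for w, f in zip(weights, item))

def fitness(weights):
    # Margin between average liked score and average skipped score.
    return (sum(score(weights, i) for i in liked) / len(liked)
            - sum(score(weights, i) for i in skipped) / len(skipped))

def evolve(generations=60, pop_size=30):
    pop = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[:10]                       # elitist selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, 3)         # one-point crossover
            child = a[:cut] + b[cut:]
            if random.random() < 0.3:            # small Gaussian mutation
                child[random.randrange(3)] += random.gauss(0, 0.1)
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
```

The evolved weights then rank unseen items for that user; in a fuller system the per-user weights (or networks) would together form the ‘DNA profile’ the abstract describes.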

Keywords: artificial intelligence, big data, cloud computing, DNA profile, genetic algorithms, machine learning, neural networks, optimization, recommendation system, user profiling

Procedia PDF Downloads 153
28014 Government Big Data Ecosystem: A Systematic Literature Review

Authors: Syed Iftikhar Hussain Shah, Vasilis Peristeras, Ioannis Magnisalis

Abstract:

Data that is high in volume, velocity, and veracity and comes from a variety of sources is generated in all sectors, including the government sector. Globally, public administrations are pursuing (big) data as a new technology and trying to adopt data-centric architectures for hosting and sharing data. Properly executed, big data and data analytics in the government (big) data ecosystem can lead to data-driven government and have a direct impact on the way policymakers work and citizens interact with governments. In this research paper, we conduct a systematic literature review. The main aims of this paper are to highlight essential aspects of the government (big) data ecosystem and to explore the most critical socio-technical factors that contribute to its successful implementation. The essential aspects of the government (big) data ecosystem include its definition, data types, data lifecycle models, and actors and their roles. We also discuss the potential impact of (big) data on public administration and gaps in the literature on government data ecosystems. As this is a new topic, we did not find articles specifically on the government (big) data ecosystem and therefore focused our research on various relevant areas such as humanitarian data, open government data, scientific research data, and industry data.

Keywords: applications of big data, big data, big data types, big data ecosystem, critical success factors, data-driven government, e-government, gaps in data ecosystems, government (big) data, literature review, public administration, systematic review

Procedia PDF Downloads 213
28013 Effect of Planting Techniques on Mangrove Seedling Establishment in Kuwait Bay

Authors: L. Al-Mulla, B. M. Thomas, N. R. Bhat, M. K. Suleiman, P. George

Abstract:

Mangroves are halophytic shrubs inhabiting the intertidal zones of the tropics and subtropics, forming a complex and highly dynamic coastal ecosystem. Historical evidence indicates the existence, and subsequent extinction, of mangroves in Kuwait; hence, successive projects have been established to reintroduce this plant to the marine ecosystem. One of the major challenges in establishing large-scale mangrove plantations in Kuwait is the very high rate of seedling mortality, which should ideally be less than 20%. This study was conducted at three selected locations in Kuwait Bay during 2016-2017 to evaluate the effect of four planting techniques on mangrove seedling establishment. The coir-pillow, comp-mat, and anchored container planting techniques were compared with the conventional planting method. The study revealed that the planting technique significantly affected the establishment of mangrove seedlings in the initial stages of growth. Location-specific differences in seedling establishment were also observed during the course of the study. However, irrespective of the planting technique employed, high seedling mortality was observed at all planting locations towards the end of the study, which may be attributed to the physicochemical characteristics of the selected mudflats.

Keywords: Avicennia marina (Forsk.) Vierh, coastal pollution, heavy metal accumulation, marine ecosystem, sedimentation, tidal inundation

Procedia PDF Downloads 146
28012 Validation of Mapping Historical Linked Data to International Committee for Documentation (CIDOC) Conceptual Reference Model Using Shapes Constraint Language

Authors: Ghazal Faraj, András Micsik

Abstract:

Shapes Constraint Language (SHACL), a World Wide Web Consortium (W3C) language, defines well-formed shapes expressed in RDF graphs, named "shape graphs". These shape graphs validate other resource description framework (RDF) graphs, which are called "data graphs". The structural features of SHACL permit generating a variety of conditions to evaluate string matching patterns, value types, and other constraints. Moreover, the SHACL framework supports high-level validation by expressing more complex conditions in languages such as the SPARQL Protocol and RDF Query Language (SPARQL). SHACL has two parts: SHACL Core, which includes the shapes covering the most frequent constraint components, and SHACL-SPARQL, an extension that allows SHACL to express more complex, customized constraints. Validating the efficacy of dataset mapping is an essential component of data reconciliation mechanisms, as linking different datasets is an ongoing process. The conventional validation methods are the semantic reasoner and SPARQL queries. The former checks formalization errors and data type inconsistencies, while the latter detects data contradictions. After executing SPARQL queries, the retrieved information needs to be checked manually by an expert. However, this methodology is time-consuming and inaccurate, as it does not test the mapping model comprehensively. Therefore, there is a serious need for a new methodology that covers all validation aspects for linking and mapping diverse datasets. Our goal is to develop a new approach that achieves optimal validation outcomes. The first step towards this goal is applying SHACL to validate the mapping between the International Committee for Documentation (CIDOC) conceptual reference model (CRM) and one of its ontologies. To initiate this project successfully, a thorough understanding of both source and target ontologies was required.
Subsequently, the proper environment to run SHACL and its shape graphs was determined. As a case study, we applied SHACL to a CIDOC-CRM dataset after running the Pellet reasoner via the Protégé program. The applied validation falls under multiple categories: a) data type validation, which checks whether the source data is mapped to the correct data type, for instance, whether a birthdate is typed as xsd:dateTime and linked to a Person entity via the crm:P82a_begin_of_the_begin property; b) data integrity validation, which detects inconsistent data, for instance, whether a person's birthdate occurred before any of the linked event creation dates. The expected results of our work are: 1) highlighting validation techniques and categories, and 2) selecting the most suitable techniques for the various categories of validation tasks. The next step is to establish a comprehensive validation model and generate SHACL shapes automatically.
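A data-type check of the kind described in category (a) could be written as the following SHACL Core shape, shown here only as a hedged sketch: the prefix URIs and the exact CIDOC-CRM path from a person to the birthdate literal are assumptions based on the abstract's description, not the authors' actual shape graph.

```
@prefix sh:  <http://www.w3.org/ns/shacl#> .
@prefix xsd: <http://www.w3.org/2001/XMLSchema#> .
@prefix crm: <http://www.cidoc-crm.org/cidoc-crm/> .
@prefix ex:  <http://example.org/shapes#> .

# Data-type validation: the value reached from a person through birth ->
# time-span -> begin_of_the_begin must be a single xsd:dateTime literal.
ex:BirthDateShape
    a sh:NodeShape ;
    sh:targetClass crm:E21_Person ;
    sh:property [
        sh:path ( crm:P98i_was_born crm:P4_has_time-span crm:P82a_begin_of_the_begin ) ;
        sh:datatype xsd:dateTime ;
        sh:maxCount 1 ;
    ] .
```

Running a SHACL engine with this shape graph over the data graph reports every person whose birthdate is missing a type, duplicated, or typed with the wrong datatype.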

Keywords: SHACL, CIDOC-CRM, SPARQL, validation of ontology mapping

Procedia PDF Downloads 243
28011 A Machine Learning Decision Support Framework for Industrial Engineering Purposes

Authors: Anli Du Preez, James Bekker

Abstract:

Data is currently one of the most critical and influential emerging technologies. However, the true potential of data is yet to be exploited since, currently, only about 1% of generated data is ever actually analyzed for value creation. There is a data gap in which data goes unexplored due to the lack of data analytics infrastructure and of the required data analytics skills. This study developed a decision support framework for data analytics by following Jabareen's framework development methodology. The study focused on machine learning algorithms, a subset of data analytics. The developed framework is designed to assist data analysts with little experience in choosing the appropriate machine learning algorithm given the purpose of their application.
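To give a flavor of the kind of guidance such a framework might encode, here is a small rule-based helper that maps an analyst's stated purpose to candidate algorithm families. The categories and suggestions are illustrative assumptions only and are not the framework developed in the paper.

```python
# Hypothetical sketch: route an analysis purpose to candidate ML algorithms.
def suggest_algorithms(task, labelled_data, interpretability_needed):
    """task: 'predict_category', 'predict_quantity', or 'reduce_dimensions'."""
    if task == "predict_category":
        if not labelled_data:                      # no labels -> clustering
            return ["k-means clustering", "DBSCAN"]
        if interpretability_needed:
            return ["decision tree", "logistic regression"]
        return ["random forest", "gradient boosting", "neural network"]
    if task == "predict_quantity":
        if interpretability_needed:
            return ["linear regression"]
        return ["gradient boosting regressor", "neural network"]
    if task == "reduce_dimensions":
        return ["PCA", "t-SNE"]
    return ["clarify the analysis purpose first"]

choices = suggest_algorithms("predict_category", labelled_data=True,
                             interpretability_needed=True)
```

A real decision support framework would, of course, weigh many more factors (data volume, feature types, class balance), but the branching structure is the same.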

Keywords: data analytics, industrial engineering, machine learning, value creation

Procedia PDF Downloads 158
28010 Use of Locally Effective Microorganisms in Conjunction with Biochar to Remediate Mine-Impacted Soils

Authors: Thomas F. Ducey, Kristin M. Trippe, James A. Ippolito, Jeffrey M. Novak, Mark G. Johnson, Gilbert C. Sigua

Abstract:

The Oronogo-Duenweg mining belt, approximately 20 square miles around the Joplin, Missouri area, is a designated United States Environmental Protection Agency Superfund site due to lead contamination of soil and groundwater by former mining and smelting operations. Over almost a century of mining (from 1848 to the late 1960s), an estimated ten million tons of cadmium-, lead-, and zinc-containing material were deposited on approximately 9,000 acres. At sites that have undergone remediation, in which the O, A, and B horizons have been removed along with the lead contamination, the exposed C horizon remains recalcitrant to revegetation efforts. These sites also suffer from poor soil microbial activity, as measured by soil extracellular enzyme assays, though 16S ribosomal ribonucleic acid (rRNA) analysis indicates that microbial diversity is equal to that of sites that have avoided mine-related contamination. Soil analysis reveals low soil organic carbon along with high levels of bioavailable zinc, reflecting the poor soil fertility conditions and low microbial activity. Our study examined the use of several materials to restore and remediate these sites, with the goal of improving soil health. The materials, and their purposes in the study, were as follows: manure-based biochar to bind zinc and other heavy metals responsible for phytotoxicity; locally sourced biosolids and compost to incorporate organic carbon into the depleted soils; and effective microorganisms harvested from nearby pristine sites to provide a stable community for nutrient cycling in the newly composited 'soil material'. Our results indicate that the four materials used in conjunction provide the greatest benefit to these mine-impacted soils, based on aboveground biomass, microbial biomass, and soil enzymatic activities.

Keywords: locally effective microorganisms, biochar, remediation, reclamation

Procedia PDF Downloads 205
28009 Genomics of Aquatic Adaptation

Authors: Agostinho Antunes

Abstract:

The completion of human genome sequencing in 2003 opened a new perspective on the importance of whole genome sequencing projects, and currently multiple species are having their genomes completely sequenced, from simple organisms, such as bacteria, to more complex taxa, such as mammals. The voluminous sequencing data generated across multiple organisms also provide the framework to better understand the genetic makeup of these species and related ones, allowing exploration of the genetic changes underlying the evolution of diverse phenotypic traits. Here, recent results from our group, retrieved from comparative evolutionary genomic analyses of selected marine animal species, are considered to exemplify how gene novelty and gene enhancement by positive selection may have been determinant in the success of adaptive radiations into diverse habitats and lifestyles.

Keywords: comparative genomics, adaptive evolution, bioinformatics, phylogenetics, genome mining

Procedia PDF Downloads 521
28008 Diabetes Diagnosis Model Using Rough Set and K- Nearest Neighbor Classifier

Authors: Usiobaifo Agharese Rosemary, Osaseri Roseline Oghogho

Abstract:

Diabetes is a complex group of diseases with a variety of causes; it is a disorder of the body's metabolism in the digestion of carbohydrates. The application of machine learning in the field of medical diagnosis has been the focus of many researchers, and the use of recognition and classification models as decision support tools has helped medical experts in the diagnosis of diseases. Considering the large volume of medical data, which requires special techniques, experience, and high diagnostic skill, an artificial intelligence system that assists medical personnel in enhancing their efficiency and accuracy in diagnosis would be an invaluable tool. This study proposes a diabetes diagnosis model using rough set theory and the K-nearest neighbor classifier algorithm. The system consists of two modules: a feature extraction module and a predictor module. Rough set theory is used to preprocess the attributes, while the K-nearest neighbor classifier is used to classify the given data. The dataset used for this model was taken from the University of Benin Teaching Hospital (UBTH) database. Half of the data was used for training, while the other half was used for testing the system. The proposed model was able to achieve over 80% accuracy.
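The predictor module's K-nearest neighbor step can be sketched in plain Python: classify a query record by the majority label of its k closest training records. The two features and toy records below are invented for illustration; the actual model applies rough-set feature extraction to the UBTH data first.

```python
# Minimal k-nearest-neighbour classifier by Euclidean distance.
from math import dist
from collections import Counter

def knn_predict(train, query, k=3):
    """train: list of (feature_vector, label) pairs; returns the majority
    label among the k training records nearest to the query."""
    neighbours = sorted(train, key=lambda rec: dist(rec[0], query))[:k]
    votes = Counter(label for _, label in neighbours)
    return votes.most_common(1)[0][0]

# Toy records: (glucose mg/dL, BMI) -> diagnosis. Values are illustrative.
train = [((85, 22.0), "non-diabetic"), ((90, 24.5), "non-diabetic"),
         ((100, 26.0), "non-diabetic"), ((160, 31.0), "diabetic"),
         ((170, 33.5), "diabetic"), ((155, 29.0), "diabetic")]

label = knn_predict(train, (165, 32.0), k=3)
```

In practice the features should be scaled before computing distances, since KNN is sensitive to features with large numeric ranges.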

Keywords: classifier algorithm, diabetes, diagnostic model, machine learning

Procedia PDF Downloads 325
28007 Explainable Graph Attention Networks

Authors: David Pham, Yongfeng Zhang

Abstract:

Graphs are an important structure for data storage and computation. Recent years have seen the success of deep learning on graphs such as Graph Neural Networks (GNN) on various data mining and machine learning tasks. However, most of the deep learning models on graphs cannot easily explain their predictions and are thus often labelled as “black boxes.” For example, Graph Attention Network (GAT) is a frequently used GNN architecture, which adopts an attention mechanism to carefully select the neighborhood nodes for message passing and aggregation. However, it is difficult to explain why certain neighbors are selected while others are not and how the selected neighbors contribute to the final classification result. In this paper, we present a graph learning model called Explainable Graph Attention Network (XGAT), which integrates graph attention modeling and explainability. We use a single model to target both the accuracy and explainability of problem spaces and show that in the context of graph attention modeling, we can design a unified neighborhood selection strategy that selects appropriate neighbor nodes for both better accuracy and enhanced explainability. To justify this, we conduct extensive experiments to better understand the behavior of our model under different conditions and show an increase in both accuracy and explainability.
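The attention mechanism the abstract refers to can be made concrete with a small numeric sketch: for each neighbour j of node i, GAT computes a logit e_ij = LeakyReLU(a · [Wh_i || Wh_j]) and normalizes the logits over the neighbourhood with a softmax to obtain the coefficients alpha_ij used for aggregation. The feature vectors and attention weights below are toy values, not a trained model.

```python
# Sketch of one GAT attention step over a node's neighbourhood.
from math import exp

def leaky_relu(x, slope=0.2):
    return x if x > 0 else slope * x

def attention_coefficients(h_i, neighbours, a):
    """h_i and each neighbour: already W-transformed feature vectors.
    a: attention vector over the concatenation [h_i || h_j]."""
    logits = []
    for h_j in neighbours:
        concat = h_i + h_j                               # [h_i || h_j]
        logits.append(leaky_relu(sum(w * x for w, x in zip(a, concat))))
    exps = [exp(l) for l in logits]                      # softmax normalization
    total = sum(exps)
    return [e / total for e in exps]

h_i = [1.0, 0.5]
neighbours = [[0.9, 0.6], [0.1, 0.1], [0.5, 0.4]]
a = [0.3, -0.2, 0.5, 0.4]                                # length = 2 * feature dim

alpha = attention_coefficients(h_i, neighbours, a)
```

The explainability question XGAT targets is visible even here: the coefficients say how much each neighbour is weighted, but not, by themselves, why that weighting leads to the final classification.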

Keywords: explainable AI, graph attention network, graph neural network, node classification

Procedia PDF Downloads 172
28006 By-Line Analysis of the Determinants of Insurance Premiums: Evidence from the Tunisian Market

Authors: Nadia Sghaier

Abstract:

In this paper, we aim to identify the determinants of life and non-life insurance premiums across different lines for the Tunisian insurance market over the recent period from 1997 to 2019. The empirical analysis is conducted using linear cointegration techniques in a panel data framework, which allow both long- and short-run relationships to be estimated. The results show evidence of a long-run relationship between premiums, losses, and financial variables (stock market indices and the interest rate). Furthermore, we find that the short-run effects of the explanatory variables differ across lines. This finding has important implications for insurance pricing and regulation.
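The long-run/short-run distinction can be illustrated with a two-step, Engle-Granger-style sketch on synthetic data: premiums track losses in the long run, and short-run changes correct deviations from that equilibrium. This is a hedged illustration only, not the panel estimator used in the paper; all series here are simulated.

```python
# Two-step cointegration sketch: long-run OLS, then error correction.
import random
random.seed(0)

def ols_slope(x, y):
    """Simple-regression slope of y on x."""
    mx, my = sum(x) / len(x), sum(y) / len(y)
    num = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    den = sum((xi - mx) ** 2 for xi in x)
    return num / den

# Simulate a random-walk 'losses' series and a cointegrated 'premiums' series.
n = 500
losses = [0.0]
for _ in range(n - 1):
    losses.append(losses[-1] + random.gauss(0, 1))
premiums = [2.0 * l + random.gauss(0, 0.5) for l in losses]

# Step 1: long-run relation premiums = beta * losses + residual.
beta = ols_slope(losses, premiums)
resid = [p - beta * l for p, l in zip(premiums, losses)]

# Step 2: error correction -- premium changes respond to lagged disequilibrium,
# so the adjustment coefficient should come out negative.
d_prem = [premiums[t] - premiums[t - 1] for t in range(1, n)]
adjustment = ols_slope(resid[:-1], d_prem)
```

The negative adjustment coefficient is the short-run correction toward the long-run relation; in the panel setting of the paper, analogous coefficients are estimated line by line.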

Keywords: insurance premiums, lines, Tunisian insurance market, cointegration approach in panel data

Procedia PDF Downloads 184