Search results for: big data interpretation
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 25808


23828 Heritage Landmark of Penang: Segara Ninda, a Mix of Culture

Authors: Normah Sulaiman, Yong Zhi Kang, Nor Hayati Hussain, Abdul Rehman Khalid

Abstract:

Segara Ninda was owned by Din Ku Meh, the governor of the province of Satul and a Malay man who played a major role in liaising with Thailand. The mansion is part of the legacy he left behind, alongside other properties in George Town, Penang. The island’s strategic geographical location benefitted it historically through important trade routes linking Europe, the Middle East, India, and China. For this reason, various architectural styles were introduced in Penang; the Late Straits Eclectic style is one form of colonial architecture, widely spread among the vernacular shophouses of George Town. Segara Ninda stands amid a mixture of nouveau-riche, historical, and heritage sites on Penang Road, one of the most important streets, which dates back to the late 18th century. This paper examines the Straits Eclectic style that Segara Ninda embodies. Acknowledging the mixture of colonial architecture in George Town, we argue that the mansion faces challenging issues in the conservation process. These are examined by analysing the spatial layout, the quality of the visual elements, and the mansion’s activity through interviews with its occupants. The focus is on understanding building form, features, and functions, with respect to the architectural spaces and their use. The methodology applied (documentation, observation, and measuring exercises) deepens our understanding of the mix of cultures the mansion holds and offers a positional interpretation of it. This conservation effort will further expose the mansion to the public and gain it recognition in society, as its character is one the existing built environment lacks.

Keywords: eclectic, heritage, spatial organization, culture

Procedia PDF Downloads 180
23827 An Analysis of Hidden Transcripts and Power: A Cultural Study of Conflict between Mothers-in-Law and Daughters-in-Law in Contemporary Chinese Television Dramas

Authors: Xiaohui Pan

Abstract:

As the most influential media for the dissemination of Chinese culture, films and television dramas have played a cognitive-orientation role in guiding young audiences to understand its cultural values. Taking a retrospective overview of Chinese domestic films and television dramas in the last decade, it is apparent that Westernization has become an irresistible force in the presentation of Chinese youth culture, seen in the rise of sensibility, the publicity of subjectivity, and resistance to mainstream discourse. However, rather than providing a panoramic interpretation for young Chinese, the way these works deconstruct and transpose Western youth culture has brought about more comprehensive conflicts and integration. Issues of tradition and modernization, the oriental and the Western, and serious thinking versus the spirit of entertainment pervade these Chinese works. This study examines the mechanisms of young Chinese people’s resistance, compromise, and re-construction in their marriages during the dynamic cultural integration of traditional Chinese culture and Western culture. To investigate this mechanism, the study analyzed four Chinese television dramas themed on family ethics, revealing the conflicts between two generations, mothers-in-law and daughters-in-law, and aiming to identify the strategies of their struggles. Incorporating Scott’s theory of the weapons of the weak, the study examines the dynamic model of these struggles through content analysis of their hidden language and power. The findings show that young Chinese identified a self-awakening during their resistance, and that external factors may function to switch power from the strong end to the weak end. The findings can provide useful insights for researchers in this area and for those exploring cultural integration issues.

Keywords: integration, resistance, youth culture

Procedia PDF Downloads 425
23826 A Human Centered Design of an Exoskeleton Using Multibody Simulation

Authors: Sebastian Kölbl, Thomas Reitmaier, Mathias Hartmann

Abstract:

Trial-and-error approaches to adapting wearable support structures to human physiology are time-consuming and elaborate. During preliminary design, however, the focus lies on understanding the interaction between the exoskeleton and the human body in terms of forces and moments, namely body mechanics. For the study at hand, a multibody simulation approach has been enhanced to evaluate actual forces and moments in a human dummy model with and without a digital mock-up of an active exoskeleton. To this end, different motion data have been gathered and processed to perform a musculoskeletal analysis. The motion data comprise ground reaction forces, electromyography (EMG) data, and human motion recorded with a marker-based motion capture system. Based on the experimental data, the response of the human dummy model has been calibrated. Subsequently, the scalable human dummy model, in conjunction with the motion data, is coupled with the exoskeleton structure. The outputs of the human-machine interaction (HMI) simulation platform are, in particular, the resulting contact forces and human joint forces, which are compared with admissible values with regard to human physiology. Furthermore, the platform provides feedback for the sizing of the exoskeleton structure in terms of resulting interface forces (stress justification) and the effect of its compliance. A stepwise approach for the setup and validation of the modeling strategy is presented, and the potential for a more time- and cost-effective development of wearable support structures is outlined.
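
A worked example may help make the body-mechanics step concrete. The following minimal Python sketch computes the net joint moment for a single planar joint by Newton-Euler inverse dynamics; the segment parameters and the pendulum-style moment equation are illustrative assumptions, not the paper's full multibody dummy model.

```python
# Minimal inverse-dynamics sketch for one planar joint (illustrative only).
import numpy as np

m, l, I = 3.5, 0.25, 0.06   # assumed segment mass (kg), CoM offset (m), inertia (kg*m^2)
g = 9.81                    # gravitational acceleration (m/s^2)

def joint_moment(theta, alpha):
    """Net joint moment M = I*alpha + m*g*l*sin(theta), theta measured from vertical."""
    return I * alpha + m * g * l * np.sin(theta)

# Joint angle over one second, e.g. from marker-based motion capture,
# differentiated twice to obtain angular acceleration.
t = np.linspace(0.0, 1.0, 100)
theta = 0.5 * np.sin(2 * np.pi * t)
alpha = np.gradient(np.gradient(theta, t), t)

M = joint_moment(theta, alpha)
print("peak joint moment [Nm]:", float(np.abs(M).max()))
```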

Keywords: assistive devices, ergonomic design, inverse dynamics, inverse kinematics, multibody simulation

Procedia PDF Downloads 162
23825 Evaluation of the Nursing Management Course in Undergraduate Nursing Programs of State Universities in Turkey

Authors: Oznur Ispir, Oya Celebi Cakiroglu, Esengul Elibol, Emine Ceribas, Gizem Acikgoz, Hande Yesilbas, Merve Tarhan

Abstract:

This study was conducted to evaluate the academic staff teaching the 'Nursing Management' course in the undergraduate nursing programs of state universities in Turkey and to assess the current content of the course. The study design is descriptive. The study population consists of the seventy-eight undergraduate nursing programs in Turkish state universities. A questionnaire prepared by the researchers was used as the data collection tool. The data were obtained by screening the content of the nursing education programs' websites between March and May 2016. Descriptive statistics were used to analyze the data. The analysis indicated that 58% of the undergraduate nursing programs for which data could be obtained were housed in schools of health, 81% of the academic staff had graduated from undergraduate nursing programs, 40% worked as lecturers, and 37% specialized in a field other than nursing. The analysis also showed that the above-mentioned course was included in 98% of the programs for which data were available. The full name of the course was 'Nursing Management' in 95% of the programs, and 98% stated that the course was compulsory. Mean theory and application hours were 3.13 and 2.91, respectively. Moreover, the content of the course was not published in 65% of the programs reviewed. This study demonstrated that although many nursing education programs provide the 'Nursing Management' course, the experience and expertise of the academic staff teaching it are not sufficient in the management area, and the schedule and content of the course are not sufficient either. Comparison of the course curricula revealed significant differences.

Keywords: nursing, nursing management, nursing management course, undergraduate program

Procedia PDF Downloads 358
23824 The DAQ Debugger for iFDAQ of the COMPASS Experiment

Authors: Y. Bai, M. Bodlak, V. Frolov, S. Huber, V. Jary, I. Konorov, D. Levit, J. Novy, D. Steffen, O. Subrt, M. Virius

Abstract:

In general, state-of-the-art Data Acquisition Systems (DAQ) in high-energy physics experiments must satisfy high requirements in terms of reliability, efficiency, and data rate capability. This paper presents the development and deployment of a debugging tool named DAQ Debugger for the intelligent, FPGA-based Data Acquisition System (iFDAQ) of the COMPASS experiment at CERN. Utilizing a hardware event builder, the iFDAQ is designed to read out data at the experiment's average maximum rate of 1.5 GB/s. In complex software such as the iFDAQ, comprising thousands of lines of code, the debugging process is essential to reveal all software issues. Unfortunately, conventional debugging of the iFDAQ is not possible during real data taking. The DAQ Debugger is a tool for identifying a problem, isolating its source, and then either correcting the problem or determining a way to work around it. It provides a layer for easy integration into any process and has no impact on process performance. Based on the handling of system signals, the DAQ Debugger represents an alternative to the conventional debuggers provided by most integrated development environments. Whenever a problem occurs, it generates reports containing all the information needed for deeper investigation and analysis. The DAQ Debugger was fully incorporated into all processes in the iFDAQ during the 2016 run. It helped to reveal remaining software issues and significantly improved the stability of the system in comparison with the previous run. In the paper, we present the DAQ Debugger from several perspectives and discuss it in detail.
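
Although the iFDAQ itself is built with the Qt framework, the signal-handling idea can be illustrated in a short, self-contained Python sketch; the report file name and the choice of signals below are assumptions for illustration, not details of the COMPASS tool.

```python
# Minimal sketch of signal-based debugging: handlers intercept signals and
# write a timestamped report instead of letting the process die silently.
import datetime
import faulthandler
import signal
import traceback

REPORT = "daq_debug_report.txt"  # hypothetical report location

def write_report(signum, frame):
    """Append a stack trace for the signalled moment, for offline analysis."""
    with open(REPORT, "a") as f:
        f.write(f"--- {datetime.datetime.now().isoformat()} signal {signum} ---\n")
        traceback.print_stack(frame, file=f)

# Catchable signals get a Python-level report...
for sig in (signal.SIGTERM, signal.SIGUSR1):
    signal.signal(sig, write_report)

# ...while hard faults (SIGSEGV, SIGABRT, ...) are dumped by faulthandler,
# which runs at the C level where ordinary Python handlers cannot run safely.
faulthandler.enable(file=open(REPORT, "a"))
```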

Keywords: DAQ Debugger, data acquisition system, FPGA, system signals, Qt framework

Procedia PDF Downloads 284
23823 Q-Map: Clinical Concept Mining from Clinical Documents

Authors: Sheikh Shams Azam, Manoj Raju, Venkatesh Pagidimarri, Vamsi Kasivajjala

Abstract:

Over the past decade, there has been a steep rise in data-driven analysis across major areas of medicine, such as clinical decision support systems, survival analysis, patient similarity analysis, and image analytics. Most of the data in the field are well-structured and available in numerical or categorical formats that can be used for experiments directly. But on the opposite end of the spectrum, there exists a wide expanse of data that is intractable for direct analysis owing to its unstructured nature; it can be found in the form of discharge summaries, clinical notes, and procedural notes, which are in human-written narrative format and have neither a relational model nor any standard grammatical structure. An important step in utilizing these texts for such studies is to transform and process the data to retrieve structured information from the haystack of irrelevant data using information retrieval and data mining techniques. To address this problem, the authors present Q-Map, a simple yet robust system that can sift through massive datasets with unregulated formats to retrieve structured information aggressively and efficiently. It is backed by an effective mining technique based on a string-matching algorithm indexed on curated knowledge sources, which is both fast and configurable. The authors also briefly examine its comparative performance with MetaMap, one of the most reputed tools for medical concept retrieval, and present the advantages the former displays over the latter.
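
As a rough illustration of the dictionary-indexed string matching described above (not the authors' actual Q-Map implementation), the sketch below scans a clinical note against a tiny curated vocabulary; the terms and concept identifiers stand in for a real knowledge source such as the UMLS.

```python
# Minimal sketch: dictionary-based concept matching over free-text notes.
import re

# Hypothetical curated knowledge source: surface form -> concept ID.
VOCAB = {
    "myocardial infarction": "C0027051",
    "diabetes mellitus": "C0011849",
    "hypertension": "C0020538",
}

def extract_concepts(note: str):
    """Return (term, concept_id, offset) for every vocabulary hit in the note."""
    hits = []
    text = note.lower()
    for term, cid in VOCAB.items():
        for m in re.finditer(r"\b" + re.escape(term) + r"\b", text):
            hits.append((term, cid, m.start()))
    return hits

print(extract_concepts("Pt has h/o diabetes mellitus and hypertension."))
```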

Keywords: information retrieval, unified medical language system, syntax based analysis, natural language processing, medical informatics

Procedia PDF Downloads 133
23822 Developing Logistics Indices for Turkey as an Indicator of Economic Activity

Authors: Gizem İntepe, Eti Mizrahi

Abstract:

Investment and financing decisions are influenced by various economic features. Detailed analysis should be conducted before decisions are made, not only by companies but also by governments. Such analysis can be conducted either at the company level or on a sectoral basis to reduce risks and maximize profits. Sectoral disaggregation, caused by seasonality effects, subventions, and data advantages or disadvantages, may appear in sectors that behave in parallel with the BIST (Borsa Istanbul stock exchange) index. The proposed logistics indices could serve market needs as a decision parameter on a sectoral basis and also help forecast changes in import and export volumes. They are likewise an indicator of logistics activity, which is itself a sign of economic mobility at the national level. Publicly available data from the “Ministry of Transport, Maritime Affairs and Communications” and the “Turkish Statistical Institute” are utilized to obtain five logistics indices, namely the exLogistic, imLogistic, fLogistic, dLogistic, and cLogistic indices. The efficiency and reliability of these indices are then tested.
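
The construction of such an index can be illustrated with a base-period normalization, the usual form for activity indices; the figures and the base month below are invented, and the actual formulas behind exLogistic and its siblings are not published in this abstract.

```python
# Minimal sketch: turn a monthly trade-volume series into a base-100 index.
import pandas as pd

volumes = pd.Series(
    [120.0, 125.4, 118.9, 130.2],                        # invented tonnages
    index=pd.period_range("2015-01", periods=4, freq="M"),
    name="export_volume",
)

base = volumes.iloc[0]                 # base month = 100
ex_logistic = 100.0 * volumes / base   # hypothetical 'exLogistic'-style index
print(ex_logistic)
```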

Keywords: economic activity, export trade data, import trade data, logistics indices

Procedia PDF Downloads 337
23821 Using Non-Negative Matrix Factorization Based on Satellite Imagery for the Collection of Agricultural Statistics

Authors: Benyelles Zakaria, Yousfi Djaafar, Karoui Moussa Sofiane

Abstract:

Agriculture is fundamental and remains an important sector of the Algerian economy; based on traditional techniques and structures, it generally serves consumption. The collection of agricultural statistics in Algeria is done using traditional methods, which consist of investigating land use through surveys and field studies. These statistics suffer from problems such as poor data quality, the long delay between collection and final availability, and high cost compared to their limited use. The objective of this work is to develop a processing chain for a reliable inventory of agricultural land by developing and implementing a new method of extracting information. This methodology allowed us to combine remote sensing data and field data to collect statistics on the areas of different land uses. The contribution of remote sensing to the improvement of agricultural statistics, in terms of area, has been studied in the wilaya of Sidi Bel Abbes. It is in this context that we applied a method for extracting information from satellite images: non-negative matrix factorization (NMF), which does not treat the pixel as a single entity but looks for the components within the pixel itself. The results obtained by applying NMF were compared with field data and with the results obtained by the maximum-likelihood method. The most important NMF results show close agreement with the field data. We believe that this method of extracting information from satellite data leads to interesting results for different types of land use.
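
A minimal sketch of the unmixing step, assuming a hyperspectral cube is already loaded as a NumPy array; it uses scikit-learn's NMF as a stand-in for the authors' pipeline, and the scene dimensions, component count, and dominant-abundance mapping are illustrative assumptions.

```python
# Minimal sketch: NMF unmixes each pixel into non-negative endmember abundances.
import numpy as np
from sklearn.decomposition import NMF

bands, rows, cols = 100, 64, 64
cube = np.random.rand(bands, rows, cols)      # stand-in for a real scene

X = cube.reshape(bands, -1).T                 # pixels x bands, all values >= 0
model = NMF(n_components=4, init="nndsvd", max_iter=500, random_state=0)
abundances = model.fit_transform(X)           # pixels x endmembers
endmembers = model.components_                # endmembers x bands

# Assign each pixel to its dominant endmember as a crude land-use map.
land_use = abundances.argmax(axis=1).reshape(rows, cols)
```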

Keywords: blind source separation, hyper-spectral image, non-negative matrix factorization, remote sensing

Procedia PDF Downloads 423
23820 Estimation of Coefficient of Discharge of Side Trapezoidal Labyrinth Weir Using Group Method of Data Handling Technique

Authors: M. A. Ansari, A. Hussain, A. Uddin

Abstract:

A side weir is a flow-diversion structure provided in the side wall of a channel to divert water from the main channel to a branch channel. The trapezoidal labyrinth weir is a special type of weir in which the crest length is increased to pass a higher discharge. Experimental and numerical studies of the coefficient of discharge of a side trapezoidal labyrinth weir in an open channel are presented in this study. The Group Method of Data Handling (GMDH) with a quadratic polynomial transfer function has been used to predict the coefficient of discharge of the side trapezoidal labyrinth weir. A new regression model for the coefficient of discharge of the labyrinth weir is also developed. Generalized models for predicting the coefficient of discharge of the labyrinth weir using a GMDH network have been developed as well. Predictions based on the GMDH model are more satisfactory than those given by traditional regression equations.
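
The GMDH building block named here, a quadratic polynomial transfer function fitted to a pair of inputs, can be sketched in a few lines; the synthetic inputs below merely stand in for dimensionless weir parameters, and the full GMDH layer-selection logic is omitted.

```python
# Minimal sketch of one GMDH neuron: fit y = a0 + a1*x1 + a2*x2 + a3*x1^2
# + a4*x2^2 + a5*x1*x2 to a pair of inputs by least squares (synthetic data).
import numpy as np

rng = np.random.default_rng(0)
x1, x2 = rng.random(200), rng.random(200)        # e.g. head ratio, Froude number
y = 0.6 - 0.3 * x1 + 0.2 * x2**2 + rng.normal(0, 0.01, 200)  # toy Cd values

A = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)     # quadratic transfer function
cd_pred = A @ coef

print("RMSE:", np.sqrt(np.mean((cd_pred - y) ** 2)))
```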

Keywords: discharge coefficient, group method of data handling, open channel, side labyrinth weir

Procedia PDF Downloads 160
23819 Data Mining in Healthcare for Predictive Analytics

Authors: Ruzanna Muradyan

Abstract:

Medical data mining is a crucial field in contemporary healthcare that offers cutting-edge tactics with enormous potential to transform patient care. This abstract examines how sophisticated data mining techniques could transform the healthcare industry, with a special focus on how they might improve patient outcomes. Healthcare data repositories have evolved dynamically, producing a rich tapestry of diverse, multi-dimensional information that includes genetic profiles, lifestyle markers, electronic health records, and more. By utilizing data mining techniques on this vast library, a variety of prospects for precision medicine, predictive analytics, and insight generation becomes visible. Predictive modeling for illness prediction, risk stratification, and therapy efficacy evaluation are important points of focus. Healthcare providers may use this abundance of data to tailor treatment plans, identify high-risk patient populations, and forecast disease trajectories by applying machine learning algorithms and predictive analytics. Better patient outcomes, more efficient use of resources, and early treatment are made possible by this proactive strategy. Furthermore, data mining techniques act as catalysts to reveal complex relationships between apparently unrelated data pieces, providing enhanced insights into the causes of disease, genetic susceptibilities, and environmental factors. Healthcare practitioners can gain practical insights that guide disease prevention, customized patient counseling, and focused therapies by analyzing these associations. The abstract also explores the problems and ethical issues that come with using data mining techniques in the healthcare industry: to use these approaches properly, it is essential to find a balance between data privacy, security issues, and the interpretability of complex models. Finally, this abstract demonstrates the revolutionary power of modern data mining methodologies in transforming the healthcare sector. Healthcare practitioners and researchers can uncover unique insights, enhance clinical decision-making, and ultimately elevate patient care to unprecedented levels of precision and efficacy by employing cutting-edge methodologies.
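
As a concrete illustration of the risk-stratification idea (not any specific clinical model), the sketch below fits a logistic regression on synthetic EHR-style features and bins patients into risk bands by predicted probability.

```python
# Minimal sketch: logistic-regression risk stratification on synthetic data.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
X = rng.normal(size=(500, 4))              # e.g. age, BMI, HbA1c, BP (scaled)
risk = 1 / (1 + np.exp(-(X @ [0.8, 0.5, 1.2, 0.3])))
y = rng.random(500) < risk                 # synthetic disease-onset labels

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = LogisticRegression().fit(X_tr, y_tr)

# Stratify patients into low/medium/high risk bands from predicted probability.
p = model.predict_proba(X_te)[:, 1]
bands = np.digitize(p, [0.33, 0.66])       # 0=low, 1=medium, 2=high
print("patients per band:", np.bincount(bands, minlength=3))
```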

Keywords: data mining, healthcare, patient care, predictive analytics, precision medicine, electronic health records, machine learning, predictive modeling, disease prognosis, risk stratification, treatment efficacy, genetic profiles, precision health

Procedia PDF Downloads 63
23818 ROOP: Translating Sequential Code Fragments to Distributed Code Fragments Using Deep Reinforcement Learning

Authors: Arun Sanjel, Greg Speegle

Abstract:

Every second, massive amounts of data are generated, and Data Intensive Scalable Computing (DISC) frameworks have evolved into effective tools for analyzing them. Since the underlying architecture of these distributed computing platforms is often new to users, building a DISC application can be time-consuming and prone to errors. The automated conversion of a sequential program to a DISC program would consequently improve productivity significantly. However, synthesizing a user’s intended program from an input specification is complex, with several important applications, such as distributed program synthesis and code refactoring. Existing works such as Tyro and Casper rely entirely on deductive synthesis techniques or similar program synthesis approaches. Our approach is to develop a data-driven synthesis technique that identifies sequential components and translates them to equivalent distributed operations. We emphasize using reinforcement learning and unit testing as feedback mechanisms to achieve our objectives.

Keywords: program synthesis, distributed computing, reinforcement learning, unit testing, DISC

Procedia PDF Downloads 108
23817 Developmental Psycholinguistic Approach to Conversational Skills - A Continuum of Sensitivity to Gricean Maxims

Authors: Zsuzsanna Schnell, Francesca Ervas

Abstract:

Background: This experimental pragmatic study confirms a basic tenet of the relevance-theoretic view in the philosophy of language. It draws up a developmental trajectory of the maxims, revealing the cognitive difficulty of their interpretation, their place relative to each other, and the order they may follow in development. A central claim of the present research is that social-cognitive skills play a significant role in inferential meaning construction: children passing the False Belief Test are significantly more successful in tasks measuring the recognition of the infringement of conversational maxims. Aims and method: Preschoolers’ conversational skills and pragmatic competence are examined in view of their mentalization skills. In doing so, we use a battery of linguistic tasks containing five short scenarios for each Gricean maxim. We measure preschoolers’ ToM performance with a first- and a second-order ToM task and compare participants’ ability to recognize the infringement of the Gricean maxims in view of their social-cognitive skills. Results: Findings suggest that Theory of Mind has a predictive force of 75% concerning the ability to follow Gricean maxims efficiently. ToM proved to be a significant factor in predicting the group’s performance and success rates in three out of four maxim-infringement recognition tasks: in the Quantity, Relevance, and Manner conditions, but not in the Quality trial. Conclusions: The results confirm that children’s communicative competence in social contexts requires the development of higher-order social-cognitive reasoning, and they reveal the cognitive effort needed to recognize the infringement of each maxim, yielding a continuum of cognitive difficulty and a trajectory of development.

Keywords: maxim infringement recognition, social cognition, Gricean maxims, developmental pragmatics

Procedia PDF Downloads 7
23816 A Novel Technological Approach to Maintaining the Cold Chain during Transportation

Authors: Philip J. Purnell

Abstract:

Innovators propose to use the Internet of Things to solve the problem of maintaining the cold chain during the transport of biopharmaceutical products. Sending a data logger with refrigerated goods is only useful to inform the recipient that the goods have either breached the cold chain, and are therefore potentially spoiled, or have not breached it and are therefore assumed to be in good condition. Connecting the data logger to the Internet of Things means that the supply chain manager is informed in real time of the exact location and the precise temperature of the material at any point on earth. Using a simple online interface, the supply chain manager can watch the progress of the material on a Google map together with accurate and, crucially, real-time temperature readings. The data logger will also send alarms to the supply chain manager if a cold chain breach becomes imminent, allowing time to contact the transporter and restore the cold chain before the material is affected. This development is expected to save billions of dollars in wasted biologics that currently arrive either spoiled or in an unreliable condition.
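
The alerting logic can be sketched in a few lines; the temperature band and warning margin below are typical for 2-8 °C biologics but are assumptions, as is the reading structure.

```python
# Minimal sketch of imminent-breach alerting for a connected data logger.
from dataclasses import dataclass

@dataclass
class Reading:
    lat: float
    lon: float
    temp_c: float

COLD_CHAIN_MAX = 8.0      # assumed upper bound of the 2-8 degC band
WARNING_MARGIN = 1.0      # alert before the breach actually happens

def check(reading: Reading) -> str:
    if reading.temp_c > COLD_CHAIN_MAX:
        return "BREACH: cold chain exceeded, goods may be spoiled"
    if reading.temp_c > COLD_CHAIN_MAX - WARNING_MARGIN:
        return "ALERT: breach imminent, contact the transporter"
    return "OK"

print(check(Reading(51.5, -0.1, 7.4)))   # -> ALERT: breach imminent...
```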

Keywords: internet of things, cold chain, data logger, transportation

Procedia PDF Downloads 442
23815 Positioning a Southern Inclusive Framework Embedded in the Social Model of Disability Theory Contextualised for Guyana

Authors: Lidon Lashley

Abstract:

This paper presents how the social model of disability can be used to reshape inclusive education practices in Guyana. Inclusive education in Guyana is metamorphosing but remains firmly held in the tenets of the Medical Model of Disability, which shapes the experiences of children with Special Education Needs and/or Disabilities (SEN/D). An ethnographic approach to data gathering was employed in this study. Qualitative data were gathered from the voices of children with and without SEN/D, as well as their mainstream teachers, to present the interplay of discourses and subjectivities in the situation. The data were analyzed using Adele Clarke's postmodern approach to grounded theory analysis, called situational analysis. The data suggest that it is possible, though challenging, to fully contextualize and adopt Loreman's synthesis and Booth and Ainscow's Index in the two mainstream schools studied. In addition, the data paved the way for the presentation of a social model framework specific to Guyana, called the 'Southern Inclusive Education Framework for Guyana', and its support tool, 'The Inclusive Checker', created for Southern mainstream primary classrooms.

Keywords: social model of disability, medical model of disability, subjectivities, metamorphosis, special education needs, postcolonial Guyana, inclusion, culture, mainstream primary schools, Loreman's synthesis, Booths and Ainscow's index

Procedia PDF Downloads 162
23814 An Extensible Software Infrastructure for Computer Aided Custom Monitoring of Patients in Smart Homes

Authors: Ritwik Dutta, Marylin Wolf

Abstract:

This paper describes the trade-offs and the from-scratch design of a self-contained, easy-to-use health dashboard software system that provides customizable data tracking for patients in smart homes. The system is made up of different software modules and comprises a front-end and a back-end component. Built with HTML, CSS, and JavaScript, the front-end allows adding users, logging into the system, selecting metrics, and specifying health goals. The back-end consists of a NoSQL Mongo database, a Python script, and a SimpleHTTPServer written in Python. The database stores user profiles and health data in JSON format. The Python script makes use of the PyMongo driver library to query the database and displays formatted data as a daily snapshot of user health metrics against target goals. Any number of standard and custom metrics can be added to the system, and the corresponding health data can be fed automatically via sensor APIs, or manually as text or picture data files. A real-time METAR request API permits correlating weather data with patient health, and an advanced query system is implemented to allow trend analysis of selected health metrics over custom time intervals. Available on GitHub, the project is free to use for academic purposes of learning and experimenting, or for practical purposes by building on it.
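
A minimal sketch of the back-end query pattern using the PyMongo driver the abstract names; the database, collection, and field names are assumptions, not the project's actual schema.

```python
# Minimal sketch: fetch recent readings for one metric and its target goal.
from datetime import datetime, timedelta
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")
db = client["health_dashboard"]  # hypothetical database name

def daily_snapshot(user_id: str, metric: str, days: int = 1):
    """Return the last `days` of readings plus the user's goal for a metric."""
    since = datetime.utcnow() - timedelta(days=days)
    readings = list(db.readings.find(
        {"user_id": user_id, "metric": metric, "ts": {"$gte": since}}
    ).sort("ts", 1))
    goal = db.goals.find_one({"user_id": user_id, "metric": metric})
    return readings, goal

readings, goal = daily_snapshot("alice", "steps")
```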

Keywords: flask, Java, JavaScript, health monitoring, long-term care, Mongo, Python, smart home, software engineering, webserver

Procedia PDF Downloads 390
23813 Building an Integrated Relational Database from Swiss Nutrition National Survey and Swiss Health Datasets for Data Mining Purposes

Authors: Ilona Mewes, Helena Jenzer, Farshideh Einsele

Abstract:

Objective: The objective of the study was to integrate two big databases, from the Swiss nutrition national survey (menuCH) and the Swiss health national survey 2012, for data mining purposes. Each database contains demographic base data. An integrated Swiss database is built in order to later discover critical food consumption patterns linked with lifestyle diseases known to be strongly tied to food consumption. Design: The Swiss nutrition national survey (menuCH), with approx. 2000 respondents from two different surveys, one by phone and the other by questionnaire, along with the Swiss health national survey 2012, with 21500 respondents, were pre-processed, cleaned, and finally integrated into a unique relational database. Results: The result of this study is an integrated relational database from the Swiss nutritional and health databases.

Keywords: health informatics, data mining, nutritional and health databases, nutritional and chronical databases

Procedia PDF Downloads 112
23812 Clustering Performance Analysis using New Correlation-Based Cluster Validity Indices

Authors: Nathakhun Wiroonsri

Abstract:

There are various cluster validity measures used for evaluating clustering results. One of the main objectives of using these measures is to seek the optimal, unknown number of clusters. Some measures work well for clusters with different densities, sizes, and shapes. Yet one weakness these validity measures share is that they sometimes provide only a single clear optimal number of clusters. That number is actually unknown, and there might be more than one potential sub-optimal option that a user may wish to choose, depending on the application. We develop two new cluster validity indices based on the correlation between the actual distance between a pair of data points and the distance between the centroids of the clusters in which the two points lie. Our proposed indices consistently yield several peaks at different numbers of clusters, which overcomes the weakness stated above. Furthermore, the introduced correlation can also be used to evaluate the quality of a selected clustering result. Several experiments in different scenarios, including the well-known iris data set and a real-world marketing application, have been conducted to compare the proposed validity indices with several well-known ones.
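
The core quantity, a correlation between pairwise point distances and the distances between the centroids of the clusters containing each pair, can be computed directly, as in the sketch below; the exact indices proposed by the author are not reproduced here.

```python
# Minimal sketch: correlate point-pair distances with their centroid distances.
import numpy as np
from scipy.spatial.distance import pdist, squareform
from scipy.stats import pearsonr
from sklearn.cluster import KMeans
from sklearn.datasets import load_iris

X = load_iris().data
for k in range(2, 7):
    labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(X)
    centroids = np.array([X[labels == c].mean(axis=0) for c in range(k)])

    d_points = squareform(pdist(X))          # pairwise point distances
    d_cent = squareform(pdist(centroids))    # pairwise centroid distances
    i, j = np.triu_indices(len(X), k=1)
    corr, _ = pearsonr(d_points[i, j], d_cent[labels[i], labels[j]])
    print(k, round(corr, 3))                 # peaks suggest candidate k values
```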

Keywords: clustering algorithm, cluster validity measure, correlation, data partitions, iris data set, marketing, pattern recognition

Procedia PDF Downloads 103
23811 Analyzing Competitive Advantage of Internet of Things and Data Analytics in Smart City Context

Authors: Petra Hofmann, Dana Koniel, Jussi Luukkanen, Walter Nieminen, Lea Hannola, Ilkka Donoghue

Abstract:

The Covid-19 pandemic forced people to isolate and become physically less connected. The pandemic has not only reshaped people’s behaviours and needs but also accelerated digital transformation (DT). The DT of cities has become an imperative, with the outlook of converting them into smart cities in the future. Embedding digital infrastructure and smart city initiatives into the normal design, construction, and operation of cities provides a unique opportunity to improve connection between people. The Internet of Things (IoT) is an emerging technology and one of the drivers of DT. It has disrupted many industries by introducing new services and business models, and IoT solutions are being applied in multiple fields, including smart cities. As IoT and data are fundamentally linked, IoT solutions can only create value if the data generated by IoT devices are analysed properly. By extracting relevant conclusions and actionable insights using established techniques, data analytics contributes significantly to the growth and success of IoT applications and investments. Companies must grasp DT and be prepared to redesign their offerings and business models to remain competitive in today’s marketplace. As there are many IoT solutions available today, the amount of data is tremendous, and the challenge for companies is to understand which solutions to focus on, how to prioritise, and which data to use to differentiate themselves from the competition. This paper explains how IoT and data analytics can affect competitive advantage and how companies should approach IoT and data analytics to translate them into concrete offerings and solutions in the smart city context. The study was carried out as qualitative, literature-based research. A case study is provided to validate the preservation of a company’s competitive advantage through smart city solutions. The results of the research contribute insights into the different factors and considerations related to creating competitive advantage through IoT and data analytics deployment in the smart city context. Furthermore, this paper proposes a framework that merges these factors and considerations with examples of offerings and solutions in smart cities. The data collected through IoT devices, and their intelligent use, can create a competitive advantage for companies operating in the smart city business. Companies should take into consideration the five forces of competition that shape industries and pay attention to the technological, organisational, and external contexts that define the factors to consider for competitive advantage in the field of IoT and data analytics. Companies that can utilise these key assets in their businesses will most likely conquer the markets and have a strong foothold in the smart city business.

Keywords: internet of things, data analytics, smart cities, competitive advantage

Procedia PDF Downloads 94
23810 Social Data-Based User Profile Enrichment

Authors: Amel Hannech, Mehdi Adda, Hamid Mcheick

Abstract:

In this paper, we propose a generic model of the user profile integrating several elements that may positively impact the search process. We exploit the classical behavior of users and integrate a process that delimits their search activities into several search sessions enriched with contextual and temporal information, which makes it possible to reflect the current interests of these users in every period of time and to infer data freshness. We argue that the annotation of resources gives more transparency about users’ needs. It also strengthens social links among resources and users and can thus increase the scope of the user profile. Based on this idea, we integrate the practice of social tagging in order to exploit users’ social behavior to enrich their profiles. These profiles are then integrated into a recommendation system in order to predict the personalized items of interest to users, assisting them in their searches and further enriching their profiles. Through this recommendation, we provide users with new search experiences.

Keywords: user profiles, topical ontology, contextual information, folksonomies, tags' clusters, data freshness, association rules, data recommendation

Procedia PDF Downloads 265
23809 Impact of External Temperature on the Speleothem Growth in the Moravian Karst

Authors: Frantisek Odvarka

Abstract:

Based on data from the Moravian Karst, the influence of selected meteorological factors on calcite speleothem growth was evaluated. External temperature was identified as one of the main factors influencing speleothem growth in the Moravian Karst. This factor significantly influences the CO₂ concentration in the soil/epikarst and in the cave atmosphere, and significantly contributes to changes in the difference in CO₂ partial pressure between the soil/epikarst and the cave atmosphere, which determines the supersaturation of drip water with respect to calcite and the quantity of calcite precipitated in the Moravian Karst cave environment. External air temperatures and cave air temperatures were measured using a COMET S3120 data logger, which can measure temperatures in the range from -30 to +80 °C with an accuracy of ± 0.4 °C. CO₂ concentrations in the cave and soils were measured with an FT A600 CO₂H Ahlborn probe (range 0 to 10,000 ppmv, accuracy 1 ppmv), connected to an ALMEMO 2290-4 V5 Ahlborn data logger. Soil temperature was measured with an FHA646E1 Ahlborn probe (range -20 to 70 °C, accuracy ± 0.4 °C) connected to an ALMEMO 2290-4 V5 Ahlborn data logger. The airflow velocities into and out of the cave were monitored by an FVA395 TH4 thermo-anemometer (range 0.05 to 2 m s⁻¹, accuracy ± 0.04 m s⁻¹) connected to an ALMEMO 2590-4 V5 Ahlborn data logger for recording. The flow was measured at the lower and upper entrances of the Imperial Cave. The data were analyzed in MS Office Excel 2019 and PHREEQC.

Keywords: speleothem growth, carbon dioxide partial pressure, Moravian Karst, external temperature

Procedia PDF Downloads 144
23808 Using Data Mining in Automotive Safety

Authors: Carine Cridelich, Pablo Juesas Cano, Emmanuel Ramasso, Noureddine Zerhouni, Bernd Weiler

Abstract:

Safety is one of the most important considerations when buying a new car. While active safety aims at avoiding accidents, passive safety systems such as airbags and seat belts protect the occupants in case of an accident. In addition to legal regulations, organizations like Euro NCAP provide consumers with an independent assessment of the safety performance of cars and drive the development of safety systems in the automobile industry. Those ratings are mainly based on injury assessment reference values derived from physical parameters measured in dummies during a car crash test. The components and sub-systems of a safety system are designed to achieve the required restraint performance. Sled tests and other types of tests are then carried out by car makers and their suppliers to confirm the protection level of the safety system. A Knowledge Discovery in Databases (KDD) process is proposed in order to minimize the number of tests. The KDD process is based on data emerging from sled tests conducted according to Euro NCAP specifications. About 30 parameters of the passive safety systems from different data sources (crash data, dummy protocol) are first analysed together with experts’ opinions. A procedure to manage missing data is proposed and validated on real data sets. Finally, a procedure is developed to estimate a set of rough initial parameters of the passive system before testing, aiming at reducing the number of tests.

Keywords: KDD process, passive safety systems, sled test, dummy injury assessment reference values, frontal impact

Procedia PDF Downloads 382
23807 Comparative Study of Greenhouse Locations through Satellite Images and Geographic Information System: Methodological Evaluation in Venezuela

Authors: Maria A. Castillo H., Andrés R. Leandro C.

Abstract:

During the last decades, agricultural productivity in Latin America has increased with precision agriculture and more efficient agricultural technologies. The use of automated systems, satellite images, geographic information systems, tools for data analysis, and artificial intelligence has contributed to more effective strategic decisions. Twenty years ago, the state of Mérida, located in the Venezuelan Andes, reported the largest area covered by greenhouses in the country, where certified seeds of potatoes, vegetables, ornamentals, and flowers were produced for export and for consumption in the central region of the country. In recent years, it is estimated that production under greenhouses has changed and the area covered has decreased due to different factors, but there are few historical statistical data of sufficient quantity and quality to support this estimate or to be used for analysis and decision-making. The objective of this study is to compare data collected on the geoposition, use, and covered area of the greenhouses in 2007 with data available in 2021, as support for the analysis of the current situation of horticultural production in the main municipalities of the state of Mérida. The document presents the development of the work through its diagnosis, integration of geographic coordinates into GIS, and data analysis phases. As a result, an evaluation of the process is made, a dashboard is presented with the most relevant data along with the geographic coordinates integrated into GIS, and the information obtained is analyzed. Finally, some recommendations for action are added, and work that would expand the information obtained and its geographical traceability over time is proposed. This study contributes greater certainty to the data supporting the evaluation of social, environmental, and economic sustainability indicators and to better decisions in line with the sustainable development goals in the area under review. At the same time, the methodology provides improvements to the agricultural data collection process that can be extended to other study areas and crops.

Keywords: greenhouses, geographic information system, protected agriculture, data analysis, Venezuela

Procedia PDF Downloads 93
23806 Modelling Consistency and Change of Social Attitudes in 7 Years of Longitudinal Data

Authors: Paul Campbell, Nicholas Biddle

Abstract:

There is a complex, endogenous relationship between individual circumstances, attitudes, and behaviour. This study uses longitudinal panel data to assess changes in social and political attitudes over a 7-year period. Attitudes are captured with the question 'what is the most important issue facing Australia today', collected at multiple time points in a longitudinal survey of 2200 Australians. Consistency of attitudes, and factors predicting change over time, are assessed. The consistency of responses has methodological implications for data collection, specifically how often such questions ought to be asked of a population. When change in attitude is observed, this study assesses the extent to which individual demographic characteristics, personality traits, and broader societal events predict change.

Keywords: attitudes, longitudinal survey analysis, personality, social values

Procedia PDF Downloads 133
23805 A Generic Middleware to Instantly Sync Intensive Writes of Heterogeneous Massive Data via the Internet

Authors: Haitao Yang, Zhenjiang Ruan, Fei Xu, Lanting Xia

Abstract:

Industry data centers often need to sync data changes reliably and instantly from a large set of heterogeneous, autonomous relational databases accessed via the not-so-reliable Internet, for which a practical universal sync middleware of low maintenance and operation cost is much wanted; but developing such a product and adapting it for various scenarios is a sophisticated and continuous practice. The authors have been devising, applying, and optimizing a generic sync middleware system named GSMS since 2006, built on the principles that the middleware must be SyncML-compliant and transparent to application-layer logic, must not depend on the implementation details of the databases synced or on the host operating systems deployed, and must be lightweight in construction and hence of low cost. A series of stress experiments on GSMS sync performance was conducted as a persuasive example: a source relational database underwent a broad range of write loads, from one thousand to one million intensive writes within a few minutes. The tests showed that GSMS achieves instant sync at well below a millisecond per record, and its smooth performance under extreme write loads demonstrated its feasibility and competence.
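
GSMS’s internals are not spelled out in the abstract, but the general change-capture-and-apply pattern behind such middleware can be sketched as follows; the schema and the in-memory SQLite databases are illustrative assumptions.

```python
# Minimal sketch: capture writes in a change log, then apply them downstream.
import sqlite3

src = sqlite3.connect(":memory:")
dst = sqlite3.connect(":memory:")
for db in (src, dst):
    db.execute("CREATE TABLE item (id INTEGER PRIMARY KEY, val TEXT)")
# The source records every write together with a monotonic sequence number.
src.execute("CREATE TABLE change_log "
            "(seq INTEGER PRIMARY KEY AUTOINCREMENT, id INTEGER, val TEXT)")

def write(item_id, val):
    src.execute("INSERT OR REPLACE INTO item VALUES (?, ?)", (item_id, val))
    src.execute("INSERT INTO change_log (id, val) VALUES (?, ?)", (item_id, val))

last_seq = 0
def sync_once():
    """Replay all source changes newer than the last applied sequence number."""
    global last_seq
    rows = src.execute("SELECT seq, id, val FROM change_log WHERE seq > ?",
                       (last_seq,)).fetchall()
    for seq, item_id, val in rows:
        dst.execute("INSERT OR REPLACE INTO item VALUES (?, ?)", (item_id, val))
        last_seq = seq

write(1, "a"); write(2, "b")
sync_once()
print(dst.execute("SELECT * FROM item").fetchall())  # [(1, 'a'), (2, 'b')]
```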

Keywords: heterogeneous massive data, instantly sync intensive writes, Internet generic middleware design, optimization

Procedia PDF Downloads 122
23804 Building Transparent Supply Chains through Digital Tracing

Authors: Penina Orenstein

Abstract:

In today’s world, particularly with COVID-19 a constant worldwide threat, organizations need greater visibility over their supply chains more than ever before, in order to find areas for improvement and greater efficiency, reduce the chances of disruption, and stay competitive. Supply chain mapping is the practice of mapping in detail every process and route between each vendor and supplier. The simplest method of mapping involves sourcing publicly available data, including news and financial information concerning relationships between suppliers. An additional layer of information would be disclosed by large, direct suppliers about their production and logistics sites. While this method has the advantage of not requiring any input from suppliers, it also does not allow much transparency beyond the first supplier tier and may generate irrelevant data (noise) that must be filtered out to find the actionable data. The primary goal of this research is to build data maps of supply chains by focusing on a layered approach. Using these maps, the secondary goal is to address the question of whether the supply chain can be re-engineered to make improvements, for example, to lower the carbon footprint. Using a drill-down approach, the end result is a comprehensive map detailing the linkages between tier-one, tier-two, and tier-three suppliers, superimposed on a geographical map. The driving force behind this idea is the ability to trace individual parts to the exact site where they are manufactured. In this way, companies can ensure sustainability practices from the production of raw materials through to the finished goods. The approach allows companies to identify and anticipate vulnerabilities in their supply chain. It unlocks predictive analytics capabilities and enables them to act proactively. The research is particularly compelling because it unites network science theory with empirical data and presents the results in a visual, intuitive manner.
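
A minimal sketch of the layered drill-down using networkx; the supplier names and tiers are invented for illustration.

```python
# Minimal sketch: a tiered supplier map as a directed graph (invented names).
import networkx as nx

G = nx.DiGraph()
G.add_edges_from([
    ("RawOreCo", "ComponentCo"),    # tier 3 -> tier 2
    ("ChemCo", "ComponentCo"),      # tier 3 -> tier 2
    ("ComponentCo", "AssemblyCo"),  # tier 2 -> tier 1
    ("AssemblyCo", "BrandCo"),      # tier 1 -> focal company
])

# Drill down: every upstream site that ultimately feeds the focal company.
upstream = nx.ancestors(G, "BrandCo")
print(sorted(upstream))

# Flag funnel points: many inputs converging on a single outgoing route,
# a crude indicator of supply chain vulnerability.
print([n for n in upstream if G.in_degree(n) >= 2 and G.out_degree(n) == 1])
```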

Keywords: data mining, supply chain, empirical research, data mapping

Procedia PDF Downloads 175
23803 Synoptic Analysis of a Heavy Flood in the Province of Sistan-Va-Balouchestan: Iran January 2020

Authors: N. Pegahfar, P. Ghafarian

Abstract:

In this research, the synoptic weather conditions during the heavy flood of 10-12 January 2020 in the Sistan-va-Balouchestan Province of Iran are analyzed. To this aim, reanalysis data from the National Centers for Environmental Prediction (NCEP) and the National Center for Atmospheric Research (NCAR), NCEP Global Forecasting System (GFS) analysis data, measured data from a surface station, and satellite images from the European Organisation for the Exploitation of Meteorological Satellites (EUMETSAT) have been used for 9 to 12 January 2020. Atmospheric parameters at both the lower and the upper troposphere have been used, including absolute vorticity, wind velocity, temperature, geopotential height, relative humidity, and precipitation. Results indicated that both lower-level and upper-level currents were strong. In addition, the transport of a large amount of humidity from the Oman Sea and the Red Sea to the south and southeast of Iran (Sistan-va-Balouchestan Province) led to the vast and unexpected precipitation and then a heavy flood.

Keywords: Sistan-va-Balouchestan Province, heavy flood, synoptic, analysis data

Procedia PDF Downloads 102
23802 Role of Machine Learning in Internet of Things Enabled Smart Cities

Authors: Amit Prakash Singh, Shyamli Singh, Chavi Srivastav

Abstract:

This paper presents the idea of the Internet of Things (IoT) for the infrastructure of smart cities. The Internet of Things has been envisioned as a communication paradigm that incorporates a myriad of digital services. The various components of smart cities shall be implemented using microprocessors, microcontrollers, sensors for network communication, and protocols. IoT-enabled systems have been devised to support the smart city vision, whose aim is to exploit currently available advanced communication technologies to support value-added services for the functioning of the city. Due to the volume, variety, and velocity of the data, analysis using Big Data concepts is required. This paper presents the various techniques used to analyze big data using machine learning.

Keywords: IoT, smart city, embedded systems, sustainable environment

Procedia PDF Downloads 575
23801 Convolutional Neural Network Based on Random Kernels for Analyzing Visual Imagery

Authors: Ja-Keoung Koo, Kensuke Nakamura, Hyohun Kim, Dongwha Shin, Yeonseok Kim, Ji-Su Ahn, Byung-Woo Hong

Abstract:

Machine learning techniques based on convolutional neural networks (CNN) have been actively developed and successfully applied to a variety of image analysis tasks, including reconstruction, noise reduction, resolution enhancement, segmentation, motion estimation, and object recognition. Classical visual information processing, ranging from low-level tasks to high-level ones, has been widely developed within the deep learning framework. Deriving visual interpretation from high-dimensional imagery data is generally considered a challenging problem. A CNN is a class of feed-forward artificial neural network that usually consists of deep layers whose connections are established by a series of non-linear operations. The CNN architecture is known to be shift invariant due to its shared weights and translation-invariance characteristics. However, it is often computationally intractable to optimize the network, particularly with a large number of convolution layers, due to the large number of unknowns to be optimized with respect to a training set that generally has to be large enough to effectively generalize the model under consideration. It is also necessary to limit the size of the convolution kernels due to computational expense, despite the recent development of effective parallel processing machinery; this leads to the use of consistently small convolution kernels throughout a deep CNN architecture. However, it is often desirable to consider different scales in the analysis of visual features at different layers in the network. We therefore propose a CNN model in which convolution kernels of different sizes are applied at each layer, based on random projection. We apply random filters of varying sizes and associate the filter responses with scalar weights that correspond to the standard deviations of the random filters. This allows a large number of random filters at the cost of one scalar unknown per filter. The computational cost of the back-propagation procedure does not increase with larger filter sizes, even though additional computation is required for the convolutions in the feed-forward procedure. The use of random kernels of varying sizes makes it possible to effectively analyze image features at multiple scales, leading to better generalization. The robustness and effectiveness of the proposed CNN based on random kernels are demonstrated by numerical experiments in which the well-known CNN architectures are quantitatively compared with our models, which simply replace the convolution kernels with random filters. The experimental results indicate that our model achieves better performance with a smaller number of unknown weights. The proposed algorithm has high potential for application to a variety of visual tasks based on the CNN framework. Acknowledgement: This work was supported by the MISP (Ministry of Science and ICT), Korea, under the National Program for Excellence in SW (20170001000011001) supervised by IITP, and NRF-2014R1A2A1A11051941, NRF2017R1A2B4006023.
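
A minimal PyTorch sketch of the mechanism as described, fixed random kernels of several sizes with one learnable scalar per filter; layer sizes and the way the multi-scale responses are combined are assumptions, not the paper's exact architecture.

```python
# Minimal sketch: frozen random filters of several sizes, each filter response
# scaled by a single learnable scalar weight.
import torch
import torch.nn as nn
import torch.nn.functional as F

class RandomKernelConv(nn.Module):
    def __init__(self, in_ch, out_ch, kernel_sizes=(3, 5, 7)):
        super().__init__()
        self.banks, self.scales = nn.ParameterList(), nn.ParameterList()
        for k in kernel_sizes:
            w = torch.randn(out_ch, in_ch, k, k)
            self.banks.append(nn.Parameter(w, requires_grad=False))  # frozen random filters
            self.scales.append(nn.Parameter(torch.ones(out_ch)))     # one scalar per filter

    def forward(self, x):
        outs = []
        for w, s in zip(self.banks, self.scales):
            y = F.conv2d(x, w, padding=w.shape[-1] // 2)  # 'same' padding per size
            outs.append(y * s.view(1, -1, 1, 1))          # scale each filter response
        return sum(outs)                                  # combine the scales

layer = RandomKernelConv(3, 16)
print(layer(torch.randn(1, 3, 32, 32)).shape)  # torch.Size([1, 16, 32, 32])
```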

Keywords: deep learning, convolutional neural network, random kernel, random projection, dimensionality reduction, object recognition

Procedia PDF Downloads 290
23800 Machine Learning Classification of Fused Sentinel-1 and Sentinel-2 Image Data Towards Mapping Fruit Plantations in Highly Heterogeneous Landscapes

Authors: Yingisani Chabalala, Elhadi Adam, Khalid Adem Ali

Abstract:

Mapping smallholder fruit plantations using optical data is challenging due to morphological landscape heterogeneity and crop types with overlapping spectral signatures. Furthermore, cloud cover limits the use of optical sensing, especially in subtropical climates where it is persistent. This research assessed the effectiveness of Sentinel-1 (S1) and Sentinel-2 (S2) data for mapping fruit trees and co-existing land-use types, applying support vector machine (SVM) and random forest (RF) classifiers independently. These classifiers were also applied to fused data from the two sensors. Feature ranks were extracted using the RF mean decrease accuracy (MDA) and forward variable selection (FVS) to identify optimal spectral windows for classifying fruit trees. Based on RF MDA and FVS, the SVM classifier achieved relatively high classification accuracy, with overall accuracy (OA) = 91.6% and kappa coefficient = 0.91, when applied to the fused satellite data. Applying SVM to S1, S2, the S2 selected variables, and the S1S2 fusion independently produced OA = 27.64% and kappa = 0.13; OA = 87% and kappa = 0.87; OA = 69.33% and kappa = 0.69; and OA = 87.01% and kappa = 0.87, respectively. Results also indicated that the optimal spectral bands for fruit tree mapping are green (B3) and SWIR_2 (B10) for S2, and the vertical-horizontal (VH) polarization band for S1. Including the textural metrics from the VV channel improved the discrimination of crops and co-existing land-cover types. The fusion approach proved robust and well suited for accurate smallholder fruit plantation mapping.
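
The fusion-and-classify step can be sketched with scikit-learn; the synthetic arrays stand in for per-pixel S1 backscatter and S2 reflectance samples, and impurity-based RF importances stand in for the paper's mean decrease accuracy ranking.

```python
# Minimal sketch: layer-stack fusion, RF feature ranking, then SVM classification.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

rng = np.random.default_rng(0)
s1 = rng.normal(size=(1000, 2))          # e.g. VV, VH backscatter
s2 = rng.random(size=(1000, 10))         # e.g. ten S2 surface-reflectance bands
y = rng.integers(0, 5, 1000)             # fruit trees + co-existing land uses

X = np.hstack([s1, s2])                  # simple layer-stack fusion
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

# Rank features with RF importances, keep the top few, then classify with SVM.
rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
top = np.argsort(rf.feature_importances_)[::-1][:6]
svm = SVC(kernel="rbf").fit(X_tr[:, top], y_tr)
print("OA:", svm.score(X_te[:, top], y_te))
```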

Keywords: smallholder agriculture, fruit trees, data fusion, precision agriculture

Procedia PDF Downloads 54
23799 A Tactic for a Cosmopolitan City Comparison through a Data-Driven Approach: Case of Climate City Networking

Authors: Sombol Mokhles

Abstract:

Tackling climate change requires expanding networking opportunities between a diverse range of cities to accelerate climate action. Existing climate city networks are limited in how actively they engage “ordinary” cities in city-to-city networking processes, as they encourage a few powerful cities to be followed by the many “ordinary” ones. To reimagine networking opportunities between cities beyond the global cities, this paper incorporates “cosmopolitan comparison” to expand our knowledge of a diverse range of cities using a data-driven approach. Through a cosmopolitan perspective, a framework is presented for how to utilise large data sets to expand knowledge of cities beyond the global cities and reimagine existing hierarchical networking practices. The contribution of this framework extends beyond urban climate governance to other fields that strive for a more inclusive, cosmopolitan comparison attentive to the differences across cities.

Keywords: cosmopolitan city comparison, data-driven approach, climate city networking, urban climate governance

Procedia PDF Downloads 111