Search results for: open data
26798 Numerical Flow Simulation around HSP Propeller in Open Water and behind a Vessel Wake Using RANS CFD Code
Authors: Kadda Boumediene, Mohamed Bouzit
Abstract:
The prediction of the flow around marine propellers and of hull-propeller interaction is one of the challenges of computational fluid dynamics (CFD). CFD has emerged as a potential tool in recent years and has promising applications. The objective of the current study is to predict the hydrodynamic performance of the HSP marine propeller in open water and behind a vessel. The 3-D flow was modeled numerically using the standard k-ω and k-ω SST turbulence models for the steady and unsteady cases, respectively. The hydrodynamic performance measures, such as the torque and thrust coefficients and the efficiency, show good agreement with the experimental results.
Keywords: Seiun Maru propeller, steady, unsteady, CFD, HSP
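The torque and thrust coefficients and the efficiency compared with experiment above are conventionally the open-water coefficients KT, KQ, and η₀; a minimal sketch of the standard definitions follows (the numerical inputs are illustrative only, not data from this study).

```python
# Standard open-water propeller coefficients (a minimal sketch; the
# numerical inputs below are illustrative, not data from this study).
from math import pi

def open_water_coefficients(thrust, torque, rho, n, d, v_a):
    """Return advance ratio J, thrust coefficient KT, torque
    coefficient KQ, and open-water efficiency eta_0.

    thrust [N], torque [N*m], rho [kg/m^3], n [rev/s], d [m], v_a [m/s].
    """
    j = v_a / (n * d)                  # advance ratio
    kt = thrust / (rho * n**2 * d**4)  # thrust coefficient
    kq = torque / (rho * n**2 * d**5)  # torque coefficient
    eta0 = (j / (2 * pi)) * (kt / kq)  # open-water efficiency
    return j, kt, kq, eta0

# Illustrative values only:
j, kt, kq, eta0 = open_water_coefficients(
    thrust=1500.0, torque=400.0, rho=1025.0, n=10.0, d=0.25, v_a=1.5)
```

These are the quantities typically plotted against J on an open-water diagram when validating a RANS prediction against towing-tank data.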
Procedia PDF Downloads 305
26797 Statistic Regression and Open Data Approach for Identifying Economic Indicators That Influence e-Commerce
Authors: Apollinaire Barme, Simon Tamayo, Arthur Gaudron
Abstract:
This paper presents a statistical approach to identify explanatory variables linearly related to e-commerce sales. The proposed methodology allows specifying a regression model in order to quantify the relevance between openly available data (economic and demographic) and national e-commerce sales. It consists of collecting data, preselecting input variables, performing regressions to choose variables and models, and testing and validating. The usefulness of the proposed approach is twofold: on the one hand, it allows identifying the variables that influence e-commerce sales with an accessible approach; on the other hand, it can be used to model future sales from the input variables. Results show that e-commerce is linearly dependent on 11 economic and demographic indicators.
Keywords: e-commerce, statistical modeling, regression, empirical research
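The regression step described above can be sketched minimally with ordinary least squares; the single indicator and its values below are invented for illustration (the study itself relates 11 indicators to national sales, typically with a statistical package rather than hand-rolled code).

```python
# Minimal sketch of the regression step: fitting a simple linear model
# relating one openly available indicator to e-commerce sales.
# All values are illustrative, not the study's data.

def fit_simple_ols(x, y):
    """Ordinary least squares for y = a + b*x; returns (a, b, r_squared)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    b = sxy / sxx                      # slope
    a = my - b * mx                    # intercept
    ss_res = sum((yi - (a + b * xi)) ** 2 for xi, yi in zip(x, y))
    ss_tot = sum((yi - my) ** 2 for yi in y)
    r2 = 1 - ss_res / ss_tot           # goodness of fit
    return a, b, r2

# Hypothetical indicator (e.g., internet penetration %) vs. a sales index:
indicator = [60, 65, 70, 75, 80]
sales = [100, 112, 118, 131, 140]
a, b, r2 = fit_simple_ols(indicator, sales)
```

A high r² on held-out data is what the "testing and validating" step checks before an indicator is retained in the model.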
Procedia PDF Downloads 226
26796 Multi Cloud Storage Systems for Resource Constrained Mobile Devices: Comparison and Analysis
Authors: Rajeev Kumar Bedi, Jaswinder Singh, Sunil Kumar Gupta
Abstract:
Cloud storage is a model of online data storage where data is stored in a virtualized pool of servers hosted by third parties (CSPs) and located in different geographical locations. Cloud storage revolutionized the way users access their data online, anywhere, anytime, and using any device such as a tablet, mobile phone, or laptop. Issues such as vendor lock-in, frequent service outages, data loss, and performance problems exist in single cloud storage systems, so to evade these issues, the concept of multi-cloud storage was introduced. Many multi-cloud storage systems for mobile devices exist in the market. In this article, we compare four of them (Otixo, Unclouded, CloudFuze, and Clouds) and evaluate their performance in terms of CPU usage, battery consumption, time consumption, and data usage on three devices (Nexus 5, Moto G, and Nexus 7 tablet) over a Wi-Fi network. Finally, open research challenges and future scope are discussed.
Keywords: cloud storage, multi cloud storage, vendor lock-in, mobile devices, mobile cloud computing
Procedia PDF Downloads 407
26795 Risks beyond Cyber in IoT Infrastructure and Services
Authors: Mattias Bergstrom
Abstract:
Significance of the Study: This research will provide new insights into the risks of digital embedded infrastructure. We analyze each risk and its potential mitigation strategies, especially for AI and autonomous automation. Moreover, the analysis presented in this paper will convey valuable information for future research that can create more stable, secure, and efficient autonomous systems. To learn and understand the risks, a large IoT system was envisioned, and risks related to hardware, tampering, and cyberattacks were collected, researched, and evaluated to create a comprehensive understanding of the potential risks. Potential solutions were then evaluated on an open source IoT hardware setup. The list below shows the identified passive and active risks evaluated in the research. Passive risks: (1) Hardware failures: critical systems relying on high-rate data and data quality are growing; SCADA systems for infrastructure are good examples of such systems. (2) Hardware delivering erroneous data: sensors break, and when they do, they don't always go silent; they can keep going, except that the data they deliver is garbage, and if that data is not filtered out, it becomes disruptive noise in the system. (3) Bad hardware injection: erroneous sensor data can be pumped into a system by malicious actors with the intent to create disruptive noise in critical systems. (4) Data gravity: the weight of the data collected will affect data mobility. (5) Cost inhibitors: running services that need huge centralized computing is cost-inhibiting; large, complex AI can be extremely expensive to run. Active risks: Denial of service: one of the simplest attacks, where an attacker just overloads the system with bogus requests so that valid requests disappear in the noise.
Malware: malware can be anything from simple viruses to complex botnets created with specific goals, where the creator steals computing power and bandwidth from you to attack someone else. Ransomware: a kind of malware, but so different in its implementation that it is worth its own mention; the goal of these pieces of software is to encrypt your system so that it can only be unlocked with a key that is held for ransom. DNS spoofing: by spoofing DNS calls, valid requests and data dumps can be sent to bad destinations, where the data can be extracted for extortion, or corrupted and re-injected into a running system, creating a data echo noise loop. After testing multiple potential solutions, we found that the most prominent solution to these risks was to use a peer-to-peer consensus algorithm over a blockchain to validate the data and behavior of the devices (sensors, storage, and computing) in the system. With the devices autonomously policing themselves for deviant behavior, all the risks listed above can be negated. In conclusion, an Internet middleware that provides these features would be an easy and secure solution for any future autonomous IoT deployments, as it provides separation from the open Internet while remaining accessible over the blockchain keys.
Keywords: IoT, security, infrastructure, SCADA, blockchain, AI
Procedia PDF Downloads 107
26794 The Use of Network Tool for Brain Signal Data Analysis: A Case Study with Blind and Sighted Individuals
Authors: Cleiton Pons Ferreira, Diana Francisca Adamatti
Abstract:
Advancements in computer technology have made it possible to obtain information for research in biology and neuroscience. To interpret the data from these studies, networks have long been used to represent important biological processes, and the use of these tools has shifted from purely illustrative and didactic to more analytic, even including interaction analysis and hypothesis formulation. Many studies have involved this application, but not directly for the interpretation of data obtained from brain functions, which calls for new development perspectives in neuroinformatics using existing tool models already disseminated by bioinformatics. This study includes an analysis of neurological data from electroencephalogram (EEG) signals, using Cytoscape, an open source software tool for visualizing complex networks in biological databases. The data were obtained from a comparative case study developed in research at the University of Rio Grande (FURG), using the EEG signals from a Brain Computer Interface (BCI) with 32 electrodes, recorded from a blind and a sighted individual during the execution of an activity that stimulated spatial ability. This study intends to present results that lead to better ways to use and adapt techniques supporting the treatment of brain signal data, to elevate understanding and learning in neuroscience.
Keywords: neuroinformatics, bioinformatics, network tools, brain mapping
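One common way to turn multi-channel EEG signals into a network of the kind Cytoscape can visualize is to threshold pairwise Pearson correlations between channels; the sketch below illustrates the idea with invented channel names and toy signals (the study's own pipeline is not specified at this level of detail).

```python
# Hedged sketch: building a channel-correlation network from EEG-like
# signals. The resulting edge list is the kind of input a tool such as
# Cytoscape can import. Channel names and samples are invented.

def pearson(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

def correlation_edges(signals, threshold=0.8):
    """signals: dict channel -> samples; returns [(ch1, ch2, r), ...]."""
    chans = sorted(signals)
    edges = []
    for i, c1 in enumerate(chans):
        for c2 in chans[i + 1:]:
            r = pearson(signals[c1], signals[c2])
            if abs(r) >= threshold:   # keep only strong links
                edges.append((c1, c2, r))
    return edges

toy = {
    "Fp1": [1.0, 2.0, 3.0, 4.0],
    "Fp2": [1.1, 2.1, 2.9, 4.2],   # strongly correlated with Fp1
    "O1":  [4.0, 1.0, 3.0, 2.0],   # weakly correlated
}
edges = correlation_edges(toy, threshold=0.9)
```

Exported as a simple edge table, such a network lets graph metrics (degree, clustering) be compared between the blind and sighted recordings.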
Procedia PDF Downloads 182
26793 Free and Open Source Software for BIM Workflow of Steel Structure Design
Authors: Danilo Di Donato
Abstract:
The continuous new releases of free and open source software (FOSS) and the high costs of proprietary software, whose monopoly is characterized by closed code and a low level of implementation and customization by end-users, impose a reflection on the tools that can be chosen and adopted for the design and representation of new steel constructions. The paper aims to show experimentation carried out to verify the actual potential and effective applicability of FOSS support for the BIM modeling of steel structures, particularly considering the goals of a possible workflow: to achieve a high level of development (LOD) and to allow effective interchange between different software. To this end, the examined software packages are those with open source or freeware licenses, in order to evaluate their use in architectural praxis. The test primarily involved experimentation with FreeCAD, the only open source software that allows a complete and integrated BIM workflow; the results were then compared with those of two proprietary packages, SketchUp and Tekla BIMsight, which are released in free versions not usable for commercial purposes. The experiments carried out on open source and freeware software were then compared with the outcomes obtained with two proprietary packages, SketchUp Pro and Tekla Structures, the latter of which has modules particularly addressed to the design of steel structures. This evaluation concerned several comparative criteria, defined on the basis of categories related to the reliability, efficiency, potentiality, achievable LOD, and user-friendliness of the analyzed software packages. In order to verify the actual outcomes of FOSS BIM for steel structure projects, these results were compared with a simulation related to a real case study, carried out with proprietary BIM modeling software.
The same design theme, the project of a shelter for a public space, was therefore developed using the different software packages. The purpose of the contribution is thus to assess the developments and potentialities inherent in FOSS BIM, in order to estimate its effective applicability to professional practice, its limits, and the new fields of research it proposes.
Keywords: BIM, steel buildings, FOSS, LOD
Procedia PDF Downloads 174
26792 Assessing Vertical Distribution of Soil Organic Carbon Stocks in Westleigh Soil under Shrub Encroached Rangeland, Limpopo Province, South Africa
Authors: Abel L. Masotla, Phesheya E. Dlamini, Vusumuzi E. Mbanjwa
Abstract:
Accurate quantification of the vertical distribution of soil organic carbon (SOC) in relation to the land cover transformations associated with shrub encroachment is crucial, because deeper-lying horizons have been shown to have a greater capacity to sequester SOC. Despite this, in-depth soil carbon dynamics remain poorly understood, especially in arid and semi-arid rangelands. The objective of this study was to quantify and compare the vertical distribution of soil organic carbon stocks (SOCs) in shrub-encroached and open grassland sites. To achieve this, soil samples were collected vertically at 10 cm depth intervals at both sites. The results showed that SOC was on average 19% and 13% greater in the topsoil and subsoil, respectively, under shrub-encroached grassland compared to open grassland. In both topsoil and subsoil, higher SOC stocks were found under shrub-encroached grassland (4.53 kg m⁻² and 3.90 kg m⁻²) relative to open grassland (4.39 kg m⁻² and 3.67 kg m⁻²). These results demonstrate that deeper soil horizons play a critical role in the storage of SOC in savanna grasslands.
Keywords: savanna grasslands, shrub-encroachment, soil organic carbon, vertical distribution
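Per-horizon SOC stocks of the kind reported above (kg C m⁻²) are conventionally computed as carbon concentration × bulk density × horizon thickness, summed over depth intervals; a sketch with invented profile values follows (not the study's measurements).

```python
# Hedged sketch of the usual SOC stock calculation per depth interval:
# stock [kg C m^-2] = (SOC% / 100) * bulk density [kg m^-3] * thickness [m].
# The profile values below are invented, not the study's measurements.

def soc_stock(soc_percent, bulk_density, thickness_m):
    """SOC stock for one horizon, in kg C per m^2."""
    return (soc_percent / 100.0) * bulk_density * thickness_m

def profile_stock(horizons):
    """Total stock over (soc_percent, bulk_density, thickness_m) tuples."""
    return sum(soc_stock(*h) for h in horizons)

# 0-10 cm and 10-20 cm intervals of a hypothetical profile:
profile = [(1.2, 1300.0, 0.10), (0.8, 1450.0, 0.10)]
total = profile_stock(profile)
```

Summing interval stocks down the profile is what allows topsoil and subsoil stocks to be compared between the two land-cover types.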
Procedia PDF Downloads 140
26791 Blockchain for IoT Security and Privacy in Healthcare Sector
Authors: Umair Shafique, Hafiz Usman Zia, Fiaz Majeed, Samina Naz, Javeria Ahmed, Maleeha Zainab
Abstract:
The Internet of Things (IoT) has become a hot topic over the last couple of years. This innovative technology has shown promising progress in various areas, and the world has witnessed exponential growth in multiple application domains. Researchers are working to investigate its capabilities and to harness its true potential. At the same time, however, IoT networks open up a new aspect of vulnerability and physical threats to data integrity, privacy, and confidentiality. This is due to centralized control, a data-silo approach to handling information, and a lack of standardization in IoT networks. Blockchain is a technology that involves creating secure distributed ledgers to store and communicate data; its benefits include resiliency, integrity, anonymity, decentralization, and autonomous control. The potential of blockchain technology to provide the key to managing and controlling IoT has created a new wave of excitement around the idea of putting that data back into the hands of end-users. In this manuscript, we propose a model that combines blockchain and IoT networks to address potential security and privacy issues in the healthcare domain. We then describe various application areas, challenges, and future directions in the healthcare sector where blockchain platforms merge with IoT networks.
Keywords: IoT, blockchain, cryptocurrency, healthcare, consensus, data
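The core integrity property motivating blockchain for health records can be illustrated with a minimal hash-linked ledger; this is a conceptual sketch only, not the model the abstract proposes, and the record fields are invented.

```python
# Conceptual sketch only: a minimal hash-linked ledger illustrating the
# tamper-evidence property that motivates pairing blockchain with IoT
# health records. This is NOT the paper's proposed model.
import hashlib
import json

def make_block(record, prev_hash):
    """Hash the record together with the previous block's hash."""
    body = json.dumps({"record": record, "prev": prev_hash}, sort_keys=True)
    return {"record": record, "prev": prev_hash,
            "hash": hashlib.sha256(body.encode()).hexdigest()}

def chain_is_valid(chain):
    """Recompute every hash and check each link to the previous block."""
    for i, blk in enumerate(chain):
        body = json.dumps({"record": blk["record"], "prev": blk["prev"]},
                          sort_keys=True)
        if blk["hash"] != hashlib.sha256(body.encode()).hexdigest():
            return False
        if i > 0 and blk["prev"] != chain[i - 1]["hash"]:
            return False
    return True

chain = [make_block({"patient": "anon-1", "hr": 72}, prev_hash="0" * 64)]
chain.append(make_block({"patient": "anon-1", "hr": 75}, chain[-1]["hash"]))
```

Altering any stored reading breaks the recomputed hashes, so tampering is detectable without trusting a central database.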
Procedia PDF Downloads 180
26790 Algorithm for Predicting Cognitive Exertion and Cognitive Fatigue Using a Portable EEG Headset for Concussion Rehabilitation
Authors: Lou J. Pino, Mark Campbell, Matthew J. Kennedy, Ashleigh C. Kennedy
Abstract:
A concussion is complex and nuanced, with cognitive rest being a key component of recovery. Cognitive overexertion during rehabilitation from a concussion is associated with delayed recovery. However, daily living imposes cognitive demands that may be unavoidable and difficult to quantify. Therefore, a portable tool capable of alerting patients before cognitive overexertion occurs could allow patients to maintain their quality of life while preventing symptoms and recovery setbacks. EEG allows for a sensitive measure of cognitive exertion, but clinical 32-lead EEG headsets are not practical for day-to-day concussion rehabilitation management. However, commercially available and affordable portable EEG headsets now exist. These headsets can potentially be used to continuously monitor cognitive exertion during mental tasks and alert the wearer to overexertion, with the aim of preventing symptoms and speeding recovery. The objective of this study was to test an algorithm for predicting cognitive exertion from EEG data collected with a portable headset. EEG data were acquired from 10 participants (5 males, 5 females). Each participant wore a portable 4-channel EEG headband while completing 10 tasks: rest (eyes closed), rest (eyes open), three logic puzzles of increasing difficulty, three sets of multiplication questions of increasing difficulty, rest (eyes open), and rest (eyes closed). After each task, the participant was asked to report their perceived level of cognitive exertion using the NASA Task Load Index (TLX). Each participant then completed a second session on a different day. A customized machine learning model was created using data from the first session. The performance of each model was then tested using data from the second session. The mean correlation coefficient between TLX scores and predicted cognitive exertion was 0.75 ± 0.16.
The results support the efficacy of the algorithm for predicting cognitive exertion. This demonstrates that the algorithms developed in this study, used with portable EEG devices, have the potential to aid in the concussion recovery process by monitoring and warning patients of cognitive overexertion. Preventing cognitive overexertion during recovery may reduce the number of symptoms a patient experiences and may help speed the recovery process.
Keywords: cognitive activity, EEG, machine learning, personalized recovery
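The evaluation reported above (a mean correlation of 0.75 ± 0.16 between predictions and TLX scores) can be sketched as a per-participant Pearson correlation summarized across participants; all scores below are invented for illustration.

```python
# Hedged sketch of the evaluation step: correlating each participant's
# predicted exertion scores with their reported NASA-TLX scores, then
# summarizing across participants. All scores are invented.

def pearson_r(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

# (TLX, predicted) scores per participant, second-session data:
participants = [
    ([10, 25, 40, 60], [12, 22, 45, 55]),
    ([15, 30, 50, 70], [20, 28, 40, 75]),
]
rs = [pearson_r(tlx, pred) for tlx, pred in participants]
mean_r = sum(rs) / len(rs)
```

Evaluating each participant's model on a later session, as here, guards against the optimistic bias of testing on the training day.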
Procedia PDF Downloads 220
26789 Radiology Information System’s Mechanisms: HL7-MHS & HL7/DICOM Translation
Authors: Kulwinder Singh Mann
Abstract:
The innovative features of the Radiology Information System (RIS), an information system for electronic medical records, have shown a positive impact in hospitals. The objective is to make work easier: for a physician, accessing the patient’s data; for a patient, checking their bill transparently. The interoperability of RIS with the other intra-hospital information systems it interacts with, dealing with compatibility and open-architecture issues, is accomplished by two novel mechanisms. The first is a message handling system applied to the exchange of information according to the Health Level Seven (HL7) protocol’s specifications, serving the transfer of medical and administrative data among the RIS applications and the data store unit. The second implements the translation of information between the formats that the HL7 and Digital Imaging and Communications in Medicine (DICOM) protocols specify, providing communication between the RIS and the Picture Archiving and Communication System (PACS), which is used with the increasing incorporation of modern medical imaging equipment.
Keywords: RIS, PACS, HIS, HL7, DICOM, messaging service, interoperability, digital images
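HL7 v2 messages of the kind such a message handling system exchanges are pipe-delimited text, with `|` separating fields and `^` separating components; the toy parser below illustrates the structure only. The sample message content is fabricated, and a production system would use a dedicated HL7 library rather than this sketch.

```python
# Illustrative sketch of the HL7 v2 wire format: segments separated by
# carriage returns, fields by "|", components by "^". The sample message
# is fabricated; real systems use dedicated HL7 libraries.

def parse_hl7(message):
    """Return {segment_id: [fields...]} for a simple message with
    at most one occurrence of each segment type."""
    segments = {}
    for line in message.strip().split("\r"):
        fields = line.split("|")
        segments[fields[0]] = fields[1:]
    return segments

sample = ("MSH|^~\\&|RIS|HOSPITAL|PACS|RADIOLOGY|202401151200||ORM^O01|123|P|2.3\r"
          "PID|1||PAT001||DOE^JOHN")
msg = parse_hl7(sample)
```

The HL7/DICOM translation mechanism described above maps fields such as the patient name component here into the corresponding DICOM attributes.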
Procedia PDF Downloads 300
26788 Genetic Association of SIX6 Gene with Pathogenesis of Glaucoma
Authors: Riffat Iqbal, Sidra Ihsan, Andleeb Batool, Maryam Mukhtar
Abstract:
Glaucoma is a group of optic neuropathies characterized by progressive degeneration of retinal ganglion cells. It is a clinically and genetically heterogeneous illness comprising several distinct forms, each with different causes and severities. Primary open-angle glaucoma (POAG) is the most common kind of glaucoma. This study investigated the genetic association of single nucleotide polymorphisms (SNPs; rs10483727 and rs33912345) at the SIX1/SIX6 locus with POAG in the Pakistani population. The SIX6 gene plays an important role in ocular development and has been associated with the morphology of the optic nerve. A total of 100 patients clinically diagnosed with glaucoma and 100 control individuals, all aged over 40, were enrolled in the study. Genomic DNA was extracted by the organic extraction method. SNP genotyping was done by (i) PCR-based restriction fragment length polymorphism (RFLP) and (ii) sequencing. Significant genetic associations were observed for rs10483727 (risk allele T) and rs33912345 (risk allele C) with POAG. Hence, it was concluded that the SIX6 gene is genetically associated with the pathogenesis of glaucoma in Pakistan.
Keywords: genotyping, Pakistani population, primary open-angle glaucoma, SIX6 gene
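Case-control allelic association of the kind reported above is commonly tested with a chi-square statistic on a 2×2 table of allele counts (cases vs. controls, risk vs. alternative allele); the counts below are invented for illustration, not the study's genotyping results.

```python
# Hedged sketch of a case-control allelic association test: chi-square
# on a 2x2 allele-count table. Counts are invented, not the study's data.

def chi_square_2x2(a, b, c, d):
    """Chi-square statistic for table [[a, b], [c, d]]
    (no continuity correction)."""
    n = a + b + c + d
    num = n * (a * d - b * c) ** 2
    den = (a + b) * (c + d) * (a + c) * (b + d)
    return num / den

# Hypothetical allele counts (each person contributes two alleles):
# cases: 120 T / 80 C; controls: 90 T / 110 C
chi2 = chi_square_2x2(120, 80, 90, 110)
significant = chi2 > 3.841  # 5% critical value, 1 degree of freedom
```

A statistic above the 1-df critical value, as here, is what supports reporting an allele (e.g., rs10483727 T) as associated with the disease.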
Procedia PDF Downloads 184
26787 Transferring Data from Glucometer to Mobile Device via Bluetooth with Arduino Technology
Authors: Tolga Hayit, Ucman Ergun, Ugur Fidan
Abstract:
Being healthy is undoubtedly an indispensable necessity for human life. With technological improvements, various health monitoring and imaging systems have been developed in the literature to satisfy health needs. In this context, monitoring and recording individual health data via wireless technology is also part of these studies. Mobile devices, which are found in almost every house, have become indispensable in our lives, and have wireless technology infrastructure, play an important role in enabling health follow-up anywhere and at any time, which is why they are used in health monitoring systems. In this study, an Arduino, an open-source microcontroller board, was used, with a sample glucose measuring device connected to it in series. In this way, the glucose data (glucose level, time) obtained with the glucometer are transferred over a Bluetooth channel to a mobile device based on the Android operating system. A mobile application was developed using the Apache Cordova framework for listing the data, presenting it graphically, and reading it from the Arduino. Apache Cordova, HTML, JavaScript, and CSS were used in the coding. The data received from the glucometer are stored in the local database of the mobile device. The intent is that people can transfer their measurements to their mobile device using wireless technology and access graphical representations of their data. In this context, the aim of the study is to enable health monitoring using the different wireless technologies available in mobile devices today, and thus to contribute to other work done in this area.
Keywords: Arduino, Bluetooth, glucose measurement, mobile health monitoring
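On the receiving side, readings arriving over the Bluetooth serial link must be parsed and malformed lines discarded before storage; the sketch below illustrates this with an assumed `time,value` line format. The format is an assumption for illustration only, since the actual framing depends on the Arduino sketch, and the study's own app does this in JavaScript via Cordova.

```python
# Hedged sketch of the receiving side: parsing reading lines as they
# might arrive over the Bluetooth serial link. The "HH:MM,value" line
# format is an ASSUMPTION for illustration; the real framing depends on
# the Arduino firmware.

def parse_reading(line):
    """Parse 'HH:MM,value' into a reading record, or None if malformed."""
    parts = line.strip().split(",")
    if len(parts) != 2:
        return None
    try:
        return {"time": parts[0], "glucose_mg_dl": float(parts[1])}
    except ValueError:
        return None

# A hypothetical stream, including one corrupted line to be dropped:
stream = ["08:15,96", "12:40,141", "garbage", "19:05,118"]
readings = [r for r in (parse_reading(s) for s in stream) if r]
```

Validated records like these are what the app would insert into the device's local database for listing and graphing.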
Procedia PDF Downloads 322
26786 [Keynote Talk]: Knowledge Codification and Innovation Success within Digital Platforms
Authors: Wissal Ben Arfi, Lubica Hikkerova, Jean-Michel Sahut
Abstract:
This study examines interfirm networks in the digital transformation era and, in particular, how tacit knowledge codification affects innovation success within digital platforms. One of the most important features of digital transformation and innovation process outcomes is the emergence of digital platforms, as an interfirm network, at the heart of open innovation. This research aims to illuminate how digital platforms influence inter-organizational innovation through virtual team interactions and knowledge sharing practices within an interfirm network. Consequently, it contributes to the strategic management literature on new product development (NPD), open innovation, industrial management, and the management of emerging interfirm networks. The empirical findings show, on the one hand, that knowledge conversion may be enhanced, especially by socialization, which seems to be the most important phase, as it plays a crucial role in holding the virtual team members together. On the other hand, in the process of socialization, tacit knowledge codification is crucial because it provides the structure needed for the interfirm network actors to interact and act to reach common goals, which favors the emergence of open innovation. Finally, our results offer several conditions, necessary but not always sufficient, for interfirm managers involved in NPD and innovation strategies to increasingly shape interconnected and borderless markets and business collaborations. In the digital transformation era, the need for adaptive and innovative business models, as well as new and flexible network forms, is becoming more significant than ever. Supported by technological advancements and digital platforms, companies can benefit from increased market opportunities and create new markets for their innovations through alliances and collaborative strategies, as a way of reducing or eliminating environmental uncertainty or entry barriers.
Consequently, an efficient and well-structured interfirm network is essential to create network capabilities, ensure tacit knowledge sharing, enhance organizational learning, and foster open innovation success within digital platforms.
Keywords: interfirm networks, digital platform, virtual teams, open innovation, knowledge sharing
Procedia PDF Downloads 130
26785 Digital Forensics Compute Cluster: A High Speed Distributed Computing Capability for Digital Forensics
Authors: Daniel Gonzales, Zev Winkelman, Trung Tran, Ricardo Sanchez, Dulani Woods, John Hollywood
Abstract:
We have developed a distributed computing capability, Digital Forensics Compute Cluster (DFORC2) to speed up the ingestion and processing of digital evidence that is resident on computer hard drives. DFORC2 parallelizes evidence ingestion and file processing steps. It can be run on a standalone computer cluster or in the Amazon Web Services (AWS) cloud. When running in a virtualized computing environment, its cluster resources can be dynamically scaled up or down using Kubernetes. DFORC2 is an open source project that uses Autopsy, Apache Spark and Kafka, and other open source software packages. It extends the proven open source digital forensics capabilities of Autopsy to compute clusters and cloud architectures, so digital forensics tasks can be accomplished efficiently by a scalable array of cluster compute nodes. In this paper, we describe DFORC2 and compare it with a standalone version of Autopsy when both are used to process evidence from hard drives of different sizes.
Keywords: digital forensics, cloud computing, cyber security, spark, Kubernetes, Kafka
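The speedup DFORC2 targets comes from spreading independent ingestion work (such as cryptographic hashing of evidence) across workers; the sketch below illustrates that parallelization idea only, with threads and in-memory blobs standing in for cluster nodes and disk images. It is not DFORC2 code, which builds on Autopsy, Spark, and Kafka.

```python
# Conceptual sketch of the parallelization idea (not DFORC2 itself):
# hashing many evidence blobs concurrently, the way ingestion work can
# be spread across compute nodes. Threads and in-memory byte strings
# stand in for cluster workers and disk images.
import hashlib
from concurrent.futures import ThreadPoolExecutor

def sha256_hex(blob):
    return hashlib.sha256(blob).hexdigest()

def hash_evidence(blobs, workers=4):
    """Return {name: sha256} computed in parallel over byte blobs."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        futures = {name: pool.submit(sha256_hex, data)
                   for name, data in blobs.items()}
        return {name: f.result() for name, f in futures.items()}

evidence = {"disk0.img": b"aaaa", "disk1.img": b"bbbb"}
digests = hash_evidence(evidence)
```

Because each item is hashed independently, throughput scales with the number of workers until I/O becomes the bottleneck, which is the same reason cluster-scale ingestion pays off on large drives.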
Procedia PDF Downloads 394
26784 A Meaning-Making Approach to Understand the Relationship between the Physical Built Environment of the Heritage Sites including the Intangible Values and the Design Development of the Public Open Spaces: Case Study Liverpool Pier Head
Authors: May Newisar, Richard Kingston, Philip Black
Abstract:
Heritage-led regeneration developments have been considered one of the cornerstones of the economic and social revival of historic towns and cities in the UK. However, this approach has proved deficient in the development of the Liverpool World Heritage site, due to the conflict between sustaining the tangible and intangible values and achieving the intended economic developments. Accordingly, the development of such areas is influenced by a top-down approach that treats heritage as a consumable experience and urban regeneration as its economic development, neglecting the heritage site's characteristics and values as well as the design criteria for public open spaces that overlap with heritage sites. Currently, knowledge regarding the relationship between the physical built environment of heritage sites, including intangible values, and the design development of public open spaces is limited. Public open spaces have been studied from different perspectives: increasing walkability, serving as a source of social cohesion, providing a good quality of life, and understanding users' perception. Heritage sites, meanwhile, have been discussed heavily in terms of how to maintain the physical environment, understanding the sources of threats, and how sites can be protected, in addition to users' experiences and motivations for visiting such areas. Furthermore, new approaches have tried to close this gap, such as the historic urban landscape approach, which focuses on the entire human environment with all its tangible and intangible qualities. This research, however, aims to understand the relationship between heritage sites and public open spaces and how the overlap in the design and development of both could be used to enhance heritage sites and improve users' experience.
A meaning-making approach will be used in order to understand and articulate how the development of the Liverpool World Heritage site and its value could influence and shape the design of the Pier Head public open space, in order to attract a different level of tourists and to serve as a tool for economic development. Consequently, this will help bridge the gap between planning and conservation area policies through an understanding of how flexible the system is in adopting alternative approaches for the design and development strategies of those areas.
Keywords: historic urban landscape, environmental psychology, urban governance, identity
Procedia PDF Downloads 131
26783 Scientific Linux Cluster for BIG-DATA Analysis (SLBD): A Case of Fayoum University
Authors: Hassan S. Hussein, Rania A. Abul Seoud, Amr M. Refaat
Abstract:
Scientific researchers face the analysis of very large data sets, which are growing at a noticeable rate with today's and tomorrow's technologies. Hadoop and Spark are software frameworks developed for this purpose; the Hadoop framework is suitable for many different hardware platforms. In this research, a Scientific Linux cluster for Big Data analysis (SLBD) is presented. SLBD runs open source software with large computational capacity on a high performance cluster infrastructure. SLBD is composed of one cluster containing identical, commodity-grade computers interconnected via a small LAN. SLBD consists of a fast switch and Gigabit Ethernet cards connecting four nodes. Cloudera Manager is used to configure and manage an Apache Hadoop stack. Hadoop is a framework that allows storing and processing big data across the cluster using the MapReduce algorithm. The MapReduce algorithm divides the task into smaller tasks, which are assigned to the network nodes; it then collects the results to form the final result data set. The SLBD clustering system allows fast and efficient processing of the large amounts of data resulting from different applications. SLBD also provides high performance, high throughput, high availability, expandability, and cluster scalability.
Keywords: big data platforms, cloudera manager, Hadoop, MapReduce
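The divide-and-collect behavior of MapReduce described above can be sketched in a few lines: map emits key-value pairs on each node, a shuffle groups them by key, and reduce aggregates each group. A word count stands in here for a real SLBD job (Hadoop itself runs this in Java over HDFS).

```python
# Minimal sketch of the MapReduce pattern: map emits (key, value) pairs,
# shuffle groups them by key, reduce aggregates each group. A word count
# stands in for a real cluster job.
from collections import defaultdict

def map_phase(document):
    """Emit a (word, 1) pair for every word (runs per node in Hadoop)."""
    return [(word, 1) for word in document.split()]

def shuffle(pairs):
    """Group emitted values by key, as the framework does between phases."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    """Aggregate each key's values into the final result data set."""
    return {key: sum(values) for key, values in groups.items()}

docs = ["big data on the cluster", "big cluster big results"]
pairs = [p for doc in docs for p in map_phase(doc)]   # map (per node)
counts = reduce_phase(shuffle(pairs))                 # shuffle + reduce
```

Because the map calls are independent per document, Hadoop can assign them to different cluster nodes, which is the source of SLBD's throughput.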
Procedia PDF Downloads 358
26782 A Generalized Framework for Adaptive Machine Learning Deployments in Algorithmic Trading
Authors: Robert Caulk
Abstract:
A generalized framework for adaptive machine learning deployments in algorithmic trading is introduced, tested, and released as open-source code. The presented software aims to test the hypothesis that recent data contains enough information to form a probabilistically favorable short-term price prediction. Further, the framework contains various adaptive machine learning techniques that are geared toward generating profit during strong trends and minimizing losses during trend changes. Results demonstrate that this adaptive machine learning approach is capable of capturing trends and generating profit. The presentation also discusses the importance of defining the parameter space associated with the dynamic training data set and using the parameter space to identify and remove outliers from prediction data points. Meanwhile, the generalized architecture enables common users to exploit the powerful machinery while focusing on high-level feature engineering and model testing. The presentation also highlights common strengths and weaknesses associated with the presented technique and presents a broad range of well-tested starting points for feature set construction, target setting, and statistical methods for enforcing risk management and maintaining probabilistically favorable entry and exit points. The presentation also describes the end-to-end data processing tools associated with FreqAI, including automatic data fetching, data aggregation, feature engineering, safe and robust data pre-processing, outlier detection, custom machine learning and statistical tools, data post-processing, adaptive-training backtest emulation, and deployment of adaptive training in live environments. Finally, the generalized user interface is also discussed in the presentation. Feature engineering is simplified so that users can seed their feature sets with common indicator libraries (e.g., TA-Lib, pandas-ta).
The user also feeds data expansion parameters to fill out a large feature set for the model, which can contain as many as 10,000+ features. The presentation describes the various object-oriented programming techniques employed to make FreqAI agnostic to third-party libraries and external data sources. In other words, the back-end is constructed in such a way that users can leverage a broad range of common regression libraries (CatBoost, LightGBM, scikit-learn, etc.) as well as common neural network libraries (TensorFlow, PyTorch) without worrying about the logistical complexities associated with data handling and API interactions. The presentation finishes by drawing conclusions about the most important parameters associated with a live deployment of the adaptive learning framework and provides the road map for future development in FreqAI.
Keywords: machine learning, market trend detection, open-source, adaptive learning, parameter space exploration
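The idea of rejecting prediction points that fall outside the training data's parameter space can be sketched with a simple per-feature z-score rule; this generic rule stands in for the framework's own outlier-detection tools and is not FreqAI code, and the feature vectors below are invented.

```python
# Generic sketch of parameter-space outlier rejection: a prediction
# point far (in z-score terms) from the training distribution on any
# feature is discarded rather than fed to the model. This is a stand-in
# for the framework's own outlier-detection tools, not FreqAI code.

def feature_stats(train_rows):
    """Per-feature (mean, sd) over equal-length feature vectors."""
    n = len(train_rows)
    dims = len(train_rows[0])
    stats = []
    for j in range(dims):
        col = [row[j] for row in train_rows]
        mean = sum(col) / n
        sd = (sum((v - mean) ** 2 for v in col) / n) ** 0.5
        stats.append((mean, sd))
    return stats

def is_outlier(point, stats, z_max=3.0):
    """True if any feature lies more than z_max SDs from its mean."""
    return any(sd > 0 and abs(v - mean) / sd > z_max
               for v, (mean, sd) in zip(point, stats))

train = [[1.0, 10.0], [1.2, 11.0], [0.9, 9.5], [1.1, 10.5]]
stats = feature_stats(train)
```

Filtering such points matters because a model trained on one regime makes unreliable predictions on inputs it has effectively never seen.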
Procedia PDF Downloads 89
26781 Predicting Personality and Psychological Distress Using Natural Language Processing
Authors: Jihee Jang, Seowon Yoon, Gaeun Son, Minjung Kang, Joon Yeon Choeh, Kee-Hong Choi
Abstract:
Background: Self-report multiple-choice questionnaires have been widely utilized to quantitatively measure personality and psychological constructs. Despite several strengths (e.g., brevity and utility), self-report multiple-choice questionnaires have considerable inherent limitations. With the rise of machine learning (ML) and natural language processing (NLP), researchers in the field of psychology are widely adopting NLP to assess psychological constructs and predict human behaviors. However, there is a lack of connection between the work being performed in computer science and that in psychology, owing to small datasets and unvalidated modeling practices. Aims: The current article introduces the study method and procedure of phase II, which includes the interview questions for the five-factor model (FFM) of personality developed in phase I. This study aims to develop the semi-structured interview and open-ended questions for FFM-based personality assessment, designed with experts in the field of clinical and personality psychology (phase I), and to collect personality-related text data using the interview questions together with self-report measures of personality and psychological distress (phase II). The purpose of the study includes examining the relationships among the natural language data obtained from the interview questions, the FFM personality constructs, and psychological distress, to demonstrate the validity of natural language-based personality prediction. Methods: The phase I (pilot) study was conducted on fifty-nine native Korean adults to acquire personality-related text data from the semi-structured interview and open-ended questions based on the FFM of personality. The interview questions were revised and finalized with feedback from an external expert committee consisting of personality and clinical psychologists.
Based on the established interview questions, a total of 425 Korean adults were recruited using a convenience sampling method via an online survey. The text data collected from the interviews were analyzed using natural language processing. The results of the online survey, including demographic data and depression, anxiety, and personality inventories, were analyzed together in the model to predict individuals' FFM personality traits and level of psychological distress (phase II).
Keywords: personality prediction, psychological distress prediction, natural language processing, machine learning, the five-factor model of personality
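The core modeling step, mapping text features to a self-report trait score, can be sketched as follows. Everything here is illustrative: the toy English sentences, the fabricated trait scores, and the bag-of-words ridge regression stand in for the study's far richer Korean-language pipeline.

```python
# Minimal sketch of natural language-based trait prediction.
# Texts, scores, and vocabulary below are fabricated for illustration.
import numpy as np

texts = ["i love meeting new people",
         "i prefer quiet evenings alone",
         "parties with new people energize me",
         "i enjoy time alone reading"]
extraversion = np.array([4.5, 1.5, 4.8, 1.8])  # hypothetical self-report scores

# Bag-of-words features over a shared vocabulary.
vocab = sorted({w for t in texts for w in t.split()})
X = np.array([[t.split().count(w) for w in vocab] for t in texts], float)

# Ridge regression (closed form) maps word counts to a trait score.
lam = 0.1
A = X.T @ X + lam * np.eye(len(vocab))
w = np.linalg.solve(A, X.T @ extraversion)
pred = X @ w
print(np.round(pred, 2))
```

In practice, validated inventories supply the target scores and the text features come from interview transcripts, but the regression structure is the same.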
Procedia PDF Downloads 79
26780 Economics of Open and Distance Education in the University of Ibadan, Nigeria
Authors: Babatunde Kasim Oladele
Abstract:
One of the major objectives of the Nigerian national policy on education is the provision of equal educational opportunities to all citizens at different levels of education. With regard to higher education, an aspect of the policy encourages distance learning to be organized and delivered by tertiary institutions in Nigeria. This study, therefore, determines how much of the government's resources are committed, how the resources are utilized, and what alternative sources of funding are available for this system of education. The study investigated the trends in recurrent costs between 2004/2005 and 2013/2014 at the University of Ibadan Distance Learning Centre (DLC). A descriptive survey research design was employed, with a questionnaire as the instrument for data collection. The population of the study comprised 280 current distance learning students, 70 academic staff, and 50 administrative staff. Only 354 questionnaires were correctly filled and returned. The data collected were coded and analyzed; frequencies, ratios, averages, and percentages were used to answer the research questions. The study revealed that the salaries and allowances of academic and non-academic staff represent the most important variable influencing the cost of education: about 55% of resources were allocated to this sector alone. The study also indicates that costs rise every year with increasing enrolment, representing a situation of diseconomies of scale. This study recommends that universities operating distance learning programmes should strive to explore other internally generated revenue options to boost their revenue.
The University of Ibadan, being the premier university in Nigeria, should be given foreign aid and home support, both financial and material, to enable it to run a formidable distance education programme that measures up in planning and implementation to those of developed nations.
Keywords: open education, distance education, University of Ibadan, Nigeria, cost of education
Procedia PDF Downloads 178
26779 Reproductive Performance of Dairy Cows at Different Parities: A Case Study in Enrekang Regency, Indonesia
Authors: Muhammad Yusuf, Abdul Latief Toleng, Djoni Prawira Rahardja, Ambo Ako, Sahiruddin Sahiruddin, Abdi Eriansyah
Abstract:
The objective of this study was to assess the reproductive performance of dairy cows at different parities. A total of 60 Holstein-Friesian dairy cows of parity one to three, from five small farms raised by local farmers, were used in the study. All cows were confined in a tie-stall barn with rubber over the concrete floor. The herds were visited twice for a survey with the help of a questionnaire. The reproductive parameters used in the study were days open, calving interval, and services per conception (S/C). The results showed that the mean (±SD) days open of cows in parity 2 was slightly longer than that in parity 3 (228.2±121.5 vs. 205.5±144.5; P=0.061). No cows in parity 3 conceived within 85 days postpartum, compared with 13.8% in parity 2. However, the proportions of cows conceiving within 150 days postpartum in parity 2 and parity 3 were 30.1% and 36.4%, respectively. Likewise, by 210 days after calving, the proportion of cows that had conceived was higher in parity 3 than in parity 2 (72.8% vs. 44.8%; P<0.05). The mean (±SD) calving intervals in parity 2 and parity 3 were 508.2±121.5 and 495.5±144.1 days, respectively. The proportion of cows with calving intervals of 400 and 450 days was higher in parity 3 than in parity 2 (23.1% vs. 17.2% and 53.9% vs. 31.0%). Cows in parity 1 had a significantly (P<0.01) lower S/C than cows in parity 2 and parity 3 (1.6±1.2 vs. 3.5±3.4 and 3.3±2.1). It can be concluded that the reproductive performance of the cows is affected by parity.
Keywords: dairy cows, parity, days open, calving interval, service per conception
Procedia PDF Downloads 257
26778 Hierarchical Filtering Method of Threat Alerts Based on Correlation Analysis
Authors: Xudong He, Jian Wang, Jiqiang Liu, Lei Han, Yang Yu, Shaohua Lv
Abstract:
Nowadays, internet threats are enormous and increasing; however, the classification of the huge volume of alert messages generated in this environment remains relatively crude. This affects the accuracy of network situation assessment and also makes it difficult for security managers to respond to emergencies. In order to deal with potential network threats effectively and to provide more useful data for network situation awareness, it is essential to build a hierarchical filtering method. This paper establishes a model for data monitoring that filters the original data systematically to obtain the grade of each threat and stores the result for reuse. First, the method filters on the vulnerable resources, open ports, and services of host devices. Then, entropy theory is used to calculate the performance changes of the host devices at the time a threat occurs, and the alerts are filtered again. Finally, the changes in performance values at the time of the threat are sorted. Alerts and performance data collected in a real network environment are used for evaluation and analysis. The comparative experimental analysis shows that the method can filter threat alerts effectively.
Keywords: correlation analysis, hierarchical filtering, multisource data, network security
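The entropy-based step of the filtering method can be sketched as follows. The performance readings and the choice of metric (CPU share across processes) are hypothetical; the paper does not specify them here. The idea is that a threat concentrates load on a few processes, so the Shannon entropy of the performance distribution drops, and the size of that drop can rank alerts.

```python
# Sketch of an entropy-based performance-change measure for alert filtering.
# The metric and readings below are illustrative, not the paper's data.
import math

def shannon_entropy(values):
    """Shannon entropy (bits) of a distribution of performance readings."""
    total = sum(values)
    probs = [v / total for v in values if v > 0]
    return -sum(p * math.log2(p) for p in probs)

# CPU-usage share across four processes, before and during a suspected threat.
baseline = [25, 25, 25, 25]  # evenly spread load -> maximal entropy
during = [85, 5, 5, 5]       # one process dominates -> low entropy

change = abs(shannon_entropy(baseline) - shannon_entropy(during))
print(round(change, 3))
```

Alerts whose hosts show the largest entropy change would be sorted to the top of the filtered list.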
Procedia PDF Downloads 201
26777 Innovating and Disrupting Higher Education: The Evolution of Massive Open Online Courses
Authors: Nabil Sultan
Abstract:
A great deal has been written on Massive Open Online Courses (MOOCs) since 2012 (considered by some as the year of the MOOC). The emergence of MOOCs generated great interest amongst academics and technology experts as well as ordinary people. Some authors perceived MOOCs as the next big thing that would disrupt education; others saw them as another fad that would go away once it ran its course (as most fads do). But MOOCs did not turn out to be a fad, and they are still around. Most importantly, they have evolved into something that is beginning to look like a viable business model. This paper explores this phenomenon within the theoretical frameworks of disruptive innovation and jobs to be done, as developed by Clayton Christensen and his colleagues, and its implications for the future of higher education (HE).
Keywords: MOOCs, disruptive innovations, higher education, jobs theory
Procedia PDF Downloads 270
26776 Orthodontic Treatment Using CAD/CAM System
Authors: Cristiane C. B. Alves, Livia Eisler, Gustavo Mota, Kurt Faltin Jr., Cristina L. F. Ortolani
Abstract:
The correct positioning of brackets is essential for the success of orthodontic treatment. The indirect bracket placement technique has the main objective of eliminating the positioning errors that commonly occur with direct bracket placement. The objective of this study is to demonstrate that the exact positioning of brackets is of extreme relevance to the success of treatment. The present work reports the case of an adult female patient who attended the clinic complaining of having been in orthodontic treatment for more than five years without noticing any progress. The intra-oral clinical examination and documentation analysis revealed a Class III malocclusion, an anterior open bite, and the absence of all third molars and of the first upper and lower bilateral premolars. For treatment, the indirect bonding technique with self-ligating ceramic brackets was applied. The trays were prepared after intra-oral digital scanning and the printing of models with a 3D printer. Brackets were positioned virtually using specialized software. After twelve months of treatment, correction of the malocclusion was observed, as well as closure of the anterior open bite. It is concluded that adequate and precise bracket positioning is necessary for a successful treatment.
Keywords: anterior open-bite, CAD/CAM, orthodontics, malocclusion, Angle Class III
Procedia PDF Downloads 194
26775 A Review on 3D Smart City Platforms Using Remotely Sensed Data to Aid Simulation and Urban Analysis
Authors: Slim Namouchi, Bruno Vallet, Imed Riadh Farah
Abstract:
3D urban models provide powerful tools for decision making, urban planning, and smart city services, and the accuracy of these 3D-based systems is directly related to the quality of the underlying models. Since manual large-scale modeling of cities or countries is a highly time-intensive and expensive process, fully automatic 3D building generation is needed. However, the result of the 3D modeling process depends on the input data, the properties of the captured objects, and the required characteristics of the reconstructed 3D model. Nowadays, producing a 3D real-world model is no longer a problem. Remotely sensed data have experienced a remarkable increase in recent years, especially data acquired using unmanned aerial vehicles (UAVs). As scanning techniques develop, the amount of captured data grows and its resolution becomes more precise. This paper presents a literature review that aims to identify different methods of automatic 3D building extraction, either from LiDAR alone or from the combination of LiDAR with satellite or aerial images. We then present the open-source technologies and data models (e.g., CityGML, PostGIS, CesiumJS) used to integrate these models into geospatial base layers for smart city services.
Keywords: CityGML, LiDAR, remote sensing, GIS, Smart City, 3D urban modeling
Procedia PDF Downloads 135
26774 The Determinants and Effects of R&D Outsourcing in Korean Manufacturing Firm
Authors: Sangyun Han, Minki Kim
Abstract:
R&D outsourcing is an open-innovation strategy through which firms acquire competitiveness. As firms' total R&D investment has increased in Korea, the share of R&D outsourcing within it has also increased. In this paper, we investigate the determinants of firms' R&D outsourcing and its effects. By analyzing the determinants of R&D outsourcing and its effect on firm performance, we address both academic and policy issues. First, from an academic point of view, identifying the determinants of R&D outsourcing is linked to the question of why firms pursue open innovation, which can be answered with the resource-based view, core competence theory, and related frameworks. Second, the results offer science-and-technology policy implications for transferring public intellectual property to the private sector; in particular, they give government a basis and rationale for designing policies to support more SMEs and ventures.
Keywords: determinants, effects, R&D, outsourcing
Procedia PDF Downloads 506
26773 Vertical Accuracy Evaluation of Indian National DEM (CartoDEM v3) Using Dual Frequency GNSS Derived Ground Control Points for Lower Tapi Basin, Western India
Authors: Jaypalsinh B. Parmar, Pintu Nakrani, Ashish Chaurasia
Abstract:
A Digital Elevation Model (DEM) is considered an important dataset in GIS-based terrain analysis for many applications and for the assessment of processes such as environmental and climate change studies, hydrologic modelling, etc. The vertical accuracy of a DEM, which varies geographically, depends on different parameters that affect model simulation outcomes. Vertical accuracy assessment in the Indian landscape, especially in low-lying coastal urban terrain such as the lower Tapi Basin, is very limited. In the present study, an attempt has been made to evaluate the vertical accuracy of the 30 m resolution open-source Indian National Cartosat-1 DEM v3 for the Lower Tapi Basin (LTB) in western India. An extensive field investigation was carried out using a stratified random fast-static DGPS survey across the entire study region, and 117 high-accuracy ground control points (GCPs) were obtained. The open-source DEM was compared with the obtained GCPs, different statistical attributes were computed, and vertical error histograms were evaluated.
Keywords: CartoDEM, Digital Elevation Model, GPS, lower Tapi basin
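The statistical attributes typically computed in such a DEM-versus-GCP comparison can be sketched as follows. The elevation pairs below are hypothetical placeholders; the study's 117 GCP values are not reproduced here.

```python
# Sketch of standard vertical-accuracy statistics from DEM/GCP pairs.
# The (DEM elevation, DGPS elevation) pairs in metres are illustrative only.
import math

pairs = [(12.4, 11.8), (9.7, 10.5), (15.2, 14.1), (8.3, 8.9), (11.0, 10.2)]

errors = [dem - gcp for dem, gcp in pairs]
n = len(errors)
mean_error = sum(errors) / n                      # bias (mean error)
rmse = math.sqrt(sum(e * e for e in errors) / n)  # root mean square error
std = math.sqrt(sum((e - mean_error) ** 2 for e in errors) / (n - 1))

print(f"ME={mean_error:.3f} m, RMSE={rmse:.3f} m, SD={std:.3f} m")
```

The error list also feeds directly into the vertical error histograms mentioned in the abstract.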
Procedia PDF Downloads 358
26772 Factors Affecting Early Antibiotic Delivery in Open Tibial Shaft Fractures
Authors: William Elnemer, Nauman Hussain, Samir Al-Ali, Henry Shu, Diane Ghanem, Babar Shafiq
Abstract:
Introduction: The incidence of infection in open tibial shaft injuries varies depending on the severity of the injury, with rates ranging from 1.8% for Gustilo-Anderson type I to 42.9% for type IIIB fractures. The timely administration of antibiotics upon presentation to the emergency department (ED) is an essential component of fracture management, and evidence indicates that prompt delivery of antibiotics is associated with improved outcomes. The objective of this study is to identify factors that contribute to expedient antibiotic administration. Methods: This is a retrospective study of open tibial shaft fractures at an academic Level I trauma center. Current Procedural Terminology (CPT) codes identified all patients treated for open tibial shaft fractures between 2015 and 2021. Open fractures were identified by reviewing ED and provider notes, and ballistic fractures were considered open. Chart reviews were performed to extract demographics, fracture characteristics, postoperative outcomes, time to the operating room, and time to antibiotic order and delivery. Univariate statistical analysis compared patients who received early antibiotics (EA), delivered within one hour of ED presentation, with those who received late antibiotics (LA), delivered beyond one hour of ED presentation. A multivariate analysis was performed to investigate patient, fracture, and transport/ED characteristics contributing to faster antibiotic delivery. The multivariate analysis included the following independent variables: ballistic fracture, activation of Delta Trauma, Gustilo-Anderson type (III vs. I and II), AO-OTA classification (type C vs. types A and B), arrival between 7 am and 11 pm, and arrival via Emergency Medical Services (EMS) or walk-in. Results: Seventy ED patients with open tibial shaft fractures were identified. Of these, 39 patients (55.7%) received EA, while 31 patients (44.3%) received LA.
Univariate analysis showed that the rates of arrival via EMS as opposed to walk-in (97.4% vs. 74.2%, p = 0.01) and of Delta Trauma activation (89.7% vs. 51.6%, p < 0.001) were significantly higher in the EA group than in the LA group. Additionally, EA cases had significantly shorter intervals between antibiotic order and delivery than LA cases (0.02 hours vs. 0.35 hours, p = 0.007). No other significant differences were found in postoperative outcomes or fracture characteristics. Multivariate analysis showed that a Delta Trauma response, arrival via EMS, and presentation between 7 am and 11 pm were independent predictors of a shorter time to antibiotic administration (odds ratio = 11.9, 30.7, and 5.4; p = 0.001, 0.016, and 0.013, respectively). Discussion: Earlier antibiotic delivery is associated with arrival at the ED between 7 am and 11 pm, arrival via EMS, and a coordinated Delta Trauma activation. Our findings indicate that in cases where administering antibiotics is critical to achieving positive outcomes, it is advisable to employ a coordinated Delta Trauma response. Hospital personnel should be attentive to the rapid administration of antibiotics to patients with open fractures who arrive via walk-in or during late-night hours.
Keywords: antibiotics, emergency department, fracture management, open tibial shaft fractures, orthopaedic surgery, time to OR, trauma fractures
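As a rough illustration of how an odds ratio arises from such data, the 2x2 table below is back-calculated from the reported univariate percentages (97.4% of 39 EA patients and 74.2% of 31 LA patients arriving via EMS). The resulting unadjusted odds ratio is approximate and is not the paper's adjusted multivariate estimate.

```python
# Back-of-envelope unadjusted odds ratio from a 2x2 contingency table.
# Counts are reconstructed from reported percentages, hence approximate.
ems_ea, walkin_ea = 38, 1   # ~97.4% of 39 early-antibiotic patients via EMS
ems_la, walkin_la = 23, 8   # ~74.2% of 31 late-antibiotic patients via EMS

odds_ratio = (ems_ea * walkin_la) / (walkin_ea * ems_la)
print(round(odds_ratio, 1))
```

The adjusted multivariate estimate (30.7) differs from this crude value because it controls for the other predictors in the model.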
Procedia PDF Downloads 65
26771 An Evaluation of the Impact of E-Banking on Operational Efficiency of Banks in Nigeria
Authors: Ibrahim Rabiu Darazo
Abstract:
This research was conducted on the impact of e-banking on the operational efficiency of banks in Nigeria, using a case study of selected banks (Diamond Bank Plc, GTBank Plc, and Fidelity Bank Plc). It is a quantitative study that uses both primary and secondary sources of data. Questionnaires were used to obtain the primary data: 150 questionnaires were distributed among staff and customers of the three banks, and the data collected were analysed using chi-square, while the secondary data were obtained from relevant textbooks, journals, and websites. It is clear from the findings that the use of e-banking has improved the efficiency of these banks in providing services to customers electronically through internet banking, telephone banking, and ATMs; in reducing the time taken to serve customers; in allowing new customers to open an account online; and in giving customers access to their accounts at all times (24/7). E-banking also provides access to customer information from the database, and the costs of cheques and postage were eliminated. The recommendations at the end of the research include the following: banks should keep their electronic equipment up to date; e-fraud (internal and external) should be controlled; banks should employ qualified manpower; and biometric ATMs should be introduced to reduce ATM-card fraud, as is done in other countries such as the USA.
Keywords: banks, electronic banking, operational efficiency of banks, biometric ATMs
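The chi-square analysis mentioned in this abstract can be sketched as follows. The 2x2 counts below are hypothetical; the study's questionnaire responses are not reproduced here.

```python
# Sketch of the Pearson chi-square statistic for a contingency table.
# Table counts are hypothetical, for illustration only.
def chi_square(table):
    """Pearson chi-square statistic for an r x c contingency table."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    grand = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            expected = row_totals[i] * col_totals[j] / grand
            stat += (observed - expected) ** 2 / expected
    return stat

# Hypothetical responses: "does e-banking improve efficiency?" (yes/no)
#             staff  customers
table = [[60, 15],   # yes
         [10, 65]]   # no
print(round(chi_square(table), 2))
```

The statistic would then be compared against a chi-square critical value with (r-1)(c-1) degrees of freedom, here 1.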
Procedia PDF Downloads 332
26770 Analyzing the Effects of Supply and Demand Shocks in the Spanish Economy
Authors: José M Martín-Moreno, Rafaela Pérez, Jesús Ruiz
Abstract:
In this paper, we use a small open-economy Dynamic Stochastic General Equilibrium (DSGE) model of the Spanish economy to search for a deeper characterization of the determinants of Spain's macroeconomic fluctuations throughout the period 1970-2008. To do this, we distinguish between tradable and non-tradable goods to account for the fact that the presence of non-tradable goods in this economy is one of the largest in the world. We estimate a DSGE model with supply and demand shocks (sectoral productivity, public spending, the international real interest rate, and preferences) using Kalman filter techniques. We find the following results. First, our variance decomposition analysis suggests that 1) the preference shock basically accounts for private consumption volatility, 2) the idiosyncratic productivity shock accounts for non-tradable output volatility, and 3) the sectoral productivity shock together with the international interest rate largely accounts for tradable output. Second, the model closely replicates the time path observed in the data for the Spanish economy, and finally, it captures the main qualitative cyclical features of this economy reasonably well.
Keywords: business cycle, DSGE models, Kalman filter estimation, small open economy
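The Kalman filter at the heart of the estimation can be sketched in its simplest, scalar textbook form. This is a generic illustration of the predict/update recursion, not the paper's full multivariate DSGE state-space system, and the noise variances and observations below are invented.

```python
# Minimal univariate Kalman filter: filter noisy readings of a constant state.
# q = process-noise variance, r = observation-noise variance (assumed values).
def kalman_filter(observations, q=1e-4, r=0.25, x0=0.0, p0=1.0):
    x, p = x0, p0
    estimates = []
    for z in observations:
        p = p + q            # predict: state variance grows by process noise
        k = p / (p + r)      # Kalman gain: weight on the new observation
        x = x + k * (z - x)  # update: correct the state with the innovation
        p = (1 - k) * p      # update: shrink the state variance
        estimates.append(x)
    return estimates

obs = [1.2, 0.8, 1.1, 0.9, 1.0, 1.05, 0.95]  # noisy readings of true value 1.0
est = kalman_filter(obs)
print(round(est[-1], 2))
```

In the DSGE setting, the same recursion runs on a vector state with matrix dynamics, and the innovations feed the likelihood that the estimation maximizes.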
Procedia PDF Downloads 416
26769 Assessment of Heavy Metal Contamination in Soil and Groundwater Due to Leachate Migration from an Open Dumping Site
Authors: Kali Prasad Sarma
Abstract:
Indiscriminate disposal of municipal solid waste (MSW) in open dumping sites is a common scenario in developing countries like India and poses a risk to the environment as well as to human health. The objective of the present investigation was to determine the concentrations of heavy metals (Pb, Cr, Ni, Mn, Zn, Cu, and Cd) and other physicochemical parameters of leachate and soil collected from an open dumping site of Tezpur town, Assam, India, and the associated potential ecological risk. Tezpur is an urban agglomeration in the category of Class I UAs/Towns, with a population of 105,377 as per the data released by the Government of India for Census 2011. The impact of the leachate on the groundwater was also addressed in our study. The concentrations of heavy metals were determined using ICP-OES. Energy-dispersive X-ray (SEM-EDS) microanalysis was also conducted to confirm the presence of the studied metals in the soil. X-ray diffraction (XRD) analysis and Fourier transform infrared (FTIR) spectroscopy were used to identify the dominant minerals present in the soil samples. The measured heavy metal concentrations in the soil samples followed the order Mn > Pb > Cu > Zn > Cr > Ni > Cd. The assessment of heavy metal contamination in the soil was carried out by calculating the enrichment factor (EF), geo-accumulation index (Igeo), contamination factor (Cfi), degree of contamination (Cd), pollution load index (PLI), and ecological risk factor (Eri). The study showed that the concentrations of Pb, Cu, and Cd were much higher than their respective average shale values; the EF of the soil samples depicted very severe enrichment for Pb, Cu, and Cd and moderate enrichment for Cr and Zn. The calculated Igeo values indicated that the soil is moderately to strongly contaminated with Pb and uncontaminated to moderately contaminated with Cd and Cu. The Cfi value for Pb indicates a very strong contamination level of this metal in the soil.
The Cfi values for Cu and Cd were 2.37 and 1.65, respectively, indicating moderate contamination. To apportion the possible sources of heavy metal contamination in the soil, principal components analysis (PCA) was adopted. Heavy metals from the leachate accumulate in the dumping-site soil and can easily percolate through the soil and reach the groundwater. The possible relationship between groundwater contamination and leachate percolation was examined by analyzing the heavy metal concentrations in groundwater with respect to distance from the dumping site. The concentrations of Cd and Pb in groundwater at a distance of 20 m from the dumping site exceeded the permissible limits for drinking water set by the WHO. The occurrence of elevated concentrations of potentially toxic heavy metals such as Pb and Cd in groundwater and soil is of great environmental concern, as it is detrimental to human health and the ecosystem.
Keywords: groundwater, heavy metal contamination, leachate, open dumping site
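The contamination indices named in this abstract follow standard definitions, which can be sketched as below. The measured and background concentrations are hypothetical stand-ins (the background values play the role of the average shale values); only the formulas themselves are the standard ones.

```python
# Sketch of standard soil-contamination indices (CF, Igeo, PLI).
# Concentrations below are hypothetical, in mg/kg.
import math

def contamination_factor(c_metal, c_background):
    return c_metal / c_background

def geoaccumulation_index(c_metal, c_background):
    # The 1.5 factor compensates for natural background fluctuations.
    return math.log2(c_metal / (1.5 * c_background))

def pollution_load_index(cf_values):
    # Geometric mean of the contamination factors of all metals.
    product = 1.0
    for cf in cf_values:
        product *= cf
    return product ** (1.0 / len(cf_values))

measured = {"Pb": 140.0, "Cu": 106.7, "Cd": 0.5}    # hypothetical soil values
background = {"Pb": 20.0, "Cu": 45.0, "Cd": 0.3}    # stand-in shale values

cfs = {m: contamination_factor(measured[m], background[m]) for m in measured}
print({m: round(v, 2) for m, v in cfs.items()})
print(round(geoaccumulation_index(measured["Pb"], background["Pb"]), 2))
print(round(pollution_load_index(list(cfs.values())), 2))
```

A CF above 6 denotes very strong contamination and an Igeo above 2 moderate-to-strong contamination, which is how the study's Pb values are classified.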
Procedia PDF Downloads 109