Search results for: applications of big data
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 29429

25739 A Novel Method for Face Detection

Authors: H. Abas Nejad, A. R. Teymoori

Abstract:

Facial expression recognition is one of the open problems in computer vision. Robust neutral face recognition in real time is a major challenge for supervised learning based facial expression recognition methods, because supervised methods cannot accommodate all the appearance variability across faces with respect to race, pose, lighting, facial biases, etc., in a limited amount of training data. Moreover, processing every frame to classify emotions is unnecessary, since the user stays neutral for the majority of the time in typical applications such as video chat or photo album/web browsing. Detecting the neutral state at an early stage and bypassing those frames in emotion classification saves computational power. In this work, we propose a lightweight neutral vs. emotion classification engine that acts as a preprocessor to traditional supervised emotion classification approaches. It dynamically learns the neutral appearance at Key Emotion (KE) points using a textural statistical model constructed from a set of reference neutral frames for each user. The proposed method is made robust to various types of user head motion by accounting for affine distortions based on the textural statistical model. Robustness to dynamic shifts of the KE points is achieved by evaluating similarities on a subset of neighbourhood patches around each KE point, using prior information about the directionality of the specific facial action units acting on that point. As a result, the proposed method improves emotion recognition (ER) accuracy and simultaneously reduces the computational complexity of the ER system, as validated on multiple databases.
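
The abstract leaves the implementation details open; the sketch below illustrates only the core test, comparing textural statistics at KE points against a per-user neutral reference, using a basic 8-neighbour Local Binary Pattern and hypothetical patch sizes and thresholds. The paper's full pipeline (Constrained Local Model fitting, Procrustes alignment, affine compensation) is not reproduced.

```python
import numpy as np

def lbp_histogram(patch):
    """Basic 8-neighbour Local Binary Pattern histogram of a grayscale patch."""
    c = patch[1:-1, 1:-1]
    neighbours = [patch[:-2, :-2], patch[:-2, 1:-1], patch[:-2, 2:],
                  patch[1:-1, 2:], patch[2:, 2:], patch[2:, 1:-1],
                  patch[2:, :-2], patch[1:-1, :-2]]
    codes = np.zeros(c.shape, dtype=int)
    for bit, n in enumerate(neighbours):
        codes += (n >= c).astype(int) << bit
    hist, _ = np.histogram(codes, bins=256, range=(0, 256))
    return hist / max(hist.sum(), 1)

def chi_square(h1, h2, eps=1e-10):
    return 0.5 * np.sum((h1 - h2) ** 2 / (h1 + h2 + eps))

def is_neutral(frame, ke_points, neutral_model, patch=16, thresh=0.25):
    """Declare a frame neutral when the mean LBP distance between each
    KE-point patch and its stored neutral reference histogram is small.
    `patch` and `thresh` are illustrative values, not tuned ones."""
    half = patch // 2
    dists = [chi_square(lbp_histogram(frame[y - half:y + half,
                                            x - half:x + half]), ref)
             for (x, y), ref in zip(ke_points, neutral_model)]
    return float(np.mean(dists)) < thresh
```

Frames classified as neutral would then bypass the heavier supervised emotion classifier, which is where the computational saving comes from.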

Keywords: neutral vs. emotion classification, Constrained Local Model, Procrustes analysis, Local Binary Pattern Histogram, statistical model

Procedia PDF Downloads 333
25738 Functionalized Nanoporous Ceramic Membranes for Electrodialysis Treatment of Harsh Wastewater

Authors: Emily Rabe, Stephanie Candelaria, Rachel Malone, Olivia Lenz, Greg Newbloom

Abstract:

Electrodialysis (ED) is a well-developed technology for ion removal in a variety of applications. However, many industries generate harsh wastewater streams that are incompatible with traditional ion exchange membranes. Membrion® has developed novel ceramic-based ion exchange membranes (IEMs) offering several advantages over traditional polymer membranes: high performance at low pH, chemical resistance to oxidizers, and a rigid structure that minimizes swelling. These membranes are synthesized with our patented silane-based sol-gel techniques. The pore size, shape, and network structure are engineered through a molecular self-assembly process in which thermodynamic driving forces are used to direct where and how pores form. Either cationic or anionic groups can be added within the membrane nanopore structure to create cation- and anion-exchange membranes. The ceramic IEMs are produced on a roll-to-roll manufacturing line with low-temperature processing. Membrane performance testing is conducted using in-house permselectivity, area-specific resistance, and ED stack testing setups. Ceramic-based IEMs show performance comparable to traditional IEMs and offer some unique advantages. Long exposure to highly acidic solutions has a negligible impact on ED performance. Additionally, we have observed stable performance in the presence of strong oxidizing agents such as hydrogen peroxide. This stability is expected, as the ceramic backbone of these materials is already in a fully oxidized state. These data suggest that ceramic membranes, made using sol-gel chemistry, could be an ideal solution for acidic and/or oxidizing wastewater streams from processes such as semiconductor manufacturing and mining.

Keywords: ion exchange, membrane, silane chemistry, nanostructure, wastewater

Procedia PDF Downloads 81
25737 Joint Probability Distribution of Extreme Water Level with Rainfall and Temperature: Trend Analysis of Potential Impacts of Climate Change

Authors: Ali Razmi, Saeed Golian

Abstract:

Climate change is known to have the potential to adversely impact hydrologic patterns for variables such as rainfall, maximum and minimum temperature, and sea level rise. The long-term averages of these climate variables could change over time due to climate change impacts. In this study, trend analysis was performed on rainfall, maximum and minimum temperature, and water level data for a coastal area in Manhattan, New York City (Central Park and Battery Park stations) to investigate whether there is a significant change in the data mean. The partial Mann-Kendall test was used for trend analysis. Frequency analysis was then performed on the data using common probability distribution functions such as the Generalized Extreme Value (GEV), normal, log-normal and log-Pearson distributions, with goodness-of-fit tests such as Kolmogorov-Smirnov used to determine the most appropriate distributions. In flood frequency analysis, rainfall and water level data are often investigated separately. However, in determining flood zones, simultaneous consideration of rainfall and water level in the frequency analysis can have a considerable effect on floodplain delineation (flood extent and depth). The present study performs flood frequency analysis using the joint probability distribution of rainfall and storm surge. First, the correlation between the considered variables was investigated. The joint probability distribution of extreme water level and temperature was also examined, to assess how global warming could affect sea level flooding impacts. Copula functions were fitted to the data, and the joint probability of water level with rainfall and with temperature for recurrence intervals of 2, 5, 25, 50, 100, 200, 500, 600 and 1000 years was determined and compared with the severity of the individual events. The trend analysis showed an increase in the long-term average of the data that could be attributed to climate change impacts. The GEV distribution was found to be the most appropriate function to fit the extreme climate variables. The results of the joint probability distribution analysis confirmed the necessity of incorporating both rainfall and water level data in flood frequency analysis.
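
The abstract does not state which copula family was ultimately selected, so the sketch below assumes a Gaussian copula as one common candidate: the copula correlation is estimated from rank-based pseudo-observations, and a joint exceedance probability (e.g., rainfall and surge both exceeding their 100-year levels) is evaluated by Monte Carlo.

```python
import numpy as np
from scipy import stats

def to_uniform(x):
    """Empirical probability-integral transform (pseudo-observations)."""
    return stats.rankdata(x) / (len(x) + 1)

def fit_gaussian_copula(x, y):
    """Estimate the Gaussian-copula correlation from two marginal samples."""
    z1 = stats.norm.ppf(to_uniform(x))
    z2 = stats.norm.ppf(to_uniform(y))
    return np.corrcoef(z1, z2)[0, 1]

def joint_exceedance(rho, p_x, p_y, n_mc=200_000, seed=0):
    """P(X and Y both exceed their marginal levels with exceedance
    probabilities p_x, p_y), by Monte Carlo under a Gaussian copula."""
    rng = np.random.default_rng(seed)
    z = rng.multivariate_normal([0.0, 0.0], [[1.0, rho], [rho, 1.0]], n_mc)
    u = stats.norm.cdf(z)
    return np.mean((u[:, 0] > 1 - p_x) & (u[:, 1] > 1 - p_y))

# e.g. probability that the 100-year rainfall and the 100-year water
# level are exceeded in the same year:
# rho = fit_gaussian_copula(rainfall_annual_max, water_level_annual_max)
# p_joint = joint_exceedance(rho, 1 / 100, 1 / 100)
```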

Keywords: climate change, climate variables, copula, joint probability

Procedia PDF Downloads 354
25736 Hydrometallurgical Processing of a Nigerian Chalcopyrite Ore

Authors: Alafara A. Baba, Kuranga I. Ayinla, Folahan A. Adekola, Rafiu B. Bale

Abstract:

Due to the increasing demand for, and diverse applications of, copper oxide (as a pigment in ceramics, in cuprammonium hydroxide solution for rayon, as a p-type semiconductor, in dry cell battery production, and in the safe disposal of hazardous materials), a study of the hydrometallurgical operations of leaching, solvent extraction and precipitation for the recovery of copper and the production of high-grade copper oxide from a Nigerian chalcopyrite ore in chloride media has been undertaken. For a given set of experimental parameters with respect to acid concentration, reaction temperature and particle size, the leaching investigation showed that ore dissolution increases with increasing acid concentration and temperature and with decreasing particle diameter at moderate stirring. The kinetic data were analyzed and found to follow a diffusion-controlled mechanism. At optimal conditions, the extent of ore dissolution reached 94.3%. The recovery of the total copper from the hydrochloric acid-leached chalcopyrite ore was undertaken by solvent extraction and precipitation techniques, prior to beneficiation of the purified solution as copper oxide. The leach liquor was first purified by precipitating total iron and manganese using Ca(OH)2, with H2O2 as oxidizer, at pH 3.5 and 4.25, respectively. An extraction efficiency of 97.3% of total copper was obtained with 0.2 mol/L dithizone in kerosene at 25±2 °C within 40 minutes, from which ≈98% of the Cu in the loaded organic phase was successfully stripped by 0.1 mol/L HCl solution. The beneficiation of the recovered pure copper solution was carried out by crystallization through alkali addition, followed by calcination at 600 °C to obtain high-grade copper oxide (tenorite, CuO: 05-0661). Finally, a simple hydrometallurgical scheme for the operational extraction procedure, amenable to industrial utilization and economic sustainability, is provided.
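
A standard check of a diffusion-controlled mechanism is linearity of the shrinking-core, product-layer diffusion expression in time; the sketch below fits it to hypothetical leaching data (the measured conversions from this study are not reproduced).

```python
import numpy as np

def diffusion_model(X):
    """Shrinking-core, product-layer diffusion control:
    g(X) = 1 - 3(1 - X)**(2/3) + 2(1 - X) = k * t"""
    return 1 - 3 * (1 - X) ** (2 / 3) + 2 * (1 - X)

# hypothetical leaching data: time (min) and fraction dissolved
t = np.array([10, 20, 40, 60, 90, 120], dtype=float)
X = np.array([0.18, 0.31, 0.52, 0.66, 0.81, 0.90])

g = diffusion_model(X)
k, _ = np.polyfit(t, g, 1)          # slope = apparent rate constant
r2 = np.corrcoef(t, g)[0, 1] ** 2   # linearity check for the mechanism
print(f"k = {k:.4e} min^-1, R^2 = {r2:.3f}")
```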

Keywords: chalcopyrite ore, Nigeria, copper, copper oxide, solvent extraction

Procedia PDF Downloads 385
25735 Site Specific Nutrient Management Need in India Now

Authors: A. H. Nanher, N. P. Singh, Shashidhar Yadav, Sachin Tyagi

Abstract:

The agricultural production system is the outcome of a complex interaction of seed, soil, water and agro-chemicals (including fertilizers); judicious management of all these inputs is therefore essential for the sustainability of such a complex system. Precision agriculture gives farmers the ability to use crop inputs, including fertilizers, pesticides, tillage and irrigation water, more effectively. More effective use of inputs means greater crop yield and/or quality without polluting the environment. The focus on enhancing productivity during the Green Revolution, coupled with a disregard for proper input management and for ecological impacts, has resulted in environmental degradation. This study evaluates a new approach, site-specific nutrient management (SSNM). Large variation in initial soil fertility characteristics and in the indigenous supply of N, P, and K was observed among fields. Field- and season-specific NPK applications were calculated by accounting for the indigenous nutrient supply, yield targets, and nutrient demand as a function of the interactions between N, P, and K; a sketch of this balance is given below. Nitrogen applications were fine-tuned based on season-specific rules and field-specific monitoring of crop N status. The performance of SSNM did not differ significantly between high-yielding and low-yielding climatic seasons, but improved over time, with larger benefits observed in the second year. Future strategies for nutrient management in intensive rice systems must become more site-specific and dynamic in order to manage spatially and temporally variable resources, based on a quantitative understanding of the congruence between nutrient supply and crop demand. The SSNM concept has demonstrated promising agronomic and economic potential. It can be used for managing plant nutrients at any scale, ranging from a general recommendation for homogeneous management of a larger domain to true management of between-field variability. Assessment of pest profiles in FFP and SSNM plots suggests that SSNM may also reduce pest incidence, particularly diseases that are often associated with excessive N use or unbalanced plant nutrition.
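
A minimal sketch of the field-specific rate calculation, assuming a simple nutrient balance with purely illustrative coefficients; actual SSNM parameters are crop- and site-calibrated.

```python
def fertilizer_rate(yield_target_t_ha, demand_kg_per_t,
                    indigenous_supply_kg_ha, recovery_efficiency):
    """Field-specific nutrient rate from the SSNM balance:
    (crop demand - indigenous supply), scaled by recovery efficiency.
    All coefficients below are illustrative, not calibrated values."""
    demand = yield_target_t_ha * demand_kg_per_t
    deficit = max(demand - indigenous_supply_kg_ha, 0.0)
    return deficit / recovery_efficiency

# e.g. 6 t/ha rice target, ~15 kg N per tonne of grain,
# 40 kg/ha indigenous N supply, 50% recovery of applied N:
n_rate = fertilizer_rate(6.0, 15.0, 40.0, 0.5)   # -> 100 kg N/ha
```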

Keywords: nutrient, pesticide, crop, yield

Procedia PDF Downloads 422
25734 Estimation of Source Parameters Using Source Parameters Imaging Method From Digitised High Resolution Airborne Magnetic Data of a Basement Complex

Authors: O. T. Oluriz, O. D. Akinyemi, J. A. Olowofela, O. A. Idowu, S. A. Ganiyu

Abstract:

This study used aeromagnetic data, which record variations in the magnitude of the Earth's magnetic field, to detect local changes in the properties of the underlying geology. The aeromagnetic data (Sheet No. 261) were acquired in 2009 from the archives of the Nigeria Geological Survey Agency. The study presents the estimation of source parameters within an area of about 3,025 square kilometres in Ibadan and its environs in Oyo State, southwestern Nigeria. The study area belongs to part of the basement complex of southwestern Nigeria. Estimation of source parameters from the aeromagnetic data was achieved through the application of the source parameters imaging (SPI) technique, which provides delineation, depth, dip contact, susceptibility contrast and mineral potential of magnetic signatures within the region. The depth to the magnetic sources in the area ranges from 0.675 km to 4.48 km; the estimated depth limit to shallow sources is 0.695 km and the depth to deep sources is 4.48 km. The apparent susceptibility values obtained over the entire study area range between 0.005 and 0.01 (SI). This study has shown that the magnetic susceptibility within the study area is controlled mainly by superparamagnetic minerals.
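
As an illustration of the SPI principle only, the sketch below estimates depth from the peak local wavenumber along a single 1-D profile, using the Hilbert transform for the vertical derivative; grid-based SPI on real aeromagnetic data involves 2-D derivatives and model assumptions not shown here.

```python
import numpy as np
from scipy.signal import hilbert

def spi_depth_profile(T, dx):
    """Source Parameters Imaging on a single magnetic profile T(x).
    For 2-D (profile) sources the vertical derivative is the Hilbert
    transform of the horizontal derivative; depth to a contact-like
    source is 1 / (peak local wavenumber)."""
    dTdx = np.gradient(T, dx)
    dTdz = np.imag(hilbert(dTdx))                   # vertical derivative
    theta = np.arctan2(dTdz, dTdx)                  # tilt angle
    k = np.abs(np.gradient(np.unwrap(theta), dx))   # local wavenumber
    i = np.argmax(k)                                # anomaly position
    return i * dx, 1.0 / k[i]                       # location, depth
```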

Keywords: aeromagnetic, basement complex, meta-sediment, precambrian

Procedia PDF Downloads 426
25733 FRATSAN: A New Software for Fractal Analysis of Signals

Authors: Hamidreza Namazi

Abstract:

Fractal analysis assesses the fractal characteristics of data. It comprises several methods for assigning fractal characteristics to a dataset, which may be theoretical or may be a pattern or signal extracted from phenomena such as natural geometric objects, sound, market fluctuations, heart rates, digital images, molecular motion and networks. Fractal analysis is now widely used in all areas of science. An important limitation of fractal analysis is that arriving at an empirically determined fractal dimension does not necessarily prove that a pattern is fractal; other essential characteristics have to be considered. For this purpose, a Visual C++ based software package called FRATSAN (FRActal Time Series ANalyser) was developed, which extracts information from signals through three measures: the fractal dimension, Jeffrey's measure and the Hurst exponent. After computing these measures, the software plots a graph for each one. Besides computing the three measures, the software can classify whether or not a signal is fractal. The software uses a dynamic method of analysis for all the measures: a sliding window whose width equals 10% of the total number of data entries is moved one data entry at a time to obtain each measure, as sketched below. This makes the computation very sensitive to slight changes in the data, giving the user an acute analysis. To test the performance of the software, a set of EEG signals was given as input and the results were computed and plotted. The software is useful not only for fundamental fractal analysis of signals but also for other purposes. For instance, by analysing the Hurst exponent plot of an EEG signal in patients with epilepsy, the onset of a seizure can be predicted from sudden changes in the plot.
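
A minimal sketch of the sliding-window scheme for one of the three measures, the Hurst exponent estimated by rescaled-range (R/S) analysis, with the 10% window width described above. FRATSAN itself is written in Visual C++; Python is used here only for brevity.

```python
import numpy as np

def hurst_rs(x):
    """Hurst exponent by rescaled-range (R/S) analysis."""
    n = len(x)
    sizes = np.unique(np.logspace(3, np.log2(n // 2), 10, base=2).astype(int))
    rs = []
    for m in sizes:
        chunks = x[: n // m * m].reshape(-1, m)
        dev = np.cumsum(chunks - chunks.mean(axis=1, keepdims=True), axis=1)
        R = dev.max(axis=1) - dev.min(axis=1)
        S = chunks.std(axis=1)
        rs.append(np.mean(R[S > 0] / S[S > 0]))
    H, _ = np.polyfit(np.log(sizes), np.log(rs), 1)   # slope = H
    return H

def sliding_hurst(signal, frac=0.10):
    """FRATSAN-style sliding window: width = 10% of the record,
    moved one sample at a time, giving a Hurst-exponent time series."""
    w = max(int(len(signal) * frac), 64)
    return np.array([hurst_rs(signal[i:i + w])
                     for i in range(len(signal) - w + 1)])
```

Sudden jumps in the resulting Hurst-exponent series are the kind of feature the abstract suggests could flag seizure onset in EEG records.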

Keywords: EEG signals, fractal analysis, fractal dimension, Hurst exponent, Jeffrey's measure

Procedia PDF Downloads 463
25732 An Investigation of Differential Item and Test Functioning of Scholastic Aptitude Test 2011 (SWUSAT 2011)

Authors: Ruangdech Sirikit

Abstract:

The purpose of this study was to analyse the differential item functioning and differential test functioning of the SWUSAT aptitude test, classified by sex. The data used in this research are secondary data from the Srinakharinwirot University Scholastic Aptitude Test 2011 (SWUSAT 2011). The SWUSAT test consists of four subjects: a verbal ability test, a number ability test, a reasoning ability test and a spatial ability test. The data analysis was carried out in two steps: first, descriptive statistics were computed; second, differential item functioning (DIF) and differential test functioning (DTF) were analysed using the DIFAS program. The results for all 10 tests in 2011 were as follows. Sex-related DIF was found in all 10 tests, with the percentage of items exhibiting DIF ranging from 10% to 46.67%. In four tests most of the DIF items favoured the female group, in three tests most favoured the male group, and in three tests the numbers of items favouring each group were equal. For differential test functioning (DTF), eight tests showed a small DIF effect variance.
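
The abstract does not name the statistic DIFAS computes, so the sketch below illustrates DIF detection with the Mantel-Haenszel common odds ratio, a standard measure, matching examinees on total-score strata.

```python
import numpy as np

def mantel_haenszel_or(correct, group, total_score, n_strata=5):
    """Mantel-Haenszel common odds ratio for one item.
    correct: 1/0 item responses; group: 0 = reference, 1 = focal;
    total_score: the matching (ability) criterion."""
    correct = np.asarray(correct, bool)
    group = np.asarray(group)
    edges = np.quantile(total_score, np.linspace(0, 1, n_strata + 1)[1:-1])
    stratum = np.digitize(total_score, edges)
    num = den = 0.0
    for k in range(n_strata):
        s = stratum == k
        ref, foc = correct[s & (group == 0)], correct[s & (group == 1)]
        n_t = len(ref) + len(foc)
        if n_t == 0:
            continue
        a, b = ref.sum(), len(ref) - ref.sum()   # reference right / wrong
        c, d = foc.sum(), len(foc) - foc.sum()   # focal right / wrong
        num += a * d / n_t
        den += b * c / n_t
    return num / den     # OR > 1: the item favours the reference group
```

The ETS delta scale, -2.35·ln(OR), is a common effect-size transform of this ratio for classifying DIF severity.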

Keywords: differential item functioning, differential test functioning, SWUSAT, aptitude test

Procedia PDF Downloads 605
25731 Privacy Preservation Concerns and Information Disclosure on Social Networks: An Ongoing Research

Authors: Aria Teimourzadeh, Marc Favier, Samaneh Kakavand

Abstract:

The emergence of social networks has revolutionized the exchange of information. Every behavior on these platforms contributes to the generation of data, known as social network data, that are processed, stored and published by the social network service providers. Hence, it is vital to investigate the role these platforms play in handling user data, with attention to privacy measures, especially as increasing numbers of individuals and organizations engage with virtual platforms without being aware that data on their position, connections and behavior are uncovered and used by third parties. Performing analytics on social network datasets may result in the disclosure of confidential information about the individuals or organizations that are members of these virtual environments. Analyzing separate datasets can reveal private information about relationships, interests and more, especially when the datasets are analyzed jointly; breaches of privacy can be the result of such analysis. Addressing these privacy concerns requires an understanding of the nature of the data being accumulated, of the relevant data privacy regulations, and of the motivations for disclosing personal information on social network platforms. This paper highlights how users' online information is shaped by social factors, and the extent to which users are concerned about the future use of their personal information by organizations. Firstly, this research presents a short literature review on the structure of a network and the concept of privacy in online social networks. Secondly, the factors of user behavior related to privacy protection and self-disclosure in these virtual communities are presented; in other words, we seek to demonstrate the impact of the identified variables on user information disclosure, which could be taken into account to explain the privacy preservation of individuals on social networking platforms. Thirdly, a few research directions are discussed for new researchers addressing this topic.

Keywords: information disclosure, privacy measures, privacy preservation, social network analysis, user experience

Procedia PDF Downloads 276
25730 Characteristic Function in Estimation of Probability Distribution Moments

Authors: Vladimir S. Timofeev

Abstract:

In this article, the problem of estimating distributional moments is considered. A new approach to moment estimation based on the characteristic function is proposed. Using a statistical simulation technique, the author shows that the new approach has some robustness properties. The derivatives of the characteristic function are calculated by numerical differentiation. The results obtained confirm that the idea has a certain working efficiency and can be recommended for statistical applications.
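
The estimator can be illustrated in a few lines: the empirical characteristic function is differentiated numerically at t = 0, and the moments follow from the identities E[X] = -i*phi'(0) and E[X^2] = -phi''(0). The step size h below is an illustrative choice.

```python
import numpy as np

def ecf(t, x):
    """Empirical characteristic function phi(t) = E[exp(i t X)]."""
    return np.mean(np.exp(1j * t * x))

def cf_moments(x, h=1e-3):
    """First two moments from central-difference derivatives of the
    ECF at t = 0: E[X] = -i phi'(0), E[X^2] = -phi''(0)."""
    d1 = (ecf(h, x) - ecf(-h, x)) / (2 * h)
    d2 = (ecf(h, x) - 2 * ecf(0.0, x) + ecf(-h, x)) / h**2
    mean = (-1j * d1).real
    second = (-d2).real
    return mean, second - mean**2       # mean and variance

# the ECF averages bounded terms exp(i t x), which is what gives the
# estimator some robustness to outliers relative to raw sample moments
x = np.random.default_rng(1).normal(5.0, 2.0, 10_000)
print(cf_moments(x))   # close to (5, 4)
```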

Keywords: characteristic function, distributional moments, robustness, outlier, statistical estimation problem, statistical simulation

Procedia PDF Downloads 500
25729 The Current Status of Middle Class Internet Use in China: An Analysis Based on the Chinese General Social Survey 2015 Data and Semi-Structured Investigation

Authors: Abigail Qian Zhou

Abstract:

In today's China, the well-educated middle class, with stable jobs and above-average incomes, is the driving force behind its Internet society. Through an analysis of data from the 2015 Chinese General Social Survey and 50 interviews, this study investigates this group's Internet usage. The findings demonstrate that daily life among members of this socioeconomic group is closely tied to the Internet. For the Chinese middle class, the Internet is used to socialize, to entertain themselves and others, to search for and share information, and to build their identities. The empirical results of this study provide a reference, supported by factual data, for enterprises seeking to target the Chinese middle class through online marketing efforts.

Keywords: middle class, Internet use, network behaviour, online marketing, China

Procedia PDF Downloads 111
25728 Nowcasting Indonesian Economy

Authors: Ferry Kurniawan

Abstract:

In this paper, we nowcast quarterly output growth in Indonesia by exploiting higher-frequency monthly indicators in a mixed-frequency factor model that combines quarterly and monthly data. Nowcasting quarterly GDP is particularly relevant for the central bank of Indonesia, which sets the policy rate at its monthly Board of Governors meeting; one important step in that process is the assessment of the current state of the economy. An accurate and up-to-date quarterly GDP nowcast, refreshed every time new monthly information becomes available, is therefore of clear interest to the central bank, for example because the initial assessment of the current state of the economy, including the nowcast, is used as input for longer-term forecasts. We consider a small-scale mixed-frequency factor model to produce the nowcasts. In particular, we specify variables as year-on-year growth rates, so the relation between quarterly and monthly data is expressed in year-on-year growth rates. To assess the performance of the model, we compare the nowcasts with two other approaches: an autoregressive model (a benchmark that is often difficult to beat when forecasting output growth) and Mixed Data Sampling (MIDAS) regression. Both the mixed-frequency factor model and the MIDAS nowcasts exploit the same set of monthly indicators, so the performance of the two approaches can be compared directly. To preview the results, we find that exploiting monthly indicators with the mixed-frequency factor model and MIDAS regression improves nowcast accuracy over a benchmark autoregressive model that uses only quarterly data. However, it is not clear whether MIDAS or the mixed-frequency factor model is better: neither set of nowcasts encompasses the other, suggesting that both are valuable in nowcasting GDP but neither is sufficient. By combining the two individual nowcasts, we find that the combination not only increases accuracy relative to the individual nowcasts but also lowers the risk of the worst individual performance.
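
The combination scheme is not spelled out in the abstract, so the sketch below uses inverse-MSE weighting, one common way of combining two nowcast series; all figures are hypothetical.

```python
import numpy as np

def combine_nowcasts(y, nc_factor, nc_midas, window=8):
    """Inverse-MSE weighted combination of two nowcast series, with
    weights computed on an initial evaluation window of realised data."""
    e1 = (y[:window] - nc_factor[:window]) ** 2
    e2 = (y[:window] - nc_midas[:window]) ** 2
    w1 = (1 / e1.mean()) / (1 / e1.mean() + 1 / e2.mean())
    return w1 * nc_factor + (1 - w1) * nc_midas

# hypothetical year-on-year GDP growth (per cent) and the two nowcasts
y      = np.array([5.0, 5.1, 4.9, 5.2, 5.0, 4.8, 5.1, 5.0, 5.2])
factor = np.array([5.1, 5.0, 4.8, 5.3, 5.1, 4.9, 5.0, 5.1, 5.1])
midas  = np.array([4.8, 5.2, 5.0, 5.1, 4.9, 4.7, 5.2, 4.9, 5.3])
print(combine_nowcasts(y, factor, midas))
```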

Keywords: nowcasting, mixed-frequency data, factor model, nowcasts combination

Procedia PDF Downloads 327
25721 A Secure System for Handling Information from Heterogeneous Sources

Authors: Shoohira Aftab, Hammad Afzal

Abstract:

Information integration is a well-known procedure for providing a consolidated view of sets of heterogeneous information sources. It not only enables better statistical analysis of information but also allows users to query without any knowledge of the underlying heterogeneous sources. The problem of providing a consolidated view can be handled using semantic data (information stored in such a way that it is understandable by machines and can be integrated without manual human intervention). However, integrating information using semantic web technology without any access management in place results in increased privacy and confidentiality concerns. In this research, we designed and developed a framework that allows information in heterogeneous formats to be consolidated, thus resolving the issue of interoperability. We also devised an access control system for defining explicit privacy constraints. We designed and applied our framework to both semantic and non-semantic data from heterogeneous sources. Our approach is validated using scenario-based testing.
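
A minimal sketch of the idea, reduced to a toy triple store in which clearance labels stand in for the explicit privacy constraints; the framework's actual semantic-web machinery (ontologies, query rewriting, etc.) is not reproduced.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Triple:
    subject: str
    predicate: str
    obj: str
    min_clearance: int = 0      # explicit privacy constraint on the triple

class SecureStore:
    """Toy consolidated store: facts from heterogeneous sources are
    normalised to (s, p, o) triples and filtered by caller clearance."""
    def __init__(self):
        self.triples = []

    def add(self, triple):
        self.triples.append(triple)

    def query(self, clearance, predicate=None):
        return [t for t in self.triples
                if t.min_clearance <= clearance
                and (predicate is None or t.predicate == predicate)]

store = SecureStore()
store.add(Triple("patient:42", "hasName", "J. Doe", min_clearance=2))
store.add(Triple("patient:42", "hasCity", "Rawalpindi"))
print(store.query(clearance=0))     # returns only the non-sensitive triple
```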

Keywords: information integration, semantic data, interoperability, security, access control system

Procedia PDF Downloads 345
25726 The Role of Financial Literacy in Driving Consumer Well-Being

Authors: Amin Nazifi, Amir Raki, Doga Istanbulluoglu

Abstract:

The incorporation of technological advancements into financial services, commonly referred to as Fintech, is primarily aimed at promoting services that are accessible, convenient and inclusive, thereby benefiting both consumers and businesses. Fintech services employ a variety of technologies, including Artificial Intelligence (AI), blockchain and big data, to enhance the efficiency and productivity of traditional services. Cryptocurrency, a component of Fintech, is projected to be a trillion-dollar industry, with over 320 million consumers globally investing in various forms of cryptocurrencies. However, these potentially transformative services can also lead to adverse outcomes: recent Fintech innovations have increasingly been linked to misconduct and disservice, with serious implications for consumer well-being. This could be attributed to the ease of access to Fintech, which enables adults to trade cryptocurrencies, shares and stocks via mobile applications. Little is known, however, about the darker aspects of technological advancements such as Fintech. Hence, this study aims to generate scholarly insights into the design of robust and resilient Fintech services that can add value to businesses and enhance consumer well-being. Using a mixed-method approach, the study will investigate the personal and contextual factors influencing consumers' adoption and usage of technology innovations and their impacts on consumer well-being. First, semi-structured interviews will be conducted with a sample of Fintech users until theoretical saturation is achieved. Subsequently, based on the findings of the first study, a quantitative study will be conducted to develop and empirically test the impact of these factors on consumer well-being, using an online survey with a sample of 300 participants experienced in using Fintech services. This study will contribute to the growing Transformative Service Research (TSR) literature by addressing the latest priorities in service research and shedding light on the impact of Fintech services on consumer well-being.

Keywords: consumer well-being, financial literacy, Fintech, service innovation

Procedia PDF Downloads 61
25725 Refractive Index, Excess Molar Volume and Viscometric Study of Binary Liquid Mixture of Morpholine with Cumene at 298.15 K, 303.15 K, and 308.15 K

Authors: B. K. Gill, Himani Sharma, V. K. Rattan

Abstract:

Experimental data on the refractive index, excess molar volume and viscosity of the binary mixture of morpholine with cumene over the whole composition range at 298.15 K, 303.15 K and 308.15 K and normal atmospheric pressure have been measured. The experimental data were used to compute the density, deviation in molar refraction, deviation in viscosity and excess Gibbs free energy of activation as functions of composition. The experimental viscosity data were correlated with empirical equations such as Grunberg-Nissan, the Heric correlation and the three-body McAllister equation. The excess thermodynamic properties were fitted to the Redlich-Kister polynomial equation, as sketched below. The variation of these properties with the composition and temperature of the binary mixtures is discussed in terms of intermolecular interactions.
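
The Redlich-Kister fit is linear in its coefficients, Y^E = x1*x2*sum_k A_k*(x1 - x2)^k, so a least-squares solve suffices; the data points below are illustrative, not the measured values from this study.

```python
import numpy as np

def fit_redlich_kister(x1, y_excess, order=3):
    """Least-squares fit of Y^E = x1*x2 * sum_k A_k (x1 - x2)^k."""
    x2 = 1 - x1
    basis = np.column_stack([x1 * x2 * (x1 - x2) ** k
                             for k in range(order + 1)])
    A, *_ = np.linalg.lstsq(basis, y_excess, rcond=None)
    return A

def redlich_kister(x1, A):
    x2 = 1 - x1
    return x1 * x2 * sum(a * (x1 - x2) ** k for k, a in enumerate(A))

# illustrative excess molar volumes (cm^3/mol) vs mole fraction
x1 = np.array([0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9])
vE = np.array([-0.10, -0.18, -0.24, -0.27, -0.28, -0.26, -0.21, -0.15, -0.08])
A = fit_redlich_kister(x1, vE)       # coefficients A_0 .. A_3
```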

Keywords: cumene, excess Gibbs free energy, excess molar volume, morpholine

Procedia PDF Downloads 324
25724 Anthropometric Data Variation within Gari-Frying Population

Authors: T. M. Samuel, O. O. Aremu, I. O. Ismaila, L. I. Onu, B. O. Adetifa, S. E. Adegbite, O. O. Olokoshe

Abstract:

The imperative of anthropometry in designing to fit cannot be overemphasized; of the essence is the variability of measurements among the population for which data are collected. In this paper, anthropometric data were collected for the design of a gari-frying facility, such that the work system would fit the gari-frying population in the southwestern states of Nigeria (Lagos, Ogun, Oyo, Osun, Ondo and Ekiti). Twenty-seven body dimensions were measured on 120 gari-frying processors. Statistical analysis was performed using the SPSS package to determine the mean, standard deviation, minimum, maximum and percentiles (2nd, 5th, 25th, 50th, 75th, 95th and 98th) of the different anthropometric parameters. A one-sample t-test was conducted to determine the variation within the population. The 50th percentiles of some of the anthropometric parameters were compared with those of other populations in the literature, and the correlation between the workers' age and body anthropometry was also investigated. The mean weight, height, shoulder height (sitting), eye height (standing) and eye height (sitting) are 63.37 kg, 1.57 m, 0.55 m, 1.45 m and 0.67 m, respectively. The results also show a high correlation with other populations and a statistically significant difference in the variability of the data within the population for all body dimensions measured. With a mean age of 42.36 years, the results show that age would be a poor indicator for estimating the anthropometry of this population.
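
The percentile table and one-sample t-test described above are straightforward to reproduce; the sketch below uses synthetic stature data and an assumed reference mean, purely for illustration.

```python
import numpy as np
from scipy import stats

# synthetic stature sample (m) standing in for the gari-frying population
stature = np.random.default_rng(0).normal(1.57, 0.06, 120)

# the design percentiles used in the study
pcts = [2, 5, 25, 50, 75, 95, 98]
table = dict(zip(pcts, np.round(np.percentile(stature, pcts), 3)))

# one-sample t-test of the mean against an assumed reference value
t, p = stats.ttest_1samp(stature, popmean=1.60)
print(table, t, p)
```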

Keywords: anthropometry, cassava processing, design to fit, gari-frying, workstation design

Procedia PDF Downloads 249
25723 Discovering Event Outliers for Drug as Commercial Products

Authors: Arunas Burinskas, Aurelija Burinskiene

Abstract:

On average, ten percent of drugs (commercial products) are not available in pharmacies due to shortage. A shortage event unbalances sales and requires a recovery period that is often too long. A critical issue is that pharmacies do not record potential sales transactions during the shortage and recovery periods, so the authors suggest estimating outliers during these periods. To shorten the recovery period, the authors suggest predicting average sales per sales day, which helps protect the data from downward or upward bias. The authors visualize outliers across different drugs and apply the Grubbs test for significance evaluation. The researched sample is 100 drugs over a one-month time frame. The authors found that products with high demand variability had outliers. Among the analysed drugs: (i) high-demand-variability drugs have a one-week shortage period, and the probability of facing a shortage is 69.23%; (ii) mid-demand-variability drugs have a three-day shortage period, and the likelihood of falling into deficit is 34.62%. To avoid shortage events and minimize the recovery period, real data must be established. Even though outlier detection methods exist for drug data cleaning, they have not been used to minimize the recovery period once a shortage has occurred. The authors use Grubbs' test, a real-life data cleaning method, for outlier adjustment, applied in the paper with a confidence level of 99%; in practice, Grubbs' test has been used to detect outliers in cancer drug data, with positive results reported. Grubbs' test detects outliers which exceed the boundaries of a normal distribution; the result is a probability that indicates the core of the actual sales data. The method measures the difference between the sample mean and the most extreme data point in units of the standard deviation, and detects one outlier at a time, with different probabilities, from a data set with an assumed normal distribution (see the sketch below). Based on approximation data, the authors constructed a framework for scaling potential sales and estimating outliers with the Grubbs test method. The suggested framework is applicable during the shortage event and recovery periods, and has practical value for minimizing the recovery period required after a shortage occurs.
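
A minimal sketch of the repeated two-sided Grubbs' test at the paper's 99% confidence level; the daily sales figures are hypothetical.

```python
import numpy as np
from scipy import stats

def grubbs_test(x, alpha=0.01):
    """Two-sided Grubbs' test for a single outlier (alpha = 0.01
    matches the paper's 99% confidence level). Returns the outlier's
    (index, value), or None if no point is significant."""
    n = len(x)
    mean, sd = np.mean(x), np.std(x, ddof=1)
    i = int(np.argmax(np.abs(x - mean)))
    G = np.abs(x[i] - mean) / sd
    t = stats.t.ppf(1 - alpha / (2 * n), n - 2)
    G_crit = (n - 1) / np.sqrt(n) * np.sqrt(t**2 / (n - 2 + t**2))
    return (i, x[i]) if G > G_crit else None

def remove_outliers(x, alpha=0.01):
    """Apply Grubbs' test repeatedly, one outlier at a time."""
    x = list(x)
    while (hit := grubbs_test(np.array(x), alpha)) is not None:
        x.pop(hit[0])
    return np.array(x)

daily_sales = [12, 14, 11, 13, 15, 12, 55, 13, 14, 12]  # hypothetical
print(remove_outliers(daily_sales))                     # 55 is removed
```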

Keywords: drugs, Grubbs' test, outlier, shortage event

Procedia PDF Downloads 129
25722 A Survey of Digital Health Companies: Opportunities and Business Model Challenges

Authors: Iris Xiaohong Quan

Abstract:

The global digital health market reached 175 billion U.S. dollars in 2019 and is expected to grow at about 25% CAGR to over 650 billion USD by 2025. Different terms such as digital health, e-health, mHealth and telehealth have been used in the field, which can sometimes cause confusion. The term digital health was originally introduced to refer specifically to the use of interactive media, tools, platforms, applications, and solutions that are connected to the Internet to address the health concerns of providers as well as consumers. While mHealth emphasizes the use of mobile phones in healthcare, telehealth means using technology to deliver clinical health services to patients remotely. According to the FDA, "the broad scope of digital health includes categories such as mobile health (mHealth), health information technology (IT), wearable devices, telehealth and telemedicine, and personalized medicine." Some researchers believe that digital health is nothing other than the cultural transformation healthcare has been going through in the 21st century because of digital health technologies that provide data to both patients and medical professionals. As digital health is burgeoning but research in the area is still inadequate, our paper aims to clear up this definitional confusion and provide an overall picture of digital health companies. We further investigate how business models are designed and differentiated in the emerging digital health sector. Both quantitative and qualitative methods are adopted in the research. For the quantitative analysis, our research data came from two databases, Crunchbase and CBInsights, which are well-recognized information sources for researchers, entrepreneurs, managers, and investors. We searched a few keywords in the Crunchbase database based on companies' self-descriptions: a search for "digital health" returned 941 unique results, "e-health" returned 167 companies, and "telehealth" 427. We also searched the CBInsights database for similar information. After merging the lists, removing duplicates and cleaning up the database, we arrived at a list of 1,464 digital health companies (a minimal merging sketch is given below). A qualitative method is used to complement the quantitative analysis: an in-depth case analysis of three successful unicorn digital health companies, to understand how business models evolve and to discuss the challenges faced in this sector. Our research returned some interesting findings. For instance, 86% of the digital health startups were founded in the decade since 2010, and 75% of digital health companies have fewer than 50 employees, almost 50% fewer than 10. This shows that digital health companies are relatively young and small in scale. On the business model analysis, while traditional healthcare businesses emphasize the so-called "3P" (patient, physicians, and payer), digital health companies extend this to "5P" by adding patents, a result of technology requirements (such as the development of artificial intelligence models), and platform, an effective value-creation approach that brings the stakeholders together. Our case analysis details the 5P framework and contributes to the extant knowledge on business models in the healthcare industry.
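
The exact matching rules used to merge the two databases are not described, so the sketch below shows one simple approach, deduplicating on a normalised company name; the rows are hypothetical.

```python
import pandas as pd

# hypothetical exports from the two databases
crunchbase = pd.DataFrame({"name": ["Livongo", "Hinge Health", "Ro "],
                           "source": "crunchbase"})
cbinsights = pd.DataFrame({"name": ["livongo", "Tempus", "Ro"],
                           "source": "cbinsights"})

merged = pd.concat([crunchbase, cbinsights], ignore_index=True)
merged["key"] = merged["name"].str.strip().str.lower()   # normalise names
companies = merged.drop_duplicates(subset="key").drop(columns="key")
print(len(companies))        # deduplicated company count
```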

Keywords: digital health, business models, entrepreneurship opportunities, healthcare

Procedia PDF Downloads 178
25721 Application of Flory Paterson’s Theory on the Volumetric Properties of Liquid Mixtures: 1,2-Dichloroethane with Aliphatic and Cyclic Ethers

Authors: Linda Boussaid, Farid Brahim Belaribi

Abstract:

The physico-chemical properties of liquid materials in industry in general, and in the chemical industries in particular, are a prerequisite for the design of equipment, for the resolution of specific problems (related to purification and separation techniques, risks in the transport of certain materials, etc.) and, therefore, for the production stage. Chloroalkanes and ethers are chemical families of industrial, theoretical and environmental interest. For example, these compounds are used in various applications in the chemical and pharmaceutical industries. In addition, they contribute to the particular thermodynamic behaviour (deviation from ideality, association, etc.) of certain mixtures, which constitutes a severe test for predictive theoretical models. Finally, due to worldwide environmental degradation, there is renewed interest in ethers, because some of their physico-chemical properties could contribute to lower pollution (ethers could be used as additives in aqueous fuels). This work is an experimental and theoretical thermodynamic study of the volumetric properties of binary liquid systems formed from compounds belonging to the chloroalkane and ether families. The densities and excess volumes of the systems studied were determined experimentally at different temperatures in the interval [278.15, 333.15] K and at atmospheric pressure, using an Anton Paar DMA 5000 vibrating-tube densimeter. This contribution of experimental data on the volumetric properties of the binary liquid mixtures of 1,2-dichloroethane with an ether, supplemented by an application of the Prigogine-Flory-Patterson (PFP) theoretical model, should contribute to enriching the thermodynamic database and to the further development of Flory's theory in its PFP version, for a better understanding of the thermodynamic behaviour of these binary liquid mixtures.
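
For reference, the excess molar volume follows directly from the measured mixture and pure-component densities; the numbers below are illustrative only, not values from this study.

```python
def excess_molar_volume(x1, rho_mix, M1, rho1, M2, rho2):
    """V^E = (x1*M1 + x2*M2)/rho_mix - x1*M1/rho1 - x2*M2/rho2,
    the standard expression from measured densities (g/cm^3)
    and molar masses (g/mol)."""
    x2 = 1 - x1
    return (x1 * M1 + x2 * M2) / rho_mix - x1 * M1 / rho1 - x2 * M2 / rho2

# illustrative inputs only: 1,2-dichloroethane (M = 98.96) with an
# ether such as tetrahydrofuran (M = 72.11); densities are placeholders
vE = excess_molar_volume(0.5, 1.05, 98.96, 1.246, 72.11, 0.889)
```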

Keywords: Prigogine-Flory-Patterson (PFP), volumetric properties, excess volume, ethers

Procedia PDF Downloads 86
25720 Simplified INS\GPS Integration Algorithm in Land Vehicle Navigation

Authors: Othman Maklouf, Abdunnaser Tresh

Abstract:

Land vehicle navigation is a subject of great interest today. The Global Positioning System (GPS) is the main navigation system for positioning in such applications, but GPS alone is incapable of providing continuous and reliable positioning because of its inherent dependency on external electromagnetic signals. Inertial navigation (INS) is the implementation of inertial sensors to determine the position and orientation of a vehicle. The availability of low-cost Micro-Electro-Mechanical-System (MEMS) inertial sensors now makes it feasible to develop an INS using an inertial measurement unit (IMU). INS has unbounded error growth, since the error accumulates at each step. Usually, GPS and INS are integrated in a loosely coupled scheme. With the development of low-cost MEMS inertial sensors and GPS technology, integrated INS/GPS systems are beginning to meet the growing demand for lower-cost, smaller, seamless navigation solutions for land vehicles. Although MEMS inertial sensors are very inexpensive compared to conventional sensors, their cost (especially that of MEMS gyros) is still not acceptable for many low-end civilian applications (for example, commercial car navigation or personal location systems). An efficient way to reduce the expense of these systems is to reduce the number of gyros and accelerometers, i.e., to use a partial IMU (ParIMU) configuration. For land vehicle use, the most important gyroscope is the vertical gyro, which senses the heading of the vehicle, together with two horizontal accelerometers for determining its velocity. This paper presents a field experiment with a low-cost strapdown ParIMU/GPS combination, with data post-processing for the determination of the 2-D components of position (trajectory), velocity and heading; a minimal sketch of the loosely coupled filter is given below. In the present approach, we have neglected Earth rotation and gravity variations because of the poor gyroscope sensitivity of our low-cost IMU and because of the relatively small area of the trajectory.
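
A minimal sketch of a loosely coupled filter of the kind described, with a constant-velocity model and illustrative noise settings; a real implementation would mechanise the ParIMU heading and specific forces rather than assume them.

```python
import numpy as np

# Loosely coupled 2-D position/velocity Kalman filter: the IMU
# mechanisation predicts, each GPS fix corrects.
dt = 0.1
F = np.array([[1, 0, dt, 0], [0, 1, 0, dt],
              [0, 0, 1, 0], [0, 0, 0, 1]], float)   # state transition
H = np.array([[1, 0, 0, 0], [0, 1, 0, 0]], float)    # GPS measures position
Q = np.eye(4) * 0.05       # process noise (large for a MEMS ParIMU)
R = np.eye(2) * 5.0        # GPS position noise (m^2), illustrative

x = np.zeros(4)            # [east, north, v_east, v_north]
P = np.eye(4) * 10.0

def predict(x, P, accel_en):
    """Dead-reckoning step driven by heading-resolved accelerations."""
    x = F @ x + np.array([0, 0, accel_en[0] * dt, accel_en[1] * dt])
    return x, F @ P @ F.T + Q

def gps_update(x, P, z):
    """Correct the drifting INS solution with a GPS position fix."""
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (z - H @ x)
    return x, (np.eye(4) - K @ H) @ P

x, P = predict(x, P, accel_en=(0.2, 0.0))
x, P = gps_update(x, P, z=np.array([0.5, 0.1]))
```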

Keywords: GPS, IMU, Kalman filter, materials engineering

Procedia PDF Downloads 412
25719 Assessment of Mortgage Applications Using Fuzzy Logic

Authors: Swathi Sampath, V. Kalaichelvi

Abstract:

The assessment of the risk posed by a borrower to a lender is one of the common problems that financial institutions have to deal with. Consumers vying for a mortgage are generally compared to each other by means of a number called the credit score, which is generated by applying a mathematical algorithm to information in the applicant's credit report. The higher the credit score, the lower the risk posed by the candidate, and the more attractive he or she is to the lender. The objective of the present work is to use fuzzy logic and linguistic rules to create a model that generates credit scores; a minimal sketch is given below.
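
A minimal sketch of such a model, with made-up membership functions and two Mamdani-style rules; the paper's actual linguistic rules and universes of discourse are not reproduced.

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function on [a, c] peaking at b."""
    return np.maximum(np.minimum((x - a) / (b - a + 1e-9),
                                 (c - x) / (c - b + 1e-9)), 0.0)

def credit_score(income_k, debt_ratio):
    """Two illustrative rules (all cut-offs are made up):
    R1: income HIGH and debt LOW  -> score GOOD
    R2: income LOW  or  debt HIGH -> score POOR"""
    inc_high = tri(income_k, 40, 90, 140)
    inc_low = tri(income_k, 0, 20, 50)
    debt_low = tri(debt_ratio, 0.0, 0.1, 0.35)
    debt_high = tri(debt_ratio, 0.3, 0.6, 1.0)

    r_good = min(inc_high, debt_low)        # rule firing strengths
    r_poor = max(inc_low, debt_high)

    s = np.linspace(300, 850, 551)          # FICO-like score range
    good = np.minimum(tri(s, 650, 800, 850), r_good)
    poor = np.minimum(tri(s, 300, 400, 600), r_poor)
    agg = np.maximum(good, poor)
    return float((s * agg).sum() / (agg.sum() + 1e-9))   # centroid

print(credit_score(income_k=75, debt_ratio=0.2))
```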

Keywords: credit scoring, fuzzy logic, mortgage, risk assessment

Procedia PDF Downloads 401
25718 Cloud Computing Impact on e-Government Adoption

Authors: Ali Elshabrawy

Abstract:

Cloud computing is expected to be important for e-government in the near future. Governments need it to solve some of their e-government, financial, infrastructure, legacy-system and integration problems. It reduces information technology (IT) infrastructure needs and support costs, and offers on-demand infrastructure and computational power as well as improved collaboration capabilities, which are important for the start-up and sustainability of e-government projects. Budget pressures will continue to drive more and more government IT to hybrid and even public clouds; more cooperation between cloud service providers and governmental agencies is expected, along with the development of governmental private and community clouds. The motivation to convince governments to use cloud computing services will put pressure on cloud service providers to cope with government requirements for interoperability, security standards, open data and integration between their cloud systems. There will be significant legal action arising out of governmental uses of cloud computing, and legislation addressing both IT and business needs and consumer fears and protections. Cloud computing is considered a revolution for IT and e-business in general, and for e-commerce and e-government in particular. Governments face increasing challenges regarding the IT infrastructure required for implementing e-government projects, as a result of the lack of financial resources allocated to such projects in developed and developing countries. Cloud computing can play a major role in addressing e-government challenges such as the lack of financial resources, of IT infrastructure, and of human resources trained to manage e-government applications, as well as interoperability and cost-efficiency challenges, provided that the security issues related to cloud computing that are critical for e-government projects can be solved. It is likely only a matter of time before cloud service providers find solutions that attract governments as major customers.

Keywords: cloud computing, e-government, adoption, supply side barriers, e-government requirements, challenges

Procedia PDF Downloads 343
25717 The Development of Research Based Model to Enhance Critical Thinking, Cognitive Skills and Culture and Local Wisdom Knowledge of Undergraduate Students

Authors: Nithipattara Balsiri

Abstract:

The purpose of this research was to develop an instructional model using research-based learning to enhance the critical thinking, cognitive skills, and culture and local wisdom knowledge of undergraduate students. The sample consisted of 307 undergraduate students. Critical thinking and cognitive skills tests were employed for data collection. Second-order confirmatory factor analysis, t-tests and one-way analysis of variance were employed for data analysis, using the SPSS and LISREL programs. The major research results were as follows: 1) the instructional model using research-based learning should consist of six sequential steps, namely (1) setting the research problem, (2) setting the research hypothesis, (3) data collection, (4) data analysis, (5) drawing the research conclusions and (6) applying the results to problem solving; and 2) after the treatment, undergraduate students achieved higher scores in critical thinking and cognitive skills than before the treatment, at the 0.05 level of significance.

Keywords: critical thinking, cognitive skills, culture and local wisdom knowledge

Procedia PDF Downloads 359
25716 A Case Study of Control of Blast-Induced Ground Vibration on Adjacent Structures

Authors: H. Mahdavinezhad, M. Labbaf, H. R. Tavakoli

Abstract:

In recent decades, the study and control of the destructive effects of blast-induced vibration in construction projects have received increasing attention, and several empirical equations for vibration prediction, as well as allowable vibration limits for various structures, have been presented. Researchers have developed a number of empirical equations to estimate the peak particle velocity (PPV), in which the empirical constants must be obtained at the explosion site by fitting data from experimental blasts. In this study, the most important of these equations were evaluated for the strong massive conglomerates around Dez Dam by collecting explosion data, including 30 particle velocities, 27 displacements, 27 vibration frequencies and 27 ground-vibration accelerations at different distances, recorded for two types of detonation systems, NONEL and electric. Analysis showed that the explosion data had the best correlation with the cube root of the explosive charge (R² = 0.8636), although overall the correlation coefficients do not differ greatly. To estimate the vibration in this project, the data were also regressed in other forms, which resulted in a new equation with a correlation coefficient of R² = 0.904; the fitting procedure is sketched below. Finally, given the importance of the structures studied, and in order to ensure no damage to adjacent structures, an application range was defined for each diagram: for distances of 0 to 70 m from the blast site the exponent n = 0.33 was suggested, and for distances greater than 70 m, n = 0.66.
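
The usual empirical form is PPV = K*(R/Q^(1/3))^(-n), fitted by least squares in log-log space; the monitoring records below are hypothetical, not the Dez Dam data.

```python
import numpy as np

# hypothetical monitoring records: charge per delay Q (kg),
# distance R (m), measured PPV (mm/s)
Q = np.array([50, 50, 100, 100, 200, 200, 400, 400], float)
R = np.array([30, 80, 40, 120, 60, 150, 90, 250], float)
ppv = np.array([42.0, 9.5, 55.0, 6.8, 60.0, 7.4, 58.0, 4.9])

sd = R / Q ** (1 / 3)                 # cube-root scaled distance
slope, intercept = np.polyfit(np.log(sd), np.log(ppv), 1)
K, n = np.exp(intercept), -slope      # PPV = K * (R / Q^(1/3))^(-n)

pred = K * sd ** (-n)
r2 = np.corrcoef(np.log(ppv), np.log(pred))[0, 1] ** 2
print(f"K = {K:.1f}, n = {n:.2f}, R^2 = {r2:.3f}")
```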

Keywords: blasting, blast-induced vibration, empirical equations, PPV, tunnel

Procedia PDF Downloads 123
25715 Thermoluminescence Study of Cu Doped Lithium Tetra Borate Samples Synthesized by Water/Solution Assisted Method

Authors: Swarnapriya Thiyagarajan, Modesto Antonio Sosa Aquino, Miguel Vallejo Hernandez, Senthilkumar Kalaiselvan Dhivyaraj, Jayaramakrishnan Velusamy

Abstract:

In this paper, lithium tetraborate (Li2B4O7) was prepared by a water/solution-assisted synthesis method. After synthesis, copper (Cu) was used to dope the Li2B4O7 in order to enhance its thermoluminescent properties. The heat-treatment parameters were 750 °C for 2 h and 150 °C for 2 h. The samples produced by the water-assisted method were doped at different percentages of Cu (0.02%, 0.04%, 0.06%, 0.08%, 0.12%, 0.5%, 0.1% and 1%). The characterization and identification of Li2B4O7 (undoped and doped) were carried out by four techniques: X-ray diffraction (XRD), scanning electron microscopy (SEM), photoluminescence (PL) and ultraviolet-visible spectroscopy (UV-Vis). As is evident from the XRD and SEM results, the formation of Li2B4O7 and Cu-doped Li2B4O7 was confirmed, as were the chemical composition and the morphology. The obtained XRD pattern was verified against the reference data for lithium tetraborate with tetragonal structure from the JCPDS. The glow curves of Li2B4O7 and Li2B4O7:Cu were obtained with a thermoluminescence (TLD) reader (Harshaw 3500). The pellets were irradiated with different doses (58 mGy, 100 mGy, 500 mGy and 945 mGy) using an X-ray source. Finally, the energy response was compared with that of TLD-100. The trapping parameters, i.e., the order of kinetics (b), the frequency factor (s) and the activation energy (E), were calculated using the peak-shape method; a sketch of this calculation is given below. In particular, Li2B4O7:Cu (0.1%) presents good glow curves at all doses. The experimental results showed that this Li2B4O7:Cu could have good potential applications in radiation dosimetry. The main purpose of this paper is to determine the effect of synthesis on the TL properties of Cu-doped lithium tetraborate.
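
A sketch of Chen's peak-shape expressions for the activation energy from the half-intensity temperatures of a glow peak; the glow-peak temperatures below are illustrative, while the constants are those of Chen's general-order formulas.

```python
K_BOLTZ = 8.617e-5   # Boltzmann constant, eV/K

def chen_activation_energy(T1, Tm, T2):
    """Chen's peak-shape method: activation energy E (eV) from the
    low- and high-temperature half-intensity points T1 < Tm < T2 (K).
    Returns the tau, delta and omega estimates; mu is the symmetry
    factor that also indicates the order of kinetics."""
    tau, delta, omega = Tm - T1, T2 - Tm, T2 - T1
    mu = delta / omega
    c_tau = 1.51 + 3.0 * (mu - 0.42); b_tau = 1.58 + 4.2 * (mu - 0.42)
    c_del = 0.976 + 7.3 * (mu - 0.42); b_del = 0.0
    c_om = 2.52 + 10.2 * (mu - 0.42); b_om = 1.0
    kTm2 = K_BOLTZ * Tm ** 2
    E_tau = c_tau * kTm2 / tau - b_tau * 2 * K_BOLTZ * Tm
    E_del = c_del * kTm2 / delta - b_del * 2 * K_BOLTZ * Tm
    E_om = c_om * kTm2 / omega - b_om * 2 * K_BOLTZ * Tm
    return E_tau, E_del, E_om

# hypothetical glow-peak temperatures (K)
print(chen_activation_energy(T1=430.0, Tm=460.0, T2=485.0))
```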

Keywords: dosimetry, irradiation, lithium tetraborate, thermoluminescence

Procedia PDF Downloads 269
25714 Engaging Students in Learning through Visual Demonstration Models in Engineering Education

Authors: Afsha Shaikh, Mohammed Azizur Rahman, Ibrahim Hassan, Mayur Pal

Abstract:

Student engagement in learning is directly affected by the learning methods available to students, such as videos showing applications of a concept or a practical demonstration. In the engineering discipline specifically, many challenging concepts can be simplified when they are connected to real-world scenarios. For this study, the concept of heat exchangers was used, as it is part of multiple engineering fields. To make the learning experience enjoyable and impactful, 3-D printed heat exchanger models were created for students to use while working on in-class activities and assignments. Students were encouraged to use the 3-D printed heat exchanger models to enhance their understanding of the theoretical concepts associated with their applications. To assess the effectiveness of the method, feedback was collected from undergraduate engineering students via an anonymous electronic survey. To make the feedback more realistic, unbiased and genuine, students spent nearly two to three weeks using the models in their in-class assignments. The impact of these tools on their learning was assessed through their performance on ungraded assignments as well as their discussions with peers. 'Having to apply the theory learned in class while discussing with peers on a class assignment creates a relaxed and stress-free learning environment in classrooms': this feedback was received from more than half of the students who took the survey, who also found the 3-D printed heat exchanger models easy to use. Among the many ways to enhance learning and make students more engaged through interactive models, this study sheds light on the importance of physical tools that help create a lasting mental representation in the minds of students. Moreover, in this technologically enhanced era, the concept of augmented reality was considered in this research: the eDrawings application was recommended to enhance the spatial understanding of engineering students, so they can see multiple views of the detailed 3-D models and cut through their different sides and angles to visualize them properly. eDrawings could be the next tool to implement in classrooms to enhance students' understanding of engineering concepts.

Keywords: student engagement, life-long-learning, visual demonstration, 3-D printed models, engineering education

Procedia PDF Downloads 109
25713 Development of a System for Fitting Clothes and Accessories Using Augmented Reality

Authors: Dinmukhamed T., Vassiliy S.

Abstract:

This article proposes a system for fitting clothes and accessories based on augmented reality. A logical data model has been developed that takes into account a decision-making module (colour, style, type, material, popularity, etc.) based on personal data (age, gender, weight, height, leg size, waist length, geolocation, photogrammetry, number of purchases of certain types of clothing, etc.) and statistical data from the purchase history (number of items, price, size, colour, style, etc.). In order to provide information to the user, it is also planned to develop an augmented reality system using a QR code. This system for selecting and fitting clothing and accessories based on augmented reality will be used in stores to reduce the time the buyer needs to decide on a choice of clothes.

Keywords: augmented reality, online store, decision-making module, QR code, clothing store, queue

Procedia PDF Downloads 152
25712 Study on the Effect of Coupling Fluid Compressible-Deformable Wall on the Flow of Molten Polymers

Authors: Mohamed Driouich, Kamal Gueraoui, Mohamed Sammouda

Abstract:

The main objective of this work is to establish a numerical code for studying the flow of molten polymers in deformable pipes. Using an iterative numerical method based on finite differences, we determine the profiles of the fluid velocity, the temperature and the apparent viscosity of the fluid; a minimal sketch of such an iteration is given below. The numerical code presented can also be applied to other industrial applications.
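
As an illustration of the approach, the sketch below iterates a finite-difference solution for steady pipe flow of a power-law melt, updating the apparent viscosity from the velocity gradient on each pass. Geometry, rheology and boundary treatment are simplified placeholders (the deformable-wall coupling and the temperature equation are omitted), and stiff cases may need under-relaxation.

```python
import numpy as np

# Steady, pressure-driven pipe flow of a power-law melt:
# (1/r) d/dr ( r * mu_app * du/dr ) = dP/dz,  mu_app = m |du/dr|**(n-1)
Rp, N = 0.01, 60                 # pipe radius (m), radial nodes
m_c, n_c = 3000.0, 0.4           # consistency (Pa s^n), power-law index
dPdz = -2.0e6                    # axial pressure gradient (Pa/m)
r = np.linspace(1e-6, Rp, N)
dr = r[1] - r[0]

u = np.zeros(N)
mu = np.full(N, m_c)             # initial apparent-viscosity guess
for _ in range(200):             # outer iteration on the viscosity
    A = np.zeros((N, N)); b = np.full(N, dPdz)
    for i in range(1, N - 1):    # conservative interior discretisation
        mw = 0.5 * (mu[i] + mu[i - 1]) * 0.5 * (r[i] + r[i - 1])
        me = 0.5 * (mu[i] + mu[i + 1]) * 0.5 * (r[i] + r[i + 1])
        A[i, i - 1] = mw / (r[i] * dr**2)
        A[i, i + 1] = me / (r[i] * dr**2)
        A[i, i] = -(mw + me) / (r[i] * dr**2)
    A[0, 0], A[0, 1], b[0] = -1.0, 1.0, 0.0     # symmetry at the axis
    A[-1, -1], b[-1] = 1.0, 0.0                 # no slip at the wall
    u_new = np.linalg.solve(A, b)
    dudr = np.gradient(u_new, r)
    mu = m_c * np.maximum(np.abs(dudr), 1e-8) ** (n_c - 1)
    if np.max(np.abs(u_new - u)) < 1e-8:        # converged velocity
        break
    u = u_new
```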

Keywords: numerical code, molten polymers, deformable pipes, finite differences

Procedia PDF Downloads 566
25711 Improving Student Programming Skills in Introductory Computer and Data Science Courses Using Generative AI

Authors: Genady Grabarnik, Serge Yaskolko

Abstract:

Generative Artificial Intelligence (AI) has significantly expanded its applicability with the incorporation of Large Language Models (LLMs), and has become a technology that promises to automate some areas that were previously very difficult to automate. This paper describes the introduction of generative AI into introductory computer and data science courses and an analysis of the effects of this introduction. Generative AI is incorporated into the educational process in two ways. For instructors, we create prompt templates for generating tasks and for grading student work, including feedback on submitted assignments (an illustrative template is given below). For students, we introduce basic prompt engineering, which in turn is used to generate test cases from problem descriptions, to generate code snippets for single-block programming tasks, and to partition programs of average complexity into such blocks. The classes are run using Large Language Models, and feedback from instructors and students, together with course outcomes, is collected. The analysis shows a statistically significant positive effect and preference among both stakeholders.
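
An illustrative instructor-side template of the kind described; the paper's actual prompt templates are not reproduced here, and any LLM client could fill and submit it.

```python
# Hypothetical task-generation template; placeholders are filled per class.
TASK_PROMPT = """You are helping teach an introductory {course} course.
Generate {n} programming exercises on {topic} at {level} difficulty.
For each exercise include: a problem statement, starter code in Python,
three test cases, and a grading rubric out of 10 points."""

prompt = TASK_PROMPT.format(course="data science", n=2,
                            topic="pandas groupby", level="beginner")
print(prompt)
```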

Keywords: introductory computer and data science education, generative AI, large language models, application of LLMs to computer and data science education

Procedia PDF Downloads 54
25710 Regeneration Study on the Athens City Center: Transformation of the Historical Triangle to “Low Pollution and Restricted Vehicle Traffic Zone”

Authors: Chondrogianni Dimitra, Yorgos J. Stephanedes

Abstract:

The impact of the economic crisis, coupled with the aging of the city's old core, is reflected in central Athens. Public and private users, residents, employees and visitors desire the quality upgrading of abandoned buildings and public spaces through environmental improvement and sustainable mobility, and the promotion of the international metropolitan character of the city. In this study, a strategy is proposed for reshaping the character and function of the historic Athenian triangle, aiming at its economic, environmental and social sustainable development through feasible, meaningful and non-landscaping solutions of low cost and high positive impact. Sustainable mobility is the main principle in re-planning the study area, and transforming it into a "Low Pollution and Limited Vehicle Traffic Zone" is the main strategy. The proposed measures include the development of pedestrian mobility networks by expanding the pedestrian roads and limited-traffic routes; bicycle networks based on the approved Metropolitan Bicycle Route of Athens; public transportation networks with new lines of electric mini-buses; and new regulations for vehicle mobility in the historic triangle. In addition, complementary actions are proposed regarding the provision of Wi-Fi on fixed-track transport, the development of applications that facilitate combined travel and provide real-time data, the integration of micromobility (roller skates, Segway, hoverboard) as a flexible means of personal mobility, and the development of car-sharing, ride-sharing and dynamic carpooling initiatives.

Keywords: regeneration plans, sustainable mobility, environmental upgrading, athens historical triangle

Procedia PDF Downloads 155