Search results for: Atomic data
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 25151

23741 Post Pandemic Mobility Analysis through Indexing and Sharding in MongoDB: Performance Optimization and Insights

Authors: Karan Vishavjit, Aakash Lakra, Shafaq Khan

Abstract:

The COVID-19 pandemic has pushed healthcare professionals to adopt big data analytics as a vital tool for tracking and evaluating the effects of contagious viruses, and analyzing such huge datasets effectively requires efficient NoSQL databases. By integrating several datasets, this research cuts down query processing time, creates predictive visual artifacts, and makes it possible to analyze post-COVID-19 health and well-being outcomes and to evaluate the effectiveness of government efforts during the pandemic. We recommend applying sharding and indexing to improve query efficiency and scalability as the dataset expands: distributing the datasets across a sharded database and building indexes on the individual shards enables effective data retrieval and analysis. The key goal is to analyze the connections between governmental activities, poverty levels, and post-pandemic well-being, and to evaluate, through advanced data analysis and visualisation, how effectively governmental initiatives improved health and lowered poverty levels. The findings provide relevant evidence that supports the advancement of the UN sustainable development goals, preparation for future pandemics, and evidence-based decision-making. This study shows how big data and NoSQL databases may be used to address global health problems.
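The sharding-plus-indexing idea described above can be sketched in a few lines. The following is a toy, self-contained illustration of the concept (hash-based routing of documents to shards, each shard keeping a local index on a query field), not the authors' MongoDB deployment; the schema and field names are invented.

```python
# Toy sketch of hash-based sharding with per-shard indexing.
# The record fields ("_id", "country", "cases") are illustrative.

NUM_SHARDS = 4

def shard_for(key):
    """Route a record to a shard by hashing its shard key."""
    return hash(key) % NUM_SHARDS

class ShardedStore:
    def __init__(self, num_shards=NUM_SHARDS):
        # each shard holds its documents plus a local index on "country"
        self.shards = [{"docs": [], "index": {}} for _ in range(num_shards)]

    def insert(self, doc):
        shard = self.shards[shard_for(doc["_id"])]
        pos = len(shard["docs"])
        shard["docs"].append(doc)
        # index a query field so lookups scan only matching positions
        shard["index"].setdefault(doc["country"], []).append(pos)

    def find_by_country(self, country):
        # scatter-gather: each shard answers from its own local index
        results = []
        for shard in self.shards:
            for pos in shard["index"].get(country, []):
                results.append(shard["docs"][pos])
        return results

store = ShardedStore()
store.insert({"_id": 1, "country": "IN", "cases": 100})
store.insert({"_id": 2, "country": "US", "cases": 250})
store.insert({"_id": 3, "country": "IN", "cases": 40})
print(len(store.find_by_country("IN")))  # 2 documents found via the indexes
```

In a real MongoDB cluster the same effect is obtained by sharding a collection on a shard key and creating secondary indexes, which each shard maintains locally.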

Keywords: big data, COVID-19, health, indexing, NoSQL, sharding, scalability, well-being

Procedia PDF Downloads 58
23740 Prediction of Anticancer Potential of Curcumin Nanoparticles by Means of Quasi-QSAR Analysis Using Monte Carlo Method

Authors: Ruchika Goyal, Ashwani Kumar, Sandeep Jain

Abstract:

The experimental data for the anticancer potential of curcumin nanoparticles were modeled by means of eclectic data. The optimal descriptors were examined using the Monte Carlo method as implemented in the CORAL SEA software. The statistical quality of the model is as follows: n = 14, R² = 0.6809, Q² = 0.5943, s = 0.175, MAE = 0.114, F = 26 (sub-training set); n = 5, R² = 0.9529, Q² = 0.7982, s = 0.086, MAE = 0.068, F = 61, Av Rm² = 0.7601, ∆R²m = 0.0840, k = 0.9856 and kk = 1.0146 (test set); and n = 5, R² = 0.6075 (validation set). These data can be used to build predictive QSAR models for anticancer activity.
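For readers unfamiliar with the quoted fit statistics, the sketch below shows how R² and MAE are computed from observed versus predicted activities; the five data points are invented for illustration and are not the study's data.

```python
# How two of the quoted fit statistics are defined.
# obs/pred values below are made-up illustrative activities.

def r_squared(obs, pred):
    """Coefficient of determination: 1 - SS_res / SS_tot."""
    mean = sum(obs) / len(obs)
    ss_res = sum((o - p) ** 2 for o, p in zip(obs, pred))
    ss_tot = sum((o - mean) ** 2 for o in obs)
    return 1 - ss_res / ss_tot

def mae(obs, pred):
    """Mean absolute error of the predictions."""
    return sum(abs(o - p) for o, p in zip(obs, pred)) / len(obs)

obs  = [4.1, 4.8, 5.2, 5.9, 6.3]
pred = [4.3, 4.6, 5.4, 5.8, 6.1]
print(round(r_squared(obs, pred), 3), round(mae(obs, pred), 3))
```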

Keywords: anticancer potential, curcumin, model, nanoparticles, optimal descriptors, QSAR

Procedia PDF Downloads 310
23739 Static vs. Stream Mining Trajectory Similarity Measures

Authors: Musaab Riyadh, Norwati Mustapha, Dina Riyadh

Abstract:

Trajectory similarity can be defined as the cost of transforming one trajectory into another under a given similarity method. It is the core of numerous mining tasks such as clustering, classification, and indexing. Various approaches have been suggested to measure similarity, based on the geometric and dynamic properties of trajectories, the overlap between trajectory segments, or the area confined between entire trajectories. In this article, these approaches are evaluated in terms of computational cost, memory usage, accuracy, and the amount of data that must be available in advance, in order to determine their suitability for stream mining applications. The evaluation shows that, because data are generated at high speed, stream mining applications favor similarity methods that have low computational and memory cost, require only a single scan over the data, and are free of mathematical complexity.
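As a concrete example of the kind of lightweight measure the evaluation favors, the sketch below computes the mean point-wise Euclidean distance between two equal-length trajectories in a single pass with constant extra memory; the measure and the sample trajectories are illustrative, not taken from the surveyed methods.

```python
# Single-scan, O(n) time / O(1) extra-memory trajectory similarity:
# mean point-wise Euclidean distance (lower = more similar).
import math

def mean_pointwise_distance(traj_a, traj_b):
    assert len(traj_a) == len(traj_b), "compare equal-length trajectories"
    total = 0.0
    for (x1, y1), (x2, y2) in zip(traj_a, traj_b):  # single pass
        total += math.hypot(x1 - x2, y1 - y2)
    return total / len(traj_a)

a = [(0, 0), (1, 0), (2, 0)]
b = [(0, 1), (1, 1), (2, 1)]
print(mean_pointwise_distance(a, b))  # every point is 1 unit apart -> 1.0
```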

Keywords: global distance measure, local distance measure, semantic trajectory, spatial dimension, stream data mining

Procedia PDF Downloads 388
23738 A Qualitative Study Identifying the Complexities of Early Childhood Professionals' Use and Production of Data

Authors: Sara Bonetti

Abstract:

The use of quantitative data to support policies and justify investments has become imperative in many fields including the field of education. However, the topic of data literacy has only marginally touched the early care and education (ECE) field. In California, within the ECE workforce, there is a group of professionals working in policy and advocacy that use quantitative data regularly and whose educational and professional experiences have been neglected by existing research. This study aimed at analyzing these experiences in accessing, using, and producing quantitative data. This study utilized semi-structured interviews to capture the differences in educational and professional backgrounds, policy contexts, and power relations. The participants were three key professionals from county-level organizations and one working at a State Department to allow for a broader perspective at systems level. The study followed Núñez’s multilevel model of intersectionality. The key in Núñez’s model is the intersection of multiple levels of analysis and influence, from the individual to the system level, and the identification of institutional power dynamics that perpetuate the marginalization of certain groups within society. In a similar manner, this study looked at the dynamic interaction of different influences at individual, organizational, and system levels that might intersect and affect ECE professionals’ experiences with quantitative data. At the individual level, an important element identified was the participants’ educational background, as it was possible to observe a relationship between that and their positionality, both with respect to working with data and also with respect to their power within an organization and at the policy table. 
For example, those with a background in child development were aware of how their formal education had failed to train them in the skills necessary to work in policy and advocacy, and especially to work with quantitative data, compared to those with a background in administration and/or business. At the organizational level, the interviews showed a connection between the participants' position within their organization, their organization's position with respect to others, and their degree of access to quantitative data. This in turn affected their sense of empowerment and agency in dealing with data, such as in shaping what data are collected and made available. These differences were reflected in the interviewees' perceptions of and expectations for the ECE workforce. For example, one interviewee pointed out that many ECE professionals come to use data out of the necessity of the moment; this lack of intentionality both causes and translates into missed training opportunities. Another interviewee pointed to issues related to the professionalism of the ECE workforce by remarking on the inadequacy of ECE students' training in working with data. In conclusion, Núñez's model helped in understanding the different elements that affect ECE professionals' experiences with quantitative data. In particular, it became clear that these professionals are not being provided with the necessary support and that the field is not intentional in building their data literacy skills, despite what is asked of them and their work.

Keywords: data literacy, early childhood professionals, intersectionality, quantitative data

Procedia PDF Downloads 244
23737 Data and Spatial Analysis for Economy and Education of 28 E.U. Member-States for 2014

Authors: Alexiou Dimitra, Fragkaki Maria

Abstract:

The objective of this paper is the study of geographic, economic, and educational variables and their contribution to determining the position of each member state among the EU-28 countries, based on the values of seven variables as given by Eurostat. The data analysis methods of Multiple Factorial Correspondence Analysis (MFCA), Principal Component Analysis, and Factor Analysis have been used. The cross-tabulation tables consist of the values of the seven variables for the 28 countries for 2014. The data are processed using the CHIC Analysis V 1.1 software package, and the results of MFCA and Ascending Hierarchical Classification are given in arithmetic and graphical form. For comparison, the Factor procedure of the IBM SPSS 20 statistical package has been applied to the same data. The numerical and graphical results, presented in tables and graphs, demonstrate the agreement between the two methods. The most important result is the study of the relations among the 28 countries and the position of each country in the groups, or clouds, formed according to the values of the corresponding variables.
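As an illustration of the Principal Component Analysis step, the sketch below extracts the first principal component of two variables via the closed-form eigendecomposition of their 2x2 covariance matrix; the two indicator series are invented and merely stand in for any pair of the seven Eurostat variables.

```python
# Minimal two-variable PCA: the first principal component is the
# eigenvector of the 2x2 covariance matrix with the larger eigenvalue.
import math

def first_principal_component(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    # sample covariance matrix entries
    sxx = sum((x - mx) ** 2 for x in xs) / (n - 1)
    syy = sum((y - my) ** 2 for y in ys) / (n - 1)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (n - 1)
    # larger eigenvalue of [[sxx, sxy], [sxy, syy]]
    tr, det = sxx + syy, sxx * syy - sxy ** 2
    lam = tr / 2 + math.sqrt(tr * tr / 4 - det)
    # corresponding eigenvector = direction of maximal variance
    # (formula assumes the variables are actually correlated, sxy != 0)
    vx, vy = sxy, lam - sxx
    norm = math.hypot(vx, vy)
    return lam, (vx / norm, vy / norm)

indicator_a = [1.0, 2.0, 3.0, 4.0]   # e.g. a scaled economic variable
indicator_b = [1.1, 1.9, 3.2, 3.8]   # e.g. a scaled educational variable
lam, pc1 = first_principal_component(indicator_a, indicator_b)
```

Projecting each country onto such components is what produces the "clouds" of countries mentioned above.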

Keywords: Multiple Factorial Correspondence Analysis, Principal Component Analysis, Factor Analysis, E.U.-28 countries, Statistical package IBM SPSS 20, CHIC Analysis V 1.1 Software, Eurostat.eu Statistics

Procedia PDF Downloads 504
23736 Dynamic Thin Film Morphology near the Contact Line of a Condensing Droplet: Nanoscale Resolution

Authors: Abbasali Abouei Mehrizi, Hao Wang

Abstract:

The thin-film region is important in heat transfer processes because of its low thermal resistance. The dynamic contact angle, in turn, is a crucial boundary condition in numerical simulations: different models make different assumptions about the microscopic contact angle, none of these assumptions is backed by experimental evidence, and the mechanism of contact line movement remains vague. Experimental investigation of complete wetting is more common than of partial wetting, especially at nanoscale resolution, where the thin-film profile varies sharply in the partial-wetting case. In the present study, the water film morphology near the triple-phase contact line is investigated experimentally during condensation. State-of-the-art tapping-mode atomic force microscopy (TM-AFM) was used to obtain high-resolution film profiles down to 2 nm from the contact line. The droplet was placed in a saturated chamber on a pristine silicon wafer, which served as a smooth substrate. The substrate was heated by a PI film heater so that the chamber became oversaturated by droplet evaporation; when the heater was turned off, water vapor gradually condensed on the droplet and the droplet advanced at a speed of less than 20 nm/s. The dominant result is that, in contrast to nonvolatile liquids, the film profile descends straight to the surface down to 2 nm from the substrate, although a slight bending below 20 nm was occasionally observed. It can therefore be claimed that, at low condensation rates, the microscopic contact angle equals the optically detectable macroscopic contact angle. This result can simplify heat transfer modeling in partial wetting, and the experimentally established equality of the microscopic and macroscopic contact angles provides solid evidence for using this boundary condition in numerical simulations.

Keywords: advancing, condensation, microscopic contact angle, partial wetting

Procedia PDF Downloads 288
23735 Design of Low-Cost Water Purification System Using Activated Carbon

Authors: Nayan Kishore Giri, Ramakar Jha

Abstract:

Water is a major element for all life on Earth. India's surface water flows through fourteen major river systems, and Indian rivers are the country's main source of potable water. In the eastern part of India, mining industries discharge many toxic heavy metals into the rivers, causing serious diseases in humans. Potable water quality is therefore a significant and vital concern, related to the present and future health of the population. Awareness of the health risks linked with unsafe water is still very low in many rural and urban areas of India: only about 7% of the Indian population uses water purifiers. This unhealthy water situation exists not only in India but also in many other developing countries, and the major reason behind it is the high cost of water purifiers. The current study is geared towards the development of an economical and efficient technology for removing as many toxic metals and pathogenic bacteria as possible. The work involves the design of a portable purification system and its purifying material. In this design, coconut shell granular activated carbon (GAC) and polypropylene filter cloths were used. The activated carbon was impregnated with iron (Fe), which enhances its adsorption capacity. The iron-impregnated activated carbon (Fe-AC) was thoroughly analyzed by scanning electron microscopy (SEM), X-ray diffraction (XRD), and BET surface area testing. Then, 10 ppm of each toxic metal was infiltrated through the designed purification system, and the effluents were analyzed by atomic absorption spectroscopy (AAS). The results are very promising, and the system is low cost; its affordability could benefit many people in need of potable water, in industrial as well as domestic use.

Keywords: potable water, coconut shell GAC, polypropylene filter cloths, SEM, XRD, BET, AAS

Procedia PDF Downloads 374
23734 Deployment of Electronic Healthcare Records and Development of Big Data Analytics Capabilities in the Healthcare Industry: A Systematic Literature Review

Authors: Tigabu Dagne Akal

Abstract:

Electronic health records (EHRs) help to store, maintain, and appropriately handle patient histories for proper treatment and decision-making. Merging EHRs with big data analytics (BDA) capabilities enables healthcare stakeholders to provide effective and efficient treatments for chronic diseases. Although huge opportunities and efforts exist in the deployment of EHRs and the development of BDA, there are challenges in securing the resources and organizational capabilities required to achieve competitive advantage and sustainability. The resource-based view (RBV), information systems (IS), and non-IS theories should be extended to examine the organizational capabilities and resources required for successful data analytics in the healthcare industry. The main purpose of this study is to develop, from past work, a conceptual framework for the development of healthcare BDA capabilities that future researchers can extend. A research question was formulated to guide the search strategy, studies were then selected, and, based on this selection, the conceptual framework for the development of BDA capabilities in healthcare settings was formulated.

Keywords: EHR, EMR, Big data, Big data analytics, resource-based view

Procedia PDF Downloads 125
23733 Development of a Spatial Data Model for a Renal Registry in the Nigerian Health Sector

Authors: Adekunle Kolawole Ojo, Idowu Peter Adebayo, Egwuche Sylvester O.

Abstract:

Chronic kidney disease (CKD) is a significant cause of morbidity and mortality across developed and developing nations and is associated with increased health risks. There is no existing electronic means of capturing and monitoring CKD in Nigeria. This work is aimed at developing a spatial data model that can be used to implement the renal registries required by public health officers and patients for tracking and monitoring the spatial distribution of renal diseases. In this study, we have developed a spatial data model for a functional renal registry.

Keywords: renal registry, health informatics, chronic kidney disease, interface

Procedia PDF Downloads 193
23732 Development of Hybrid Materials Combining Biomass as Fique Fibers with Metal-Organic Frameworks, and Their Potential as Mercury Adsorbents

Authors: Karen G. Bastidas Gomez, Hugo R. Zea Ramirez, Manuel F. Ribeiro Pereira, Cesar A. Sierra Avila, Juan A. Clavijo Morales

Abstract:

The contamination of water sources with heavy metals such as mercury is a long-standing environmental problem with a high impact on the environment and human health. In countries such as Colombia, mercury contamination due to mining has reached levels much higher than the world average. This work proposes the use of fique fibers as an adsorbent for mercury removal. The material was evaluated under five different conditions: raw, pretreated by organosolv, functionalized by TEMPO oxidation, functionalized fiber plus MOF-199, and functionalized fiber plus MOF-199-SH. All the materials were characterized using FTIR, SEM, EDX, XRD, and TGA. Mercury removal was carried out at room pressure and temperature and at pH = 7 for all material presentations, followed by atomic absorption spectroscopy. The high cellulose content of fique is the main particularity of this lignocellulosic biomass, since the degree of oxidation depends on the number of surface hydroxyl groups capable of being oxidized into carboxylic acids, a functional group that increases ion exchange with mercury in solution. It was also expected that impregnation with the MOF would increase mercury removal; however, the functionalized fique alone achieved the greatest removal, 81.33%, versus 44% for fique with MOF-199 and 72% for fique with MOF-199-SH. The pretreated and raw fibers showed 74% and 56% removal, respectively, which indicates that fique does not require considerable modification of its structure to achieve good performance. Even so, functionalization increases the removal percentage considerably compared to the pretreated fique, which suggests that the functionalization process is a feasible way to improve removal. In addition, this procedure follows a green approach, since the reagents involved have low environmental impact and the contribution to the remediation of natural resources is high.

Keywords: biomass, nanotechnology, science materials, wastewater treatment

Procedia PDF Downloads 109
23731 The Diverse and Flexible Coping Strategies Simulation for Maanshan Nuclear Power Plant

Authors: Chin-Hsien Yeh, Shao-Wen Chen, Wen-Shu Huang, Chun-Fu Huang, Jong-Rong Wang, Jung-Hua Yang, Yuh-Ming Ferng, Chunkuan Shih

Abstract:

In this research, Fukushima-like conditions are simulated with TRACE and RELAP5. The disaster at the Fukushima Daiichi Nuclear Power Plant (NPP) was caused by an earthquake and tsunami, which led to an extended loss of all AC power (ELAP) and, finally, to loss of the ultimate heat sink (LUHS). To handle Fukushima-like conditions, the Taiwan Atomic Energy Council (AEC) required the Taiwan Power Company to propose strategies that ensure nuclear power plant safety. One of the diverse and flexible coping strategies (FLEX) is an alternative water injection strategy, which can execute core injection at 20 kg/cm² without depressurization. In this study, TRACE and RELAP5 were used to simulate the Maanshan nuclear power plant, a three-loop PWR in Taiwan, under Fukushima-like conditions and to verify the success criteria of FLEX. Core cooling capability is reduced by the failure of the emergency core cooling system (ECCS) in an extended loss of all AC power situation; the core water level continues to decline because of seal leakage, and FLEX is then used to restore the core water level and keep the fuel rods covered with water. The results show that this mitigation strategy can cool the reactor pressure vessel (RPV) as quickly as possible under Fukushima-like conditions, keep the core water level above the top of active fuel (TAF), and ensure that the peak cladding temperature (PCT) stays below the 1088.7 K criterion. In conclusion, FLEX provides protection for the nuclear power plant and ensures plant safety.

Keywords: TRACE, RELAP5/MOD3.3, ELAP, FLEX

Procedia PDF Downloads 242
23730 Environmental Evaluation of Two Kinds of Drug Production (Syrup and Pomade Forms) Using Life Cycle Assessment Methodology

Authors: H. Aksas, S. Boughrara, K. Louhab

Abstract:

The goal of this study was to use life cycle assessment (LCA) methodology to assess the environmental impact of pharmaceutical products (four kinds of syrup and three kinds of pomade) produced by a leading manufacturer in Algeria, the SAIDAL Company. The impacts generated were evaluated using SimaPro 7.1 with the CML 92 method for the syrup forms and the EPD 2007 method for the pomade forms. All the evaluated impacts were compared with one another, and the compounds contributing to each impact were identified in each case. The data needed to conduct the Life Cycle Inventory (LCI) came from the factory: theoretical data were collected from the responsible technicians and engineers of the company, while practical data resulted from assays of the pharmaceutical liquids performed in the university laboratories. These data cover the various raw materials, imported from European and Asian countries, that are needed to formulate the drugs. On the input side, the energy used comes from Algerian resources; the outputs result from analyses of the factory's effluents in their different forms (liquid, solid, and gas). Together, these inputs and outputs constitute the eco-balance.

Keywords: pharmaceutical product, drug residues, LCA methodology, environmental impacts

Procedia PDF Downloads 242
23729 Infrared Photodetectors Based on Nanowire Arrays: Towards Far Infrared Region

Authors: Mohammad Karimi, Magnus Heurlin, Lars Samuelson, Magnus Borgstrom, Hakan Pettersson

Abstract:

Semiconductor nanowires are promising candidates for optoelectronic applications such as solar cells, photodetectors, and lasers due to their quasi-1D geometry and large surface-to-volume ratio. The functional wavelength range of NW-based detectors is typically limited to the visible/near-infrared region. In this work, we present the electrical and optical properties of IR photodetectors based on large, square-millimeter ensembles (>1 million) of vertically processed semiconductor heterostructure nanowires (NWs) grown on InP substrates that operate at longer wavelengths. InP NWs comprising single or multiple (20) InAs/InAsP quantum discs (QDiscs) axially embedded in an n-i-n geometry have been grown on InP substrates using metal organic vapor phase epitaxy (MOVPE). The NWs are contacted in the vertical direction by atomic layer deposition (ALD) of 50 nm of SiO2 as an insulating layer, followed by sputtering of indium tin oxide (ITO) and evaporation of Ti and Au as the top contact layer. To extend the sensitivity range to the mid- and long-wavelength regions, the use of intersubband transitions within the conduction band of the InAsP QDiscs is suggested. We present the first experimental indications of intersubband photocurrent in NW geometry and discuss important design parameters for the realization of intersubband detectors. Key advantages of the proposed design include a large degree of freedom in the choice of material compositions, possible enhanced optical resonance effects due to the periodically ordered NW arrays, and compatibility with silicon substrates. We believe that the proposed detector design offers a route towards monolithic integration of compact and sensitive III-V NW long-wavelength detectors with Si technology.

Keywords: intersubband photodetector, infrared, nanowire, quantum disc

Procedia PDF Downloads 371
23728 Characterization of Single-Walled Carbon Nanotube Forest Decorated with Chromium

Authors: Ana Paula Mousinho, Ronaldo D. Mansano, Nelson Ordonez

Abstract:

Carbon nanotubes are among the main elements of nanotechnology; their applications span microelectronics, nanoelectronic devices (photonics, spintronics), chemical sensors, structural materials, and, currently, clean energy devices (supercapacitors and fuel cells). Decorating carbon nanotubes with magnetic particles extends their applications to magnetic devices, magnetic memory, and magnetically guided drug delivery. In this work, single-walled carbon nanotube (CNT) forests decorated with chromium were deposited at room temperature by a high-density plasma chemical vapor deposition (HDPCVD) system. The CNT forests were obtained using pure methane plasmas, with chromium serving both as the precursor (seed) material and as the decorating metal; the chromium was deposited on silicon wafers by magnetron sputtering before the CNTs' growth. The chromium-decorated single-walled CNT forests were characterized by scanning electron microscopy, atomic force microscopy, micro-Raman spectroscopy, and X-ray diffraction. CNT spectra generally show a single emission band, but due to the presence of the chromium, the spectra obtained in this work showed many bands, related to CNTs of different diameters. The CNTs obtained by the HDPCVD system are highly aligned and show metallic features, so they can serve as photonic materials owing to their unique structural and electrical properties. The results of this work prove the possibility of controlled deposition of aligned single-walled CNT forest films decorated with chromium by a high-density plasma chemical vapor deposition system.

Keywords: CNTs forest, high density plasma deposition, high-aligned CNTs, nanomaterials

Procedia PDF Downloads 112
23727 Multi Cloud Storage Systems for Resource Constrained Mobile Devices: Comparison and Analysis

Authors: Rajeev Kumar Bedi, Jaswinder Singh, Sunil Kumar Gupta

Abstract:

Cloud storage is a model of online data storage in which data are stored in a virtualized pool of servers hosted by third parties (cloud service providers, CSPs) and located in different geographical locations. Cloud storage has revolutionized the way users access their data online: anywhere, anytime, and from any device, such as a tablet, mobile phone, or laptop. However, single cloud storage systems suffer from issues such as vendor lock-in, frequent service outages, data loss, and performance problems, and the concept of multi cloud storage was introduced to evade these issues. Many multi cloud storage systems for mobile devices exist in the market. In this article, we compare four of them (Otixo, Unclouded, Cloud Fuze, and Clouds) and evaluate their performance on the basis of CPU usage, battery consumption, time consumption, and data usage, measured on three mobile devices (a Nexus 5, a Moto G, and a Nexus 7 tablet) over a Wi-Fi network. Finally, open research challenges and future scope are discussed.

Keywords: cloud storage, multi cloud storage, vendor lock-in, mobile devices, mobile cloud computing

Procedia PDF Downloads 395
23726 Preparation of Wireless Networks and Security; Challenges in Efficient Accession of Encrypted Data in Healthcare

Authors: M. Zayoud, S. Oueida, S. Ionescu, P. AbiChar

Abstract:

Background: A wireless sensor network encompasses diversified tools of information technology and is widely applied in a range of domains, including military surveillance, weather forecasting, and earthquake forecasting. Although the foundations of wireless sensor networks are continually being strengthened, security issues usually emerge during practical application. Essential technological tools therefore need to be assessed for secure aggregation of data, and such practices have to be incorporated into healthcare in a way that serves the mutual interest of all parties. Objective: Aggregation of encrypted data was assessed through a homomorphic stream cipher to assure its effectiveness and to provide optimum solutions for the field of healthcare. Methods: An experimental design was used that combined a newly developed cipher with CPU-constrained devices. Modular additions were employed to evaluate the nature of the aggregated data, and the operation of the homomorphic stream cipher was examined across different sensors. Results: The homomorphic stream cipher proved to be a simple and secure process that allows efficient aggregation of encrypted data, and its application points the way to improved healthcare practices. Statistical values can be easily computed through aggregation on the basis of the selected cipher; the variance, mean, and standard deviation of the sensed data were also computed with the selected tool. Conclusion: It can be concluded that a homomorphic stream cipher can be an ideal tool for appropriate aggregation of data and can also provide good solutions for the healthcare sector.
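The modular-addition scheme described in the Methods can be sketched as follows. This is a generic illustration of an additively homomorphic stream cipher (in the style of Castelluccia-type aggregation), not the authors' newly developed cipher; the modulus, keys, and readings are invented.

```python
# Additively homomorphic stream cipher: each sensor masks its reading
# with a keystream value mod M; ciphertexts can be summed in transit,
# and the sink removes the summed keystream to recover the total.
M = 2 ** 32  # modulus, chosen larger than any possible aggregate sum

def encrypt(reading, key):
    return (reading + key) % M

def decrypt(cipher, key):
    return (cipher - key) % M

readings = [37, 41, 36]           # illustrative sensor readings
keys     = [912834, 55121, 700]   # per-sensor keystream values
ciphers  = [encrypt(m, k) for m, k in zip(readings, keys)]

# the aggregator adds ciphertexts without learning any single reading
aggregate_cipher = sum(ciphers) % M

# the sink knows all keys, so it can recover the sum (and hence the mean)
total = decrypt(aggregate_cipher, sum(keys) % M)
mean = total / len(readings)
```

Because addition commutes with the encryption, statistics such as the mean (and, with squared readings, the variance) can be computed over encrypted data, which is what makes the scheme attractive for CPU-constrained sensors.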

Keywords: aggregation, cipher, homomorphic stream, encryption

Procedia PDF Downloads 249
23725 The Relationship between Emotional Intelligence and Leadership Performance

Authors: Omar Al Ali

Abstract:

The current study aimed to explore the relationships between emotional intelligence, cognitive ability, and leader performance. Data were collected from 260 senior managers in the UAE. The results showed a significant relationship between emotional intelligence and leadership performance as measured by the annual internal evaluations of each participant (r = .42, p < .01). Regression analysis revealed that both variables, namely emotional intelligence (beta = .31, p < .01) and cognitive ability (beta = .29, p < .01), predicted leadership competencies and together explained 26% of the variance. The data suggest that EI and cognitive ability are significantly correlated with leadership performance. In-depth implications of the present findings for human resource development theory and practice are discussed.

Keywords: emotional intelligence, cognitive ability, leadership, performance

Procedia PDF Downloads 468
23724 Comparison of Irradiance Decomposition and Energy Production Methods in a Solar Photovoltaic System

Authors: Tisciane Perpetuo e Oliveira, Dante Inga Narvaez, Marcelo Gradella Villalva

Abstract:

Installations of solar photovoltaic systems have increased considerably in the last decade, and monitoring of meteorological data (solar irradiance, air temperature, wind velocity, etc.) has proven important for predicting the solar energy production potential of a given geographical area. In this context, the present work compares two computational tools capable of estimating the energy generation of a photovoltaic system through correlation analyses of solar radiation data: the PVsyst software and an algorithm based on the PVlib package implemented in MATLAB. To achieve this objective, it was necessary to obtain solar radiation data (measured and from a solarimetric database), analyze the decomposition of global solar irradiance into its direct normal and diffuse horizontal components, and model the devices of a photovoltaic system (solar modules and inverters) for energy production calculations. The simulated results were compared with experimental data in order to evaluate the performance of the studied methods. Errors in the estimation of energy production were less than 30% for the MATLAB algorithm and less than 20% for the PVsyst software.
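As an illustration of the decomposition step, the sketch below implements one widely used correlation, the Erbs model, which derives the diffuse fraction from the clearness index and splits global horizontal irradiance (GHI) into its diffuse horizontal (DHI) and direct normal (DNI) components. This is a generic example, not necessarily the exact correlation used by PVsyst or the authors' PVlib algorithm, and the input values are invented.

```python
# Erbs decomposition: clearness index -> diffuse fraction -> DHI, DNI.
# Assumes the sun is above the horizon (zenith < 90 degrees).
import math

I0 = 1367.0  # extraterrestrial irradiance (solar constant), W/m^2

def erbs(ghi, zenith_deg):
    cos_z = math.cos(math.radians(zenith_deg))
    kt = ghi / (I0 * cos_z)          # clearness index
    if kt <= 0.22:
        df = 1.0 - 0.09 * kt
    elif kt <= 0.80:
        df = (0.9511 - 0.1604 * kt + 4.388 * kt**2
              - 16.638 * kt**3 + 12.336 * kt**4)
    else:
        df = 0.165
    dhi = df * ghi                   # diffuse horizontal irradiance
    dni = (ghi - dhi) / cos_z        # direct normal irradiance
    return dni, dhi

dni, dhi = erbs(ghi=600.0, zenith_deg=40.0)
```

By construction, DHI plus the horizontal projection of DNI reconstructs the input GHI, which is a useful sanity check on any decomposition routine.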

Keywords: energy production, meteorological data, irradiance decomposition, solar photovoltaic system

Procedia PDF Downloads 133
23723 Chitosan-Aluminum Monostearate Dispersion as Fabricating Liquid for Constructing Controlled Drug Release Matrix

Authors: Kotchamon Yodkhum, Thawatchai Phaechamud

Abstract:

Hydrophobic chitosan-based materials have been developed as controlled drug delivery systems. This study aimed to prepare and evaluate a chitosan-aluminum monostearate composite dispersion (CLA) as a fabricating liquid for constructing a hydrophobic, controlled-release solid drug delivery matrix. The work attempted to blend a hydrophobic substance, aluminum monostearate (AMS), with chitosan in an acidic aqueous medium without the surfactants, grafting reactions, or high mixing temperatures normally required when preparing hydrophobic chitosan systems. A 2% w/v lactic acid solution was employed as the chitosan solvent. The CLA dispersion was prepared by dispersing different amounts of AMS (1-20% w/w) in chitosan solution (4% w/w) with continuous agitation on a magnetic stirrer for 24 h. The effect of the AMS amount on the physicochemical properties of the dispersion, such as viscosity, rheology, and particle size, was evaluated, and the morphology of the chitosan-AMS complex (dispersant) was observed under inverted and atomic force microscopes. The stability of the CLA dispersions was evaluated within 48 h of preparation. CLA dispersions containing less than 5% w/w AMS exhibited Newtonian rheological behavior, while those with higher AMS content were pseudoplastic. The particle size of the dispersant was significantly smaller when the AMS amount was increased up to 5% w/w and did not differ among the systems with higher AMS content. The dispersant appeared irregular in shape under the inverted microscope, and its size exhibited the same trend as the particle size measurements. Observation of dispersion stability revealed that phase separation occurred faster in the systems with higher AMS content, indicating lower stability. Nevertheless, the dispersions remained homogeneous and stable for more than 12 hours after preparation, which is sufficient for the fabrication process. The prepared dispersions could be fabricated into a porous matrix via lyophilization.

Keywords: chitosan, aluminum monostearate, dispersion, controlled-release

Procedia PDF Downloads 380
23722 Social Media Data Analysis for Personality Modelling and Learning Styles Prediction Using Educational Data Mining

Authors: Srushti Patil, Preethi Baligar, Gopalkrishna Joshi, Gururaj N. Bhadri

Abstract:

In designing learning environments, instructional strategies can be tailored to suit the learning style of an individual to ensure effective learning. In this study, information shared on social media such as Facebook is used to predict the learning style of a learner. Previous research has shown that Facebook data can be used to predict user personality: users with a particular personality exhibit an inherent pattern in their digital footprint on Facebook. The proposed work aims to correlate users' personality, predicted from Facebook data, with their learning styles, predicted through questionnaires. For Millennial learners, Facebook has become a primary means of information sharing and interaction with peers; it can thus serve as a rich bed for research and direct the design of learning environments. The authors conducted this study in an undergraduate freshman engineering course. Data from 320 freshman Facebook users was collected, and the same users also participated in the learning style and personality prediction survey. Kolb's Learning Style questionnaire and the Big Five Personality Inventory were adopted for the survey. The users agreed to participate in this research and signed individual consent forms. A specific page was created on Facebook to collect user data such as personal details, status updates, comments, demographic characteristics, and egocentric network parameters. This data was captured by an application written in Python. The data captured from Facebook was subjected to text analysis using the Linguistic Inquiry and Word Count dictionary. Analysis of the questionnaire data reveals each student's personality and learning style. The results obtained from the Facebook, learning style, and personality data were then fed into an automatic classifier trained using data mining techniques such as rule-based classifiers and decision trees.
This helps to predict user personality and learning styles by analysing common patterns. Rule-based classifiers applied for text analysis categorize Facebook data as positive, negative, or neutral. Two models were trained in total: one to predict personality from Facebook data, and another to predict learning styles from the personalities. The results show that the classifier model has high accuracy, making the proposed method a reliable one for predicting user personality and learning styles.
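The two-stage pipeline the abstract describes (Facebook text analysis, then personality prediction, then learning-style prediction) can be sketched in miniature as below. All word lists, thresholds, and trait-to-style rules here are illustrative stand-ins, not the rules learned by the paper's trained classifiers:

```python
# Illustrative two-stage pipeline: text sentiment -> Big Five trait score -> Kolb style.
# Every rule and threshold below is a hypothetical stand-in for the trained models.

POSITIVE = {"great", "happy", "love", "excellent"}
NEGATIVE = {"bad", "sad", "hate", "terrible"}

def sentiment(status):
    """Rule-based text categorization into positive / negative / neutral."""
    words = set(status.lower().split())
    pos, neg = len(words & POSITIVE), len(words & NEGATIVE)
    if pos > neg:
        return "positive"
    if neg > pos:
        return "negative"
    return "neutral"

def predict_personality(statuses):
    """Stage 1: map Facebook text features to Big Five scores (toy rule)."""
    labels = [sentiment(s) for s in statuses]
    pos_ratio = labels.count("positive") / max(len(labels), 1)
    return {"extraversion": pos_ratio, "openness": 0.5}

def predict_learning_style(personality):
    """Stage 2: map personality scores to a Kolb learning style (toy rule)."""
    if personality["extraversion"] > 0.5:
        return "accommodating"  # active experimentation + concrete experience
    return "assimilating"       # reflective observation + abstract conceptualization

statuses = ["Love this great course", "bad day", "happy to share my project"]
style = predict_learning_style(predict_personality(statuses))
```

In the paper itself, stage 1 is learned from LIWC features with rule-based classifiers and stage 2 with decision trees; the hard-coded rules above only show how the two models compose.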

Keywords: educational data mining, Facebook, learning styles, personality traits

Procedia PDF Downloads 221
23721 Talent-to-Vec: Using Network Graphs to Validate Models with Data Sparsity

Authors: Shaan Khosla, Jon Krohn

Abstract:

In a recruiting context, machine learning models are valuable for recommendations: to predict the best candidates for a vacancy, to match the best vacancies for a candidate, and to compile a set of similar candidates for any given candidate. While these models are useful, validating their accuracy in a recommendation context is difficult due to data sparsity. In this report, we use network graph data to generate useful representations for candidates and vacancies. We treat candidates and vacancies as network nodes and designate a bi-directional link between them when the candidate interviews for the vacancy. After applying node2vec, the embeddings are used to construct a validation dataset with a ranked order, which will help validate new recommender systems.
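The walk-sampling idea behind node2vec on such a bipartite interview graph can be sketched as follows; the graph, node names, and the count-based similarity below are illustrative stand-ins for the actual learned embeddings and their cosine similarities:

```python
import random
from collections import defaultdict

# Hypothetical bi-directional candidate <-> vacancy interview graph.
graph = {
    "cand_a": ["vac_1", "vac_2"], "cand_b": ["vac_1"], "cand_c": ["vac_3"],
    "vac_1": ["cand_a", "cand_b"], "vac_2": ["cand_a"], "vac_3": ["cand_c"],
}

def random_walks(graph, num_walks=200, walk_len=5, seed=42):
    """Generate truncated random walks from every node (the node2vec
    sampling step, with uniform rather than biased transitions)."""
    rng = random.Random(seed)
    walks = []
    for _ in range(num_walks):
        for start in graph:
            walk = [start]
            while len(walk) < walk_len:
                walk.append(rng.choice(graph[walk[-1]]))
            walks.append(walk)
    return walks

def cooccurrence_ranking(walks, node):
    """Rank other nodes by how often they co-occur in walks with `node` --
    a count-based stand-in for ranking by embedding similarity."""
    counts = defaultdict(int)
    for walk in walks:
        if node in walk:
            for other in walk:
                if other != node:
                    counts[other] += 1
    return sorted(counts, key=counts.get, reverse=True)

walks = random_walks(graph)
ranking = cooccurrence_ranking(walks, "cand_a")
```

Nodes in a disconnected component (here `cand_c` and `vac_3`) never co-occur with `cand_a` and so never enter its ranked list, which is the property the validation dataset exploits.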

Keywords: AI, machine learning, NLP, recruiting

Procedia PDF Downloads 80
23720 A Web Service-Based Framework for Mining E-Learning Data

Authors: Felermino D. M. A. Ali, S. C. Ng

Abstract:

E-learning is an evolutionary form of distance learning and has improved over time as new technologies emerged. Today, efforts are still being made to embrace E-learning systems with emerging technologies in order to make them better. Among these advancements, Educational Data Mining (EDM) is gaining huge and increasing popularity due to its wide application for improving the teaching-learning process in online practices. However, even though EDM promises many benefits to the educational industry in general and E-learning environments in particular, its principal drawback is the lack of easy-to-use tools: current EDM tools usually require users to have additional technical expertise to perform EDM tasks effectively. In response to these limitations, this study designs and implements an EDM application framework that aims at automating and simplifying the development of EDM in E-learning environments. The framework introduces a Service-Oriented Architecture (SOA) that hides the complexity of technical details and enables users to perform EDM in an automated fashion. The framework was designed based on abstraction, extensibility, and interoperability principles, and its implementation comprises three major modules. The first module provides an abstraction for data gathering, achieved by extending the Moodle LMS (Learning Management System) source code. The second module provides data mining methods and techniques as services, achieved by converting the Weka API into a set of Web services. The third module acts as an intermediary between the first two: it contains a user-friendly interface that allows users to dynamically locate data provider services and run knowledge discovery tasks on the data mining services. An experiment was conducted to evaluate the overhead of the proposed framework through a combination of simulation and implementation.
The experiments show that the overhead introduced by the SOA mechanism is relatively small; it is therefore concluded that a service-oriented architecture can effectively facilitate educational data mining in E-learning environments.

Keywords: educational data mining, e-learning, distributed data mining, moodle, service-oriented architecture, Weka

Procedia PDF Downloads 230
23719 Adsorption of Atmospheric Gases Using Atomic Clusters

Authors: Vidula Shevade, B. J. Nagare, Sajeev Chacko

Abstract:

First-principles simulation, meaning density functional theory (DFT) calculations with plane waves and pseudopotentials, has become a prized technique in condensed matter theory. Nanoparticles (NPs) are known to possess good catalytic activity, especially towards molecules such as CO and O₂. Among metal NPs, aluminium-based NPs are widely known for their catalytic properties. Aluminium is a lightweight chemical element, abundant in the earth's crust, with excellent electrical and thermal properties. Aluminium NPs, when added to solid rocket fuel, help improve the combustion speed and considerably increase combustion heat and combustion stability. Adding aluminium NPs to normal Al/Al₂O₃ powder improves the sintering process of the ceramics, yielding high heat transfer performance, increased density, and enhanced thermal conductivity of the sinter. We used the VASP and Gaussian 03 packages to compute the geometries, electronic structure, and bonding properties of Al₁₂Ni, as well as its interaction with O₂ and CO molecules. Several MD simulations were carried out using VASP at various temperatures, from which hundreds of structures were optimized, leading to 24 unique structures. These structures were then further optimized with the Gaussian package. The lowest-energy structure of Al₁₂Ni had been reported to be a singlet; however, through our extensive search, we found a triplet state to be lower in energy. In our structure, the Ni atom is found on the surface, which gives the non-zero magnetic moment. Incidentally, the O₂ molecule is also a triplet in its ground state, due to which the Al₁₂Ni cluster is likely to facilitate the oxidation of the CO molecule. Our results show that the most favourable site for the CO molecule is the Ni atom, and that for the O₂ molecule it is the Al atom nearest to the Ni atom. The Al₁₂Ni-O₂ and Al₁₂Ni-CO structures were extracted using VMD.
Due to its triplet electronic configuration, the Al₁₂Ni nanocluster is indicated to be a potential candidate as a catalyst for the oxidation of CO molecules.

Keywords: catalyst, gaussian, nanoparticles, oxidation

Procedia PDF Downloads 88
23718 Mathematics Bridging Theory and Applications for a Data-Driven World

Authors: Zahid Ullah, Atlas Khan

Abstract:

In today's data-driven world, the role of mathematics in bridging the gap between theory and applications is becoming increasingly vital. This abstract highlights the significance of mathematics as a powerful tool for analyzing, interpreting, and extracting meaningful insights from vast amounts of data. By integrating mathematical principles with real-world applications, researchers can unlock the full potential of data-driven decision-making processes. This abstract delves into the various ways mathematics acts as a bridge connecting theoretical frameworks to practical applications. It explores the utilization of mathematical models, algorithms, and statistical techniques to uncover hidden patterns, trends, and correlations within complex datasets. Furthermore, it investigates the role of mathematics in enhancing predictive modeling, optimization, and risk assessment methodologies for improved decision-making in diverse fields such as finance, healthcare, engineering, and social sciences. The abstract also emphasizes the need for interdisciplinary collaboration between mathematicians, statisticians, computer scientists, and domain experts to tackle the challenges posed by the data-driven landscape. By fostering synergies between these disciplines, novel approaches can be developed to address complex problems and make data-driven insights accessible and actionable. Moreover, this abstract underscores the importance of robust mathematical foundations for ensuring the reliability and validity of data analysis. Rigorous mathematical frameworks not only provide a solid basis for understanding and interpreting results but also contribute to the development of innovative methodologies and techniques. In summary, this abstract advocates for the pivotal role of mathematics in bridging theory and applications in a data-driven world. 
By harnessing mathematical principles, researchers can unlock the transformative potential of data analysis, paving the way for evidence-based decision-making, optimized processes, and innovative solutions to the challenges of our rapidly evolving society.

Keywords: mathematics, bridging theory and applications, data-driven world, mathematical models

Procedia PDF Downloads 64
23717 AI-Enabled Smart Contracts for Reliable Traceability in the Industry 4.0

Authors: Harris Niavis, Dimitra Politaki

Abstract:

The manufacturing industry collects vast amounts of data for monitoring product quality, thanks to advances in the ICT sector, and dedicated IoT infrastructure is deployed to track and trace the production line. However, industries have not yet managed to unleash the full potential of these data due to defective data collection methods and untrusted data storage and sharing. Blockchain is gaining increasing ground as a key technology enabler for Industry 4.0 and the smart manufacturing domain, as it enables the secure storage and exchange of data between stakeholders. On the other hand, AI techniques are increasingly used to detect anomalies in batch and time-series data, enabling the identification of unusual behaviors. The proposed scheme is based on smart contracts to enable automation and transparency in the data exchange, coupled with anomaly detection algorithms to enable reliable data ingestion into the system. Before sensor measurements are fed to the blockchain component and the smart contracts, the anomaly detection mechanism combines artificial intelligence models to effectively detect unusual values such as outliers and extreme deviations in the incoming data. Specifically, autoregressive integrated moving average (ARIMA), long short-term memory (LSTM) and dense-based autoencoder, as well as generative adversarial network (GAN) models, are used to detect both point and collective anomalies. Towards the goal of preserving the privacy of industries' information, the smart contracts employ techniques that ensure only anonymized pointers to the actual data are stored on the ledger, while sensitive information remains off-chain. In the same spirit, blockchain technology guarantees the security of the data storage through strong cryptography, as well as the integrity of the data through the decentralization of the network and the execution of the smart contracts by the majority of the blockchain network actors.
The blockchain component of the Data Traceability Software is based on the Hyperledger Fabric framework, which lays the ground for the deployment of smart contracts and APIs to expose the functionality to end-users. The results of this work demonstrate that such a system can increase the quality of the end products and the trustworthiness of the monitoring process in the smart manufacturing domain. The proposed AI-enabled data traceability software can be employed by industries to accurately trace and verify quality records through the entire production chain and to take advantage of the multitude of monitoring records in their databases.
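The ingestion gate described above (reject anomalous readings before they reach the ledger) can be sketched with a simple rolling z-score detector. This is a statistical stand-in for the paper's ARIMA/LSTM/GAN models, and the threshold and window are illustrative:

```python
import statistics

def is_point_anomaly(history, value, z_threshold=3.0):
    """Flag a reading whose z-score against recent history exceeds the
    threshold -- a simple stand-in for the AI-based point-anomaly detectors."""
    if len(history) < 2:
        return False
    mean = statistics.fmean(history)
    stdev = statistics.stdev(history)
    if stdev == 0:
        return value != mean
    return abs(value - mean) / stdev > z_threshold

def ingest(readings, window=20):
    """Gate sensor readings: only values that pass the anomaly check are
    accepted (in the real system, their anonymized pointers would then be
    submitted to the smart contract; the raw values stay off-chain)."""
    accepted, history = [], []
    for value in readings:
        if not is_point_anomaly(history[-window:], value):
            accepted.append(value)
        history.append(value)
    return accepted

readings = [10.0, 10.1, 9.9, 10.2, 10.0, 95.0, 10.1]
clean = ingest(readings)
```

Here the spike of 95.0 is rejected while the surrounding in-range readings pass; LSTM autoencoders and GANs additionally catch collective anomalies that a per-point z-score misses.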

Keywords: blockchain, data quality, Industry 4.0, product quality

Procedia PDF Downloads 175
23716 Unstructured-Data Content Search Based on Optimized EEG Signal Processing and Multi-Objective Feature Extraction

Authors: Qais M. Yousef, Yasmeen A. Alshaer

Abstract:

Over the last few years, the amount of data available worldwide has increased rapidly. This led to the emergence of concepts such as big data and the Internet of Things, which furnish solutions for making data available all over the world. However, managing this massive amount of data remains a challenge due to its large variety of types and its distribution. Locating a required file on the first attempt has therefore become a difficult task, owing to the strong similarity of names among different files distributed on the web, and the accuracy and speed of search have been negatively affected. This work presents a method that uses electroencephalography (EEG) signals to locate files based on their contents. Building on the concept of natural mind-wave processing, this work analyses the mind-wave signals of different people, extracting their most appropriate features using a multi-objective metaheuristic algorithm and then classifying them using an artificial neural network to distinguish among files with similar names. The aim of this work is to provide the ability to find files based on their contents using human thoughts only. Implementing this approach and testing it on real people proved its ability to find the desired files accurately within a noticeably shorter time and to retrieve them as the first choice for the user.
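The classification step (assign an extracted EEG feature vector to one of the target files) can be sketched with a nearest-centroid classifier. The feature values, file names, and the centroid rule below are all hypothetical stand-ins for the paper's metaheuristic feature selection and neural network:

```python
import math

def centroid(vectors):
    """Component-wise mean of a list of feature vectors."""
    return [sum(xs) / len(xs) for xs in zip(*vectors)]

def nearest_centroid(train, features):
    """Assign a feature vector to the class with the closest centroid --
    a minimal stand-in for the artificial neural network classifier."""
    centroids = {label: centroid(vecs) for label, vecs in train.items()}
    return min(centroids, key=lambda label: math.dist(centroids[label], features))

# Hypothetical EEG features (e.g. normalized band powers) recorded while
# subjects think about each target file's contents.
train = {
    "report.docx": [[0.9, 0.1], [0.8, 0.2]],
    "budget.xlsx": [[0.1, 0.9], [0.2, 0.8]],
}
match = nearest_centroid(train, [0.85, 0.15])
```

The multi-objective metaheuristic in the paper would first select which feature dimensions to keep; this sketch assumes that selection has already been done.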

Keywords: artificial intelligence, data contents search, human active memory, mind wave, multi-objective optimization

Procedia PDF Downloads 169
23715 IoT Based Approach to Healthcare System for a Quadriplegic Patient Using EEG

Authors: R. Gautam, P. Sastha Kanagasabai, G. N. Rathna

Abstract:

The proposed healthcare system enables quadriplegic patients and people with severe motor disabilities to send commands to electronic devices and monitor their vitals. The growth of the Brain-Computer Interface (BCI) has led to rapid development in 'assistive systems' for the disabled, called 'assistive domotics'. A Brain-Computer Interface is capable of reading the brainwaves of an individual and analysing them to obtain meaningful data. This processed data can be used to assist people with speech disorders, and sometimes people with limited locomotion, to communicate. In this project, the Emotiv EPOC headset is used to obtain the electroencephalogram (EEG). The obtained data is processed to communicate pre-defined commands over the internet to the desired mobile phone user. Other vital information such as heartbeat, blood pressure, ECG, and body temperature is monitored and uploaded to the server. Data analytics enables physicians to scan databases for a specific illness. The data is processed on an Intel Edison system-on-chip (SoC), and patient metrics are displayed via the Intel IoT Analytics cloud service.

Keywords: brain computer interface, Intel Edison, Emotiv EPOC, IoT analytics, electroencephalogram

Procedia PDF Downloads 181
23714 Searchable Encryption in Cloud Storage

Authors: Ren Junn Hwang, Chung-Chien Lu, Jain-Shing Wu

Abstract:

Cloud outsourced storage is one of the important services in cloud computing: cloud users upload data to cloud servers to reduce the cost of managing data and maintaining hardware and software. To ensure data confidentiality, users can encrypt their files before uploading them to a cloud system. However, exactly retrieving the target file from among the encrypted files is difficult for the cloud server. This study proposes a protocol for performing multi-keyword searches over encrypted cloud data by applying k-nearest-neighbor technology. The protocol ranks the relevance scores of encrypted files and keywords, and prevents cloud servers from learning the search keywords submitted by a cloud user. To reduce file transfer communication costs, the cloud server returns encrypted files in order of relevance. Moreover, when a cloud user inputs a misspelled keyword whose number of wrong letters does not exceed a given threshold, the user can still retrieve the target files from the cloud server. In addition, the proposed scheme satisfies the security requirements for outsourced data storage.
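The ranking idea can be sketched in plaintext as below. In the actual scheme, both the file index vectors and the query vector are transformed with secure k-nearest-neighbor encryption so the server can compute the same inner-product scores without learning the keywords; the vocabulary and files here are illustrative:

```python
def tf_vector(text, vocabulary):
    """Term-frequency vector of a text over a fixed keyword vocabulary."""
    words = text.lower().split()
    return [words.count(term) for term in vocabulary]

def relevance(doc_vec, query_vec):
    """Inner-product relevance score. With secure kNN encryption, the
    server computes this on encrypted vectors without seeing the keywords."""
    return sum(d * q for d, q in zip(doc_vec, query_vec))

def ranked_search(files, query_terms, vocabulary):
    """Return file names ordered by relevance, most relevant first, so the
    server can transmit files in that order and cut communication costs."""
    query_vec = tf_vector(" ".join(query_terms), vocabulary)
    scores = {
        name: relevance(tf_vector(text, vocabulary), query_vec)
        for name, text in files.items()
    }
    return sorted(scores, key=scores.get, reverse=True)

vocabulary = ["cloud", "storage", "encryption", "search"]
files = {
    "a.txt": "cloud storage cost model",
    "b.txt": "searchable encryption for cloud search",
}
order = ranked_search(files, ["encryption", "search"], vocabulary)
```

The fuzzy-keyword tolerance in the paper (a bounded number of wrong letters) would sit in front of this step, mapping a misspelled query term onto the nearest vocabulary term before the vector is built.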

Keywords: fault-tolerance search, multi-keywords search, outsource storage, ranked search, searchable encryption

Procedia PDF Downloads 373
23713 A Bivariate Inverse Generalized Exponential Distribution and Its Applications in Dependent Competing Risks Model

Authors: Fatemah A. Alqallaf, Debasis Kundu

Abstract:

The aim of this paper is to introduce a bivariate inverse generalized exponential distribution which has a singular component. The proposed bivariate distribution can be used when the marginals have heavy-tailed distributions and non-monotone hazard functions. Due to the presence of the singular component, it can be used quite effectively when there are ties in the data. Since it has four parameters, it is a very flexible bivariate distribution, suitable for analyzing various bivariate data sets. Several dependency properties and dependency measures have been obtained. The maximum likelihood estimators cannot be obtained in closed form, as this involves solving a four-dimensional optimization problem. To avoid that, we propose an EM algorithm that involves solving only one non-linear equation at each 'E'-step; hence, the implementation of the proposed EM algorithm is straightforward in practice. Extensive simulation experiments have been performed, and we observe that the proposed bivariate inverse generalized exponential distribution can be used for modeling dependent competing risks data. One data set has been analyzed to show the effectiveness of the proposed model.

Keywords: Block and Basu bivariate distributions, competing risks, EM algorithm, Marshall-Olkin bivariate exponential distribution, maximum likelihood estimators

Procedia PDF Downloads 133
23712 Blind Data Hiding Technique Using Interpolation of Subsampled Images

Authors: Singara Singh Kasana, Pankaj Garg

Abstract:

In this paper, a blind data hiding technique based on interpolation of subsampled versions of a cover image is proposed. The subsampled image is taken as a reference image, and an interpolated image is generated from this reference image. The difference between the original cover image and the interpolated image is then used to embed secret data. Comparisons with existing interpolation-based techniques show that the proposed technique provides higher embedding capacity and marked images with better visual quality. Moreover, the performance of the proposed technique is more stable across different images.
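The general mechanism can be sketched on a 1-D signal. This simplified sketch embeds one bit per interpolated sample by adding it to the interpolated value; the paper's actual scheme derives a (typically larger) capacity from the cover-vs-interpolation difference, but the blind-extraction property is the same, since the reference samples are left untouched:

```python
def interpolate(reference):
    """Up-sample the reference (subsampled) signal by neighbor averaging."""
    out = []
    for i in range(len(reference) - 1):
        out.append(reference[i])
        out.append((reference[i] + reference[i + 1]) // 2)  # interpolated sample
    out.append(reference[-1])
    return out

def embed(cover, bits):
    """Write each secret bit into an interpolated sample. Even-index
    (reference) samples stay untouched, so the receiver can re-derive the
    interpolation from the stego signal alone -- extraction is blind."""
    reference = cover[0::2]
    stego = interpolate(reference)
    for k, bit in enumerate(bits):
        stego[2 * k + 1] += bit
    return stego

def extract(stego):
    """Recover the bits as the difference between each interpolated-position
    sample and the re-computed interpolation."""
    reference = stego[0::2]
    interp = interpolate(reference)
    return [stego[i] - interp[i] for i in range(1, len(stego), 2)]

cover = [100, 102, 104, 100, 96, 98, 100]
bits = [1, 0, 1]
stego = embed(cover, bits)
recovered = extract(stego)
```

Because only interpolated positions are perturbed, and by at most one gray level here, the marked signal stays close to the cover, which is what the PSNR comparisons in the paper measure on full 2-D images.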

Keywords: interpolation, image subsampling, PSNR, SIM

Procedia PDF Downloads 570