Search results for: applications of big data
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 29429

25859 Non-Linear Regression Modeling for Composite Distributions

Authors: Mostafa Aminzadeh, Min Deng

Abstract:

Modeling loss data is an important part of actuarial science. Actuaries use models to predict future losses and manage financial risk, which can also be beneficial for marketing purposes. In the insurance industry, small claims happen frequently while large claims are rare. Traditional distributions such as the Normal, Exponential, and Inverse-Gaussian are not suitable for describing insurance data, which often show skewness and fat tails. Several authors have studied classical and Bayesian inference for the parameters of composite distributions, such as the Exponential-Pareto, Weibull-Pareto, and Inverse Gamma-Pareto. These models separate small-to-moderate losses from large losses using a threshold parameter. This research introduces a computational approach using a nonlinear regression model for loss data that relies on multiple predictors. Simulation studies were conducted to assess the accuracy of the proposed estimation method, and they confirmed that the method provides precise estimates of the regression parameters. It is important to note that this approach can be applied to a dataset only if goodness-of-fit tests confirm that the composite distribution under study fits the data well. To demonstrate the computations, a real data set from the insurance industry is analyzed. A Mathematica code uses the Fisher scoring algorithm as an iterative method to obtain the maximum likelihood estimates (MLEs) of the regression parameters.
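
Under normal errors, the Fisher scoring iteration for a nonlinear regression mean function coincides with Gauss-Newton. The composite-distribution likelihood is not given in the abstract, so the following is only a minimal Python sketch of the iteration for a generic mean function; the exponential mean and the simulated data are purely illustrative, not the paper's model.

```python
import numpy as np

def fisher_scoring(x, y, beta0, mean_fn, jac_fn, tol=1e-8, max_iter=100):
    """Gauss-Newton / Fisher-scoring iteration for a nonlinear mean function."""
    beta = np.asarray(beta0, dtype=float)
    for _ in range(max_iter):
        resid = y - mean_fn(x, beta)
        J = jac_fn(x, beta)                       # n x p Jacobian of the mean
        step = np.linalg.solve(J.T @ J, J.T @ resid)
        beta += step
        if np.linalg.norm(step) < tol:
            break
    return beta

# Illustrative mean mu(x) = b0 * exp(b1 * x) (an assumption, not the paper's model)
mean_fn = lambda x, b: b[0] * np.exp(b[1] * x)
jac_fn = lambda x, b: np.column_stack([np.exp(b[1] * x),
                                       b[0] * x * np.exp(b[1] * x)])

rng = np.random.default_rng(0)
x = rng.uniform(0, 2, 200)
y = mean_fn(x, [2.0, 0.7]) + rng.normal(0, 0.1, 200)
print(fisher_scoring(x, y, [1.0, 0.1], mean_fn, jac_fn))  # ~ [2.0, 0.7]
```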

Keywords: maximum likelihood estimation, Fisher scoring method, non-linear regression models, composite distributions

Procedia PDF Downloads 18
25858 Risks beyond Cyber in IoT Infrastructure and Services

Authors: Mattias Bergstrom

Abstract:

Significance of the Study: This research will provide new insights into the risks of digitally embedded infrastructure. Through this research, we analyze each risk and its potential mitigation strategies, especially for AI and autonomous automation. Moreover, the analysis presented in this paper conveys valuable information for future research that can create more stable, secure, and efficient autonomous systems. To learn and understand the risks, a large IoT system was envisioned, and risks related to hardware, tampering, and cyberattacks were collected, researched, and evaluated to create a comprehensive understanding of the potential risks. Potential solutions were then evaluated on an open-source IoT hardware setup. The following list shows the identified passive and active risks evaluated in the research. Passive risks: (1) Hardware failures: critical systems relying on high-rate data and data quality are growing; SCADA systems for infrastructure are good examples of such systems. (2) Hardware delivering erroneous data: sensors break, and when they do, they do not always go silent; they can keep going, except that the data they deliver is garbage, and if that data is not filtered out, it becomes disruptive noise in the system. (3) Bad hardware injection: erroneous sensor data can be pumped into a system by malicious actors with the intent to create disruptive noise in critical systems. (4) Data gravity: the weight of the data collected will affect data mobility. (5) Cost inhibitors: running services that need huge centralized computing is cost-inhibiting; large, complex AI can be extremely expensive to run. Active risks: Denial of service is one of the simplest attacks, where an attacker just overloads the system with bogus requests so that valid requests disappear in the noise. Malware can be anything from simple viruses to complex botnets created with specific goals, where the creator steals computing power and bandwidth from you to attack someone else. Ransomware is a kind of malware, but it is so different in its implementation that it is worth its own mention: the goal of these pieces of software is to encrypt your system so that it can only be unlocked with a key that is held for ransom. DNS spoofing: by spoofing DNS calls, valid requests and data dumps can be sent to bad destinations, where the data can be extracted for extortion, or corrupted and re-injected into a running system, creating a data-echo noise loop. After testing multiple potential solutions, we found that the most prominent solution to these risks was to use a peer-to-peer consensus algorithm over a blockchain to validate the data and behavior of the devices (sensors, storage, and computing) in the system. With the devices autonomously policing themselves for deviant behavior, all the risks listed above can be mitigated. In conclusion, an Internet middleware that provides these features would be an easy and secure solution for any future autonomous IoT deployments, as it provides separation from the open Internet while remaining accessible via blockchain keys.

Keywords: IoT, security, infrastructure, SCADA, blockchain, AI

Procedia PDF Downloads 101
25857 Machine Learning Techniques to Predict Cyberbullying and Improve Social Work Interventions

Authors: Oscar E. Cariceo, Claudia V. Casal

Abstract:

Machine learning offers a set of techniques to promote social work interventions and can support practitioners' decisions by predicting new behaviors based on data produced by organizations, service agencies, users, clients, or individuals. Machine learning techniques comprise a set of generalizable algorithms that are data-driven, which means that rules and solutions are derived by examining data, based on the patterns that are present within any data set. In other words, the goal of machine learning is to teach computers through 'examples', using training data to test specific hypotheses and predict a certain outcome based on a current scenario, and to improve on that experience. Machine learning can be classified into two general categories depending on the nature of the problem to be tackled. First, supervised learning involves a dataset whose outputs are already known. Supervised learning problems are categorized into regression problems, which involve prediction of quantitative variables using a continuous function, and classification problems, which seek to predict outcomes for discrete qualitative variables. For social work research, machine learning generates predictions as a key element for improving social interventions on complex social issues by providing better inference from data and establishing more precise estimated effects, for example in services that seek to improve their outcomes. This paper presents the results of a classification algorithm to predict cyberbullying among adolescents. Data were retrieved from the National Polyvictimization Survey conducted by the government of Chile in 2017. A logistic regression model was created to predict whether an adolescent would experience cyberbullying based on the interaction and behavior of gender, age, grade, type of school, and self-esteem sentiments. The model can predict with an accuracy of 59.8% whether an adolescent will suffer cyberbullying. These results can help to promote programs to prevent cyberbullying at schools and improve evidence-based practice.
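
A minimal sketch of the kind of classification pipeline described: logistic regression on the five predictors named in the abstract. The real survey variables and coding are not given, so the synthetic data below is a stand-in.

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Synthetic stand-in for the survey; real variable names/coding are assumptions.
rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "gender": rng.choice(["m", "f"], n),
    "age": rng.integers(12, 18, n),
    "grade": rng.integers(7, 13, n),
    "school_type": rng.choice(["public", "private"], n),
    "self_esteem": rng.integers(1, 6, n),
    "cyberbullied": rng.integers(0, 2, n),   # 1 = experienced cyberbullying
})

X = pd.get_dummies(df.drop(columns="cyberbullied"), drop_first=True)
y = df["cyberbullied"]
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=42)
model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
print(f"accuracy: {accuracy_score(y_te, model.predict(X_te)):.3f}")  # paper: 0.598
```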

Keywords: cyberbullying, evidence based practice, machine learning, social work research

Procedia PDF Downloads 166
25856 Herbal Medicines Used for the Cure of Jaundice among Some Tribal Populations of Madhya Pradesh, India

Authors: Awdhesh Narayan Sharma

Abstract:

The use of herbal medicines for the cure of various ailments among tribal populations is as old as humanity itself. Most of the tribal populations of Madhya Pradesh inhabit remote and inaccessible ecological settings. Tribal communities and forests have long been interrelated: they use an enormous range of wild plants for their basic needs and medicines. The tribals developed a unique understanding of wild plants and herbs and earned specialized knowledge of disease patterns and curative therapy through hard experience, common sense, and trial-and-error methods. They have passed this knowledge on through traditions, taboos, totems, and folklore, by word of mouth from generation to generation. Here, an attempt has been made to study the possible aspects of herbal medicine for the cure of jaundice among the tribal populations of Madhya Pradesh, India, through primary data as well as available secondary data. The data have been collected from 305 Bharias of Patalkot, Madhya Pradesh, India, together with available secondary data from various investigators. It may be concluded that a sizable wealth of herbal medicinal plants exists in Madhya Pradesh, India, which still awaits scientific exploration. The existing herbal medicines used for the cure of jaundice need extensive investigation from the pharmaceutical point of view.

Keywords: Bharias, herbal medicine, tribal, Madhya Pradesh

Procedia PDF Downloads 170
25855 Characterization of Internet Exchange Points by Using Quantitative Data

Authors: Yamba Dabone, Tounwendyam Frédéric Ouedraogo, Pengwendé Justin Kouraogo, Oumarou Sie

Abstract:

Reliable data transport over the Internet is one of the goals of researchers in the field of computer science. Data such as videos and audio files are becoming increasingly large; as a result, transporting them over the Internet is becoming difficult. Therefore, it has been important to establish a method to locally interconnect autonomous systems (AS) with each other to facilitate traffic exchange. It is in this context that Internet Exchange Points (IXPs) are set up to facilitate local and even regional traffic. They are now the lifeblood of the Internet, so it is important to think about the factors that can characterize them. Beyond qualitative factors, more quantifiable characteristics can help determine the quality of an IXP. In addition, these characteristics may allow ISPs to have a clearer view of the exchange node and may also convince other networks to connect to an IXP. To that end, we define six new IXP characteristics: the attraction rate (τₐₜₜᵣ), the peering rate (τₚₑₑᵣ), the target rate of an IXP (Objₐₜₜ), the number of IXP links (Nₗᵢₙₖ), the resistance rate (τₑ𝒻𝒻), and the attraction failure rate (τ𝒻).
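
The abstract names these rates but does not define them. Purely as a hypothetical illustration of how such ratios might be computed from peering data (these formulas are assumptions, not the paper's definitions):

```python
# Hypothetical definitions for illustration only; the paper's actual
# formulas for these rates are not given in the abstract.
def attraction_rate(members_connected: int, ases_in_region: int) -> float:
    return members_connected / ases_in_region          # tau_attr (assumed)

def peering_rate(peering_links: int, possible_links: int) -> float:
    return peering_links / possible_links              # tau_peer (assumed)

members = 35
print(attraction_rate(members, 120))                   # share of regional ASes attracted
print(peering_rate(90, members * (members - 1) // 2))  # density of the peering mesh
```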

Keywords: characteristic, autonomous system, internet service provider, internet exchange point, rate

Procedia PDF Downloads 89
25854 Statistical Regression and Open Data Approach for Identifying Economic Indicators That Influence e-Commerce

Authors: Apollinaire Barme, Simon Tamayo, Arthur Gaudron

Abstract:

This paper presents a statistical approach to identify explanatory variables linearly related to e-commerce sales. The proposed methodology allows specifying a regression model in order to quantify the relationship between openly available data (economic and demographic) and national e-commerce sales. The methodology consists of collecting data, preselecting input variables, performing regressions to choose variables and models, and testing and validating. The usefulness of the proposed approach is twofold: on the one hand, it allows identifying the variables that influence e-commerce sales with an accessible approach; on the other hand, it can be used to model future sales from the input variables. Results show that e-commerce is linearly dependent on 11 economic and demographic indicators.
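
A minimal sketch of the preselect-then-regress workflow described, assuming a correlation-based preselection step; the indicator names and synthetic data below are stand-ins, since the paper's 11 real indicators are not listed in the abstract.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Synthetic stand-in for the open economic/demographic indicators.
rng = np.random.default_rng(1)
n = 60  # e.g., monthly observations (assumption)
cand = pd.DataFrame(rng.normal(size=(n, 5)),
                    columns=["gdp", "internet_rate", "urban_pop", "cpi", "wages"])
sales = 2.0 * cand["gdp"] - 1.0 * cand["cpi"] + rng.normal(scale=0.2, size=n)

# Preselect inputs by linear correlation with the target, then fit and test.
corr = cand.corrwith(sales).abs()
selected = cand[corr[corr > 0.3].index]
model = sm.OLS(sales, sm.add_constant(selected)).fit()
print(model.summary())   # drop high p-value variables and refit to validate
```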

Keywords: e-commerce, statistical modeling, regression, empirical research

Procedia PDF Downloads 219
25853 A Reasoning Method of Cyber-Attack Attribution Based on Threat Intelligence

Authors: Li Qiang, Yang Ze-Ming, Liu Bao-Xu, Jiang Zheng-Wei

Abstract:

With the increasing complexity of cyberspace security, cyber-attack attribution has become an important challenge for security protection systems. The main difficulties of cyber-attack attribution lie in handling huge volumes of data and in missing key data. To address this situation, this paper presents a reasoning method for cyber-attack attribution based on threat intelligence. The method utilizes the intrusion kill chain model and Bayesian networks to build the attack chain and evidence chain of a cyber-attack on a threat intelligence platform through data calculation, analysis, and reasoning. We then used a number of cyber-attack events that we have observed and analyzed to test the reasoning method and demo system; the test results indicate that the reasoning method can provide concrete help in cyber-attack attribution.
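
A toy illustration of the Bayesian reasoning step: a posterior over candidate actors updated with kill-chain evidence, assuming conditionally independent evidence. All actors, evidence types, and probabilities below are made up for illustration; the paper's actual network structure is not given in the abstract.

```python
# Toy posterior over candidate threat actors given observed kill-chain evidence,
# assuming conditionally independent evidence (hypothetical likelihoods).
priors = {"APT-A": 0.3, "APT-B": 0.5, "crimeware": 0.2}
likelihoods = {   # P(evidence | actor)
    "spearphish_delivery": {"APT-A": 0.8, "APT-B": 0.4, "crimeware": 0.3},
    "custom_implant":      {"APT-A": 0.6, "APT-B": 0.2, "crimeware": 0.05},
    "c2_infra_reuse":      {"APT-A": 0.7, "APT-B": 0.3, "crimeware": 0.1},
}
observed = ["spearphish_delivery", "custom_implant", "c2_infra_reuse"]

posterior = dict(priors)
for ev in observed:
    for actor in posterior:
        posterior[actor] *= likelihoods[ev][actor]
z = sum(posterior.values())
posterior = {a: p / z for a, p in posterior.items()}
print(posterior)   # APT-A dominates under these made-up numbers
```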

Keywords: reasoning, Bayesian networks, cyber-attack attribution, Kill Chain, threat intelligence

Procedia PDF Downloads 444
25852 Material Concepts and Processing Methods for Electrical Insulation

Authors: R. Sekula

Abstract:

Epoxy composites are broadly used as electrical insulation for high-voltage applications, since only such materials can fulfill the particular mechanical, thermal, and dielectric requirements. However, the properties of the final product depend strongly on a proper manufacturing process that minimizes material failures such as excessive shrinkage, voids, and cracks. Therefore, the application of proper materials (epoxy, hardener, and filler) and process parameters (mold temperature, filling time, filling velocity, initial temperature of internal parts, gelation time), as well as design and geometric parameters, is essential for the final quality of the produced components. In this paper, an approach for three-dimensional modeling of all molding stages, namely filling, curing, and post-curing, is presented. The reactive molding simulation tool is based on a commercial CFD package and includes dedicated models describing viscosity and reaction kinetics that have been successfully implemented to simulate the reactive nature of the system with its exothermic effect. A dedicated simulation procedure for stress and shrinkage calculations, together with simulation results, is also presented. The second part of the paper is dedicated to recent developments in formulations of functional composites for electrical insulation applications, focusing on thermally conductive materials. Concepts based on filler modifications for epoxy electrical composites are presented, including the resulting material properties. Finally, with tough environmental regulations in mind, in addition to current process and design aspects, an approach for product re-design is presented, focusing on replacement of the epoxy material with a thermoplastic one. Such a “design-for-recycling” method is one of the new directions associated with the development of new material and processing concepts for electrical products and brings many additional research challenges. A successful product is presented to illustrate the methodology.
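
The abstract does not state which kinetics model is implemented; the Kamal autocatalytic model is a common choice for epoxy cure, so a minimal isothermal sketch of the curing-stage computation might look like the following. All parameter values are illustrative.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Kamal autocatalytic cure model: da/dt = (k1 + k2*a^m) * (1 - a)^n,
# with Arrhenius rate constants. All parameters below are illustrative.
R = 8.314
A1, E1, A2, E2 = 1e5, 7e4, 1e6, 6e4     # pre-exponentials (1/s), activations (J/mol)
m, n = 0.5, 1.5
T = 420.0                                # isothermal mold temperature (K)

k1 = A1 * np.exp(-E1 / (R * T))
k2 = A2 * np.exp(-E2 / (R * T))

def dalpha_dt(t, a):
    return (k1 + k2 * a**m) * (1.0 - a)**n

sol = solve_ivp(dalpha_dt, (0.0, 3600.0), [1e-6], dense_output=True)
t = np.linspace(0, 3600, 7)
print(dict(zip(t, sol.sol(t)[0].round(3))))   # degree of cure over one hour
```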

Keywords: curing, epoxy insulation, numerical simulations, recycling

Procedia PDF Downloads 271
25851 A Pre-Assessment Questionnaire to Identify Healthcare Professionals’ Perceptions of Information Technology Implementation

Authors: Y. Atilgan Şengül

Abstract:

Health information technologies promise higher quality, safer care, and much more for both patients and professionals. Despite their promise, they are costly to develop and difficult to implement. On the other hand, user acceptance and usage determine the success of implemented information technology in healthcare. This study provides a model to understand health professionals' perceptions and expectations of health information technology. An extensive literature review was conducted to determine the main factors to be measured. A questionnaire was designed as the measurement model and submitted to the personnel of an in vitro fertilization clinic. The respondents' degree of agreement according to a five-point Likert scale was 72% for convenient access to data and 69.4% for the importance of data security. There was a significant difference in acceptance of electronic data storage for female respondents. Other significant differences between professions were also observed.

Keywords: healthcare, health informatics, medical record system, questionnaire

Procedia PDF Downloads 168
25850 Validation of Electrical Field Effect on Electrostatic Desalter Modeling with Experimental Laboratory Data

Authors: Fatemeh Yazdanmehr, Iulian Nistor

Abstract:

The scope of the current study is the evaluation of the electric field effect on electrostatic desalter mathematical modeling using laboratory data. This research was focused on developing a model for an existing desalting unit of one of the Iranian heavy oil fields with a production capacity of 75 MBPD. The high temperature of the inlet oil to the dehydration unit reduces oil recovery, so the mathematical modeling of desalter operation parameters is very significant. The existing production unit's operating data have been used to verify the accuracy of the mathematical desalting plant model. The inlet oil temperature to the desalter was decreased from 110 to 80°C, and the desalter electrical field was increased from 0.75 to 2.5 kV/cm. The model results show that these desalter parameter changes meet the water-oil specification and that oil production, and consequently annual income, is increased. In addition, changing the desalter operating conditions reduces the environmental footprint because of flare gas reduction. To verify the accuracy of the selected electrostatic desalter electrical field, laboratory data have been used. Experimental data are used to confirm the effect of the electrical field change on the desalter; therefore, a lab test was performed on a crude oil sample. The results include the dehydration efficiency in the presence of a demulsifier and under electrical field (0.75 kV/cm) conditions at various temperatures. Comparing the lab experiments with the electrostatic desalter mathematical model results shows an acceptable error of 1-3 percent, which confirms the validity of the desalter specification and operating condition changes.

Keywords: desalter, electrical field, demulsification, mathematical modeling, water-oil separation

Procedia PDF Downloads 129
25849 Formation of Round Channel for Microfluidic Applications

Authors: A. Zahra, G. de Cesare, D. Caputo, A. Nascetti

Abstract:

PDMS (polydimethylsiloxane) is a suitable material for biological and MEMS (microelectromechanical systems) designers because of its biocompatibility, transparency, and high resistance under plasma treatment. PDMS round channels have always been of great interest due to their ability to confine liquid with membrane-type microvalves. In this paper, we present a very simple way to form round-shaped microfluidic channels, based on reflow of the positive photoresist AZ® 40 XT. With this method, it is possible to obtain channels of different heights simply by varying the spin-coating parameters of the photoresist.
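
As a rough guide to how spin speed sets resist (and hence channel) height, resist thickness in spin coating falls off approximately with the inverse square root of spin speed. A hedged back-of-the-envelope helper follows; the calibration constant K is an assumption and must come from the AZ 40 XT datasheet or test wafers.

```python
# Empirical spin-coating relation t ≈ K / sqrt(rpm); K is resist-specific and
# must be calibrated from the AZ 40 XT datasheet or test wafers (assumption).
def resist_thickness_um(rpm: float, K: float = 2200.0) -> float:
    return K / rpm ** 0.5

for rpm in (1000, 2000, 4000):
    print(rpm, "rpm ->", round(resist_thickness_um(rpm), 1), "um")
```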

Keywords: lab-on-chip, PDMS, reflow, round microfluidic channel

Procedia PDF Downloads 423
25848 Privacy Preserving Clinical Decision Support System via C5.0 Algorithm

Authors: Swati Kishor Zode, Rahul Ambekar

Abstract:

Data mining is the extraction of interesting patterns or knowledge from large amounts of data, and decisions are made according to the relevant information extracted. Recently, with the explosive growth of the Internet and of data storage and processing techniques, privacy preservation has become one of the major concerns in data mining. Various techniques and methods have been developed for privacy-preserving data mining. In a Clinical Decision Support System, the decision is made on the basis of data retrieved from remote servers via the Internet to diagnose the patient. In this paper, the main idea is to increase the accuracy of a Decision Support System for multiple diseases while protecting patient data during communication between the clinician side (client side) and the server side. A privacy-preserving protocol for a clinical decision support network is proposed so that patient information always remains encrypted during the diagnosis process while the accuracy is maintained. To improve the accuracy of the Decision Support System for various diseases, C5.0 classifiers are used, and to preserve privacy, the Paillier cryptosystem, a homomorphic encryption algorithm, is employed.
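
Paillier's additive homomorphism is the property such protocols rely on: the product of two ciphertexts decrypts to the sum of the plaintexts, so a server can aggregate encrypted values without seeing them. The protocol's specifics are not given in the abstract; below is only a toy sketch of the cryptosystem itself, with deliberately small primes that must never be used in practice.

```python
import random
from math import gcd

# Toy Paillier cryptosystem; 104723/104729 are far too small for real use.
p, q = 104723, 104729
n, n2 = p * q, (p * q) ** 2
lam = (p - 1) * (q - 1) // gcd(p - 1, q - 1)     # lcm(p-1, q-1)
mu = pow(lam, -1, n)                              # modular inverse (Python 3.8+)

def encrypt(m: int) -> int:
    r = random.randrange(2, n)
    while gcd(r, n) != 1:
        r = random.randrange(2, n)
    return (pow(1 + n, m, n2) * pow(r, n, n2)) % n2   # generator g = n + 1

def decrypt(c: int) -> int:
    return ((pow(c, lam, n2) - 1) // n * mu) % n

c1, c2 = encrypt(12), encrypt(30)
assert decrypt((c1 * c2) % n2) == 42   # E(a) * E(b) decrypts to a + b
print("additive homomorphism verified")
```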

Keywords: classification, homomorphic encryption, clinical decision support, privacy

Procedia PDF Downloads 327
25847 A Survey on the Status of Test Automation

Authors: Andrei Contan, Richard Torkar

Abstract:

Aim: The process of test automation and its practices in industry need to be better understood, both by the industry itself and by the research community. Method: We conducted a quantitative industry survey by asking IT professionals to answer questions related to the area of test automation. Results: Test automation needs and practices vary greatly between organizations at different stages of the software development life cycle. Conclusions: Most of the findings are general test automation challenges, specific to small- to medium-sized companies developing software applications in the web, desktop, or mobile domain.

Keywords: survey, testing, test automation, status of test automation

Procedia PDF Downloads 649
25846 Framework to Quantify Customer Experience

Authors: Anant Sharma, Ashwin Rajan

Abstract:

Customer experience is measured today by defining a set of metrics and KPIs, setting up thresholds, and defining triggers across those thresholds. While this is an effective way of measuring against a Key Performance Indicator (referred to as KPI in the rest of the paper), this approach cannot capture the various nuances that make up the overall customer experience. Customers consume a product or service at various levels, which is reflected not only in metrics like Customer Satisfaction or Net Promoter Score but also across other measurements like recurring revenue, frequency of service usage, e-learning, and depth of usage. Here we explore an alternative method of measuring customer experience by flipping the traditional view: rather than rolling customers up to a metric, we roll metrics up to hierarchies and then measure customer experience. This method allows any team to quantify customer experience across multiple touchpoints in a customer's journey. We make use of various data sources which contain information for metrics like CXSAT, NPS, renewals, and depth of service usage collected across a customer lifecycle. These data can be mined systematically to get linkages between different data points like geographies, business groups, products, and time. Additional views can be generated by blending synthetic contexts into the data to show trends and top/bottom reports. We have created a framework that allows us to measure customer experience using the above logic.
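
A minimal sketch of the "roll metrics up to hierarchies" idea using pandas; the hierarchy levels, metric columns, and values below are assumptions for illustration, not the framework's actual schema.

```python
import pandas as pd

# Hypothetical per-customer metric snapshots; column names are assumptions.
df = pd.DataFrame({
    "geo":     ["AMER", "AMER", "EMEA", "EMEA"],
    "product": ["core", "core", "core", "addon"],
    "nps":     [40, 65, 30, 55],
    "cxsat":   [0.72, 0.88, 0.61, 0.80],
    "renewed": [1, 1, 0, 1],
})

# Roll metrics up to each level of the hierarchy instead of rolling
# customers up to a single KPI.
for level in (["geo"], ["geo", "product"]):
    print(df.groupby(level).agg(nps=("nps", "mean"),
                                cxsat=("cxsat", "mean"),
                                renewal_rate=("renewed", "mean")))
```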

Keywords: analytics, customers experience, BI, business operations, KPIs, metrics

Procedia PDF Downloads 68
25845 Analysis of Noodle Production Process at Yan Hu Food Manufacturing: Basis for Production Improvement

Authors: Rhadinia Tayag-Relanes, Felina C. Young

Abstract:

This study was conducted to analyze the noodle production process at Yan Hu Food Manufacturing as a basis for production improvement. The study utilized the PDCA approach and record review in gathering data for the calendar year 2019, from August to October, on the noodle products miki, canton, and misua. Causal-comparative research was used in this study; it attempts to establish cause-effect relationships among the variables. Descriptive statistics and correlation were both used to analyze the data gathered. The study found that miki, canton, and misua production have different cycle times for each production set, different production outputs in every set of the production process, and different amounts of wastage. The company has not yet established its allowable rejection/wastage rate; instead, this paper used a 1% wastage limit. The researcher recommended the following: machines used for each process of the noodle products must be consistently maintained and monitored; all production operators should be assessed by checking their performance statistically based on output and machine performance; a root cause analysis to find solutions must be conducted; and an improved recording system for the input and output of the noodle production process should be established to eliminate poor recording of data.

Keywords: continuous improvement, process, operations, PDCA

Procedia PDF Downloads 59
25844 The Integration of Digital Humanities into the Sociology of Knowledge Approach to Discourse Analysis

Authors: Gertraud Koch, Teresa Stumpf, Alejandra Tijerina García

Abstract:

Discourse analysis research approaches belong to the central research strategies applied throughout the humanities; they focus on the countless forms and ways digital texts and images shape present-day notions of the world. Despite the constantly growing number of relevant digital, multimodal discourse resources, digital humanities (DH) methods are thus far not systematically developed and accessible for discourse analysis approaches. Specifically, the significance of multimodality and meaning plurality modelling are yet to be sufficiently addressed. In order to address this research gap, the D-WISE project aims to develop a prototypical working environment as digital support for the sociology of knowledge approach to discourse analysis and new IT-analysis approaches for the use of context-oriented embedding representations. Playing an essential role throughout our research endeavor is the constant optimization of hermeneutical methodology in the use of (semi)automated processes and their corresponding epistemological reflection. Among the discourse analyses, the sociology of knowledge approach to discourse analysis is characterised by the reconstructive and accompanying research into the formation of knowledge systems in social negotiation processes. The approach analyses how dominant understandings of a phenomenon develop, i.e., the way they are expressed and consolidated by various actors in specific arenas of discourse until a specific understanding of the phenomenon and its socially accepted structure are established. This article presents insights and initial findings from D-WISE, a joint research project running since 2021 between the Institute of Anthropological Studies in Culture and History and the Language Technology Group of the Department of Informatics at the University of Hamburg. As an interdisciplinary team, we develop central innovations with regard to the availability of relevant DH applications by building up a uniform working environment, which supports the procedure of the sociology of knowledge approach to discourse analysis within open corpora and heterogeneous, multimodal data sources for researchers in the humanities. We are hereby expanding the existing range of DH methods by developing contextualized embeddings for improved modelling of the plurality of meaning and the integrated processing of multimodal data. The alignment of this methodological and technical innovation is based on the epistemological working methods according to grounded theory as a hermeneutic methodology. In order to systematically relate, compare, and reflect the approaches of structural-IT and hermeneutic-interpretative analysis, the discourse analysis is carried out both manually and digitally. Using the example of current discourses on digitization in the healthcare sector and the associated issues regarding data protection, we have manually built an initial data corpus of which the relevant actors and discourse positions are analysed in conventional qualitative discourse analysis. At the same time, we are building an extensive digital corpus on the same topic based on the use and further development of entity-centered research tools such as topic crawlers and automated newsreaders. In addition to the text material, this consists of multimodal sources such as images, video sequences, and apps. 
In a blended reading process, the data material is filtered, annotated, and finally coded with the help of NLP tools such as dependency parsing, named entity recognition, co-reference resolution, entity linking, sentiment analysis, and other project-specific tools that are being adapted and developed. The coding process is carried out (semi-)automatically by programs that propose coding paradigms based on the calculated entities and their relationships. Simultaneously, these can be specifically trained by manual coding in a closed reading process and specified according to the content issues. Overall, this approach enables purely qualitative, fully automated, and semi-automated analyses to be compared and reflected upon.
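
A minimal sketch of the kind of NLP pass described, using spaCy for dependency parsing and named-entity recognition; co-reference resolution, entity linking, and sentiment analysis would require additional pipeline components, and the example text is invented.

```python
import spacy

nlp = spacy.load("en_core_web_sm")   # assumes the small English model is installed
doc = nlp("The health ministry debated new data protection rules "
          "for patient records.")

for ent in doc.ents:                 # named entity recognition
    print(ent.text, ent.label_)
for tok in doc:                      # dependency parse
    print(tok.text, tok.dep_, tok.head.text)
# Co-reference resolution, entity linking, and sentiment analysis would be
# added via extra pipeline components (project-specific, per the abstract).
```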

Keywords: entanglement of structural IT and hermeneutic-interpretative analysis, multimodality, plurality of meaning, sociology of knowledge approach to discourse analysis

Procedia PDF Downloads 220
25843 Modelling the Indonesian Government Securities Yield Curve Using Nelson-Siegel-Svensson and Support Vector Regression

Authors: Jamilatuzzahro, Rezzy Eko Caraka

Abstract:

The yield curve is the plot of the yield to maturity of zero-coupon bonds against maturity. In practice, the yield curve is not observed but must be extracted from observed bond prices for a set of (usually) incomplete maturities. Many methodologies and theories exist for analyzing the yield curve. We use three methods (the Nelson-Siegel method, the Svensson method, and the SVR method) in order to construct and compare our zero-coupon yield curves. The objectives of this research were: (i) to study the adequacy of the NSS model and SVR for Indonesian government bond data, and (ii) to choose the best optimization or estimation method for the NSS model and SVR. To achieve these objectives, the research proceeded in the following steps: data preparation, cleaning or filtering of data, modeling, and model evaluation.
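
The Nelson-Siegel-Svensson yield is a fixed parametric function of maturity, so fitting it is a nonlinear least-squares problem. A minimal sketch follows; the maturities and yields are illustrative placeholders, not the Indonesian bond data.

```python
import numpy as np
from scipy.optimize import curve_fit

def nss(tau, b0, b1, b2, b3, l1, l2):
    """Nelson-Siegel-Svensson zero-coupon yield as a function of maturity tau."""
    x1, x2 = tau / l1, tau / l2
    f1 = (1 - np.exp(-x1)) / x1
    return (b0 + b1 * f1 + b2 * (f1 - np.exp(-x1))
            + b3 * ((1 - np.exp(-x2)) / x2 - np.exp(-x2)))

# Illustrative maturities (years) and yields (%); not the paper's data.
tau = np.array([0.5, 1, 2, 3, 5, 7, 10, 15, 20])
y   = np.array([5.9, 6.1, 6.4, 6.6, 6.9, 7.0, 7.2, 7.4, 7.5])

p0 = [7.5, -1.5, 1.0, 1.0, 1.5, 8.0]          # starting values (assumption)
params, _ = curve_fit(nss, tau, y, p0=p0, maxfev=20000)
print(dict(zip(["b0", "b1", "b2", "b3", "l1", "l2"], params.round(3))))
```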

Keywords: support vector regression, Nelson-Siegel-Svensson, yield curve, Indonesian government

Procedia PDF Downloads 238
25842 Normalizing Flow to Augmented Posterior: Conditional Density Estimation with Interpretable Dimension Reduction for High Dimensional Data

Authors: Cheng Zeng, George Michailidis, Hitoshi Iyatomi, Leo L. Duan

Abstract:

The conditional density characterizes the distribution of a response variable y given a predictor x and plays a key role in many statistical tasks, including classification and outlier detection. Although there has been abundant work on the problem of Conditional Density Estimation (CDE) for a low-dimensional response in the presence of a high-dimensional predictor, little work has been done for a high-dimensional response such as images. The promising performance of normalizing flow (NF) neural networks in unconditional density estimation acts as a motivating starting point. In this work, the authors extend NF neural networks when external x is present. Specifically, they use the NF to parameterize a one-to-one transform between a high-dimensional y and a latent z that comprises two components [zₚ, zₙ]. The zₚ component is a low-dimensional subvector obtained from the posterior distribution of an elementary predictive model for x, such as logistic/linear regression. The zₙ component is a high-dimensional independent Gaussian vector, which explains the variations in y not or less related to x. Unlike existing CDE methods, the proposed approach coined Augmented Posterior CDE (AP-CDE) only requires a simple modification of the common normalizing flow framework while significantly improving the interpretation of the latent component since zₚ represents a supervised dimension reduction. In image analytics applications, AP-CDE shows good separation of x-related variations due to factors such as lighting condition and subject id from the other random variations. Further, the experiments show that an unconditional NF neural network based on an unsupervised model of z, such as a Gaussian mixture, fails to generate interpretable results.
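
A schematic PyTorch sketch of the idea: an invertible map sends y to z = [zₚ, zₙ], zₚ is pulled toward a posterior summary mu(x) from an elementary predictive model, and zₙ toward a standard Gaussian. This is a bare-bones reconstruction from the abstract, not the authors' code; a single coupling layer stands in for the full flow, and all dimensions and data are placeholders.

```python
import torch
import torch.nn as nn

class AffineCoupling(nn.Module):
    """One RealNVP-style coupling layer: invertible, with tractable log-det."""
    def __init__(self, dim, hidden=64):
        super().__init__()
        self.k = dim // 2
        self.net = nn.Sequential(nn.Linear(self.k, hidden), nn.ReLU(),
                                 nn.Linear(hidden, 2 * (dim - self.k)))
    def forward(self, y):
        y1, y2 = y[:, :self.k], y[:, self.k:]
        s, t = self.net(y1).chunk(2, dim=1)
        s = torch.tanh(s)
        return torch.cat([y1, y2 * torch.exp(s) + t], dim=1), s.sum(dim=1)

dim, p = 8, 2                  # y dimension and size of the supervised block z_p
flow = AffineCoupling(dim)
y = torch.randn(32, dim)       # a batch of responses (placeholder data)
mu_x = torch.zeros(32, p)      # posterior summary from e.g. logistic regression of x

z, logdet = flow(y)
z_p, z_n = z[:, :p], z[:, p:]  # supervised low-dim part and free Gaussian part
nll = (0.5 * ((z_p - mu_x) ** 2).sum(1)   # z_p matched to the x-posterior
       + 0.5 * (z_n ** 2).sum(1)          # z_n standard normal
       - logdet).mean()
nll.backward()                 # train by minimizing this conditional NLL
```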

Keywords: conditional density estimation, image generation, normalizing flow, supervised dimension reduction

Procedia PDF Downloads 89
25841 Soliton Solutions in (3+1)-Dimensions

Authors: Magdy G. Asaad

Abstract:

Solitons are among the most beneficial solutions for science and technology due to their applicability in physical settings, including plasma physics, energy transport along protein molecules, wave transport along polyacetylene molecules, ocean waves, the construction of optical communication systems, the transmission of information through optical fibers, and Josephson junctions. In this talk, we will apply the bilinear technique to generate a class of soliton solutions to the (3+1)-dimensional nonlinear soliton equation of Jimbo-Miwa type. Examples of the resulting soliton solutions are computed, and a few solutions are plotted.

Keywords: Pfaffian solutions, N-soliton solutions, soliton equations, Jimbo-Miwa

Procedia PDF Downloads 447
25840 Influencers of E-Learning Readiness among Palestinian Secondary School Teachers: An Explorative Study

Authors: Fuad A. A. Trayek, Tunku Badariah Tunku Ahmad, Mohamad Sahari Nordin, Mohammed AM Dwikat

Abstract:

This paper reports the results of an exploratory factor analysis procedure applied to e-learning readiness data obtained from a survey of four hundred and seventy-nine (N = 479) teachers from secondary schools in Nablus, Palestine. The data were drawn from a 23-item Likert questionnaire measuring e-learning readiness based on Chapnick's conception of the construct. Principal axis factoring (PAF) with Promax rotation applied to the data extracted four distinct factors supporting four of Chapnick's e-learning readiness dimensions, namely technological readiness, psychological readiness, infrastructure readiness, and equipment readiness. Together these four dimensions explained 56% of the variance. These findings provide further support for the construct validity of the items and for the existence of these four factors that measure e-learning readiness.
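
A minimal sketch of the PAF + Promax extraction described, assuming the third-party factor_analyzer package; the random item responses below are only a stand-in for the real survey data, which is not available here.

```python
import numpy as np
import pandas as pd
from factor_analyzer import FactorAnalyzer   # pip install factor_analyzer (assumption)

# Random stand-in for the 23 five-point Likert items, just to show the call.
rng = np.random.default_rng(0)
df = pd.DataFrame(rng.integers(1, 6, size=(479, 23)),
                  columns=[f"item{i + 1}" for i in range(23)])

# Principal axis factoring with an oblique Promax rotation, as in the study.
fa = FactorAnalyzer(n_factors=4, method="principal", rotation="promax")
fa.fit(df)
print(pd.DataFrame(fa.loadings_, index=df.columns).round(2))
print(fa.get_factor_variance()[2])   # cumulative variance; the paper reports ~56%
```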

Keywords: e-learning, e-learning readiness, technological readiness, psychological readiness, principal axis factoring

Procedia PDF Downloads 393
25839 A Deep Learning Model with Greedy Layer-Wise Pretraining Approach for Optimal Syngas Production by Dry Reforming of Methane

Authors: Maryam Zarabian, Hector Guzman, Pedro Pereira-Almao, Abraham Fapojuwo

Abstract:

Dry reforming of methane (DRM) has sparked significant industrial and scientific interest, not only as a viable alternative for addressing the environmental concerns of two main contributors to the greenhouse effect, i.e., carbon dioxide (CO₂) and methane (CH₄), but also because it produces syngas, i.e., a mixture of hydrogen (H₂) and carbon monoxide (CO) utilized by a wide range of downstream processes as a feedstock for other chemical production. In this study, we develop an AI-enabled syngas production model to tackle the problem of achieving an equivalent H₂/CO ratio [1:1] with respect to the most efficient conversion. First, the unsupervised density-based spatial clustering of applications with noise (DBSCAN) algorithm removes outlier data points from the original experimental dataset. Then, random forest (RF) and deep neural network (DNN) models employ the error-free dataset to predict the DRM results. DNN models inherently would not be able to obtain accurate predictions without a huge dataset. To cope with this limitation, we employ approaches that reuse pre-trained layers, such as transfer learning and greedy layer-wise pretraining. Compared to the other deep models (i.e., the pure deep model and the transferred deep model), the greedy layer-wise pre-trained deep model provides the most accurate prediction, as well as accuracy similar to the RF model, with R² values of 1.00, 0.999, 0.999, 0.999, 0.999, and 0.999 for the total outlet flow, H₂/CO ratio, H₂ yield, CO yield, CH₄ conversion, and CO₂ conversion outputs, respectively.
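
A minimal Keras sketch of greedy layer-wise pretraining: each dense layer is first trained as an autoencoder on the previous layer's output, then the encoders are stacked with a regression head and fine-tuned. The data shapes, layer widths, and epochs are assumptions, not the paper's architecture.

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

def pretrain_layer(x, units, epochs=30):
    """Train one dense layer as an autoencoder on x; return the encoder."""
    inp = keras.Input(shape=(x.shape[1],))
    h = layers.Dense(units, activation="relu")(inp)
    out = layers.Dense(x.shape[1])(h)
    ae = keras.Model(inp, out)
    ae.compile(optimizer="adam", loss="mse")
    ae.fit(x, x, epochs=epochs, verbose=0)
    return keras.Model(inp, h)

# Placeholder DRM data: inputs (e.g., T, flow rates) and the six outputs named
# in the abstract; shapes and values are assumptions.
X = np.random.rand(500, 6).astype("float32")
Y = np.random.rand(500, 6).astype("float32")

enc1 = pretrain_layer(X, 32)                             # greedy stage 1
enc2 = pretrain_layer(enc1.predict(X, verbose=0), 16)    # greedy stage 2
model = keras.Sequential([enc1, enc2, layers.Dense(Y.shape[1])])
model.compile(optimizer="adam", loss="mse")
model.fit(X, Y, epochs=30, verbose=0)                    # supervised fine-tuning
```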

Keywords: artificial intelligence, dry reforming of methane, artificial neural network, deep learning, machine learning, transfer learning, greedy layer-wise pretraining

Procedia PDF Downloads 82
25838 Novel Design of Quantum Dot Arrays to Enhance Near-Field Excitation Resonances

Authors: Nour Hassan Ismail, Abdelmonem Nassar, Khaled Baz

Abstract:

Semiconductor crystals smaller than about 10 nm, known as quantum dots, have properties that differ from those of large samples, including a band gap that becomes larger for smaller particles. These properties create several applications for quantum dots. In this paper, new shapes of quantum dot arrays are used to enhance the photophysical properties of gold nano-particles. This paper presents a study of the effect of nano-particle shape, array, and size on their absorption characteristics.

Keywords: quantum dots, nano-particles, LSPR

Procedia PDF Downloads 475
25837 Use of Smartwatches for the Emotional Self-Regulation of Individuals with Autism Spectrum Disorder (ASD)

Authors: Juan C. Torrado, Javier Gomez, Guadalupe Montero, German Montoro, M. Dolores Villalba

Abstract:

One of the most challenging aspects of the executive dysfunction of people with Autism Spectrum Disorders is behavior control. This is related to a deficit in their ability to regulate, recognize, and manage their own emotions. Some researchers have developed applications for tablets and smartphones to practice strategies of relaxation and emotion recognition. However, these cannot be applied at the very moment of temper outbursts, anger episodes, or anxiety, since they require the user to carry the device, start the application, and be helped by caretakers. Also, some of these systems are developed either for obsolete technologies (old versions of tablet devices, PDAs, outdated smartphone operating systems) or for specific devices (self-developed or proprietary ones) that create differentiation between the users and the rest of the individuals in their context. For this project, we selected smartwatches. Focusing on emergent technologies ensures a long lifespan for the developed products, because the derived products are intended to be available at the same moment the technology becomes popular, not later. We also focused our research on commercial versions of smartwatches, since this way differentiation is easily avoided and the users' abandonment rate is lowered. We have developed a smartwatch system, along with a smartphone authoring tool, to display self-regulation strategies. These micro-prompting strategies are composed of pictograms, animations, and timers, and they are designed by means of the authoring tool: when both devices synchronize their data, the smartwatch holds the self-regulation strategies, which are triggered when the smartwatch sensors detect a marked rise in heart rate and movement. The system is currently being tested in an educational center for people with ASD in Madrid, Spain.
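
An illustrative sketch of the trigger logic described (fire when heart rate and movement both rise well above a rolling baseline); the thresholds, window size, and structure are assumptions, not the deployed system's code.

```python
from collections import deque

# Illustrative trigger: fire when heart rate and movement both exceed a
# multiple of their rolling baselines. All thresholds are assumptions.
class OutburstTrigger:
    def __init__(self, window=60, hr_jump=1.25, move_jump=1.5):
        self.hr, self.move = deque(maxlen=window), deque(maxlen=window)
        self.hr_jump, self.move_jump = hr_jump, move_jump

    def update(self, heart_rate: float, movement: float) -> bool:
        fire = (len(self.hr) == self.hr.maxlen
                and heart_rate > self.hr_jump * (sum(self.hr) / len(self.hr))
                and movement > self.move_jump * (sum(self.move) / len(self.move)))
        self.hr.append(heart_rate)
        self.move.append(movement)
        return fire   # True -> display the self-regulation strategy

trigger = OutburstTrigger()   # feed it one (heart_rate, movement) sample per tick
```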

Keywords: assistive technologies, emotion regulation, human-computer interaction, smartwatches

Procedia PDF Downloads 289
25836 Teaching Translation during Covid-19 Outbreak: Challenges and Discoveries

Authors: Rafat Alwazna

Abstract:

Translation teaching is a particular activity that includes translators and interpreters training either inside or outside institutionalised settings, such as universities. It can also serve as a means of teaching other fields, such as foreign languages. Translation teaching began in the twentieth century. Teachers of translation hold the responsibilities of educating students, developing their translation competence and training them to be professional translators. The activity of translation teaching involves various tasks, including curriculum design, course delivery, material writing as well as application and implementation. The present paper addresses translation teaching during COVID-19 outbreak, seeking to find out the challenges encountered by translation teachers in online translation teaching and the discoveries/solutions arrived at to resolve them. The paper makes use of a comprehensive questionnaire, containing closed-ended and open-ended questions to elicit both quantitative as well as qualitative data from about sixty translation teachers who have been teaching translation at BA and MA levels during COVID-19 outbreak. The data shows that about 40% of the participants evaluate their online translation teaching experience during COVID-19 outbreak as enjoyable and exhilarating. On the contrary, no participant has evaluated his/her online translation teaching experience as being not good, nor has any participant evaluated his/her online translation teaching experience as being terrible. The data also presents that about 23.33% of the participants evaluate their online translation teaching experience as very good, and the same percentage applies to those who evaluate their online translation teaching experience as good to some extent. Moreover, the data indicates that around 13.33% of the participants evaluate their online translation teaching experience as good. The data also demonstrates that the majority of the participants have encountered obstacles in online translation teaching and have concurrently proposed solutions to resolve them.

Keywords: online translation teaching, electronic learning platform, COVID-19 outbreak, challenges, solutions

Procedia PDF Downloads 217
25835 Load Forecasting Using Neural Network Integrated with Economic Dispatch Problem

Authors: Mariyam Arif, Ye Liu, Israr Ul Haq, Ahsan Ashfaq

Abstract:

The high cost of fossil fuels and the intensifying installation of alternative energy generation sources are among the main challenges in power systems. This makes accurate load forecasting an important and challenging task for optimal energy planning and management at both the distribution and generation sides. There are many techniques to forecast load, but each comes with its own limitations and requires suitable data to predict load accurately. The Artificial Neural Network (ANN) is one such technique for efficiently forecasting the load. A comparison between two different ranges of input datasets was applied to a dynamic ANN technique using the MATLAB Neural Network Toolbox. It has been observed that the selection of input data for training a network has significant effects on the forecasted results. Day-wise input data forecasted the load more accurately than year-wise input data. The forecasted load is then distributed among six generators by using linear programming to find the optimal point of generation. The algorithm is then verified by comparing the results for each generator with their respective generation limits.
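
The dispatch stage described (distributing the forecast load among six generators by linear programming, subject to generation limits) can be sketched with SciPy; all costs and limits below are placeholders, not the paper's data.

```python
from scipy.optimize import linprog

load = 620.0                                  # forecast load (MW), illustrative
cost = [18, 20, 22, 25, 27, 30]               # $/MWh per generator (assumed)
bounds = [(50, 150), (40, 140), (30, 120),    # (min, max) MW limits per generator
          (30, 120), (20, 100), (20, 100)]    # all values are placeholders

res = linprog(c=cost, A_eq=[[1] * 6], b_eq=[load], bounds=bounds, method="highs")
print(res.x)        # optimal dispatch; cheap units run at their upper limits
print(res.fun)      # total generation cost for this hour
```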

Keywords: artificial neural networks, demand-side management, economic dispatch, linear programming, power generation dispatch

Procedia PDF Downloads 184
25834 Graphics Processing Unit-Based Parallel Processing for Inverse Computation of Full-Field Material Properties Based on Quantitative Laser Ultrasound Visualization

Authors: Sheng-Po Tseng, Che-Hua Yang

Abstract:

Motivation and Objective: Ultrasonic guided waves have become an important tool for the nondestructive evaluation of structures and components. Guided waves are used for the purpose of identifying defects or evaluating material properties in a nondestructive way. When guided waves are applied to evaluate material properties, the properties are not known directly; instead, preliminary signals such as time-domain signals or frequency-domain spectra are first obtained. With the measured ultrasound data, inversion calculations can then be employed to obtain the desired mechanical properties. Methods: This research develops a high-speed inversion calculation technique for obtaining full-field mechanical properties from the quantitative laser ultrasound visualization system (QLUVS). The QLUVS employs a mirror-controlled scanning pulsed laser to generate guided acoustic waves traveling in a two-dimensional target. Guided waves are detected with a piezoelectric transducer located at a fixed location. With gyro-scanning of the generation source, the QLUVS has the advantage of fast, full-field, and quantitative inspection. Results and Discussions: This research introduces two important tools to improve computation efficiency. First, graphics processing units (GPUs) with large numbers of cores are introduced. Furthermore, by combining the CPU and GPU cores, a parallel processing scheme is developed for the inversion of full-field mechanical properties based on the QLUVS data. The newly developed inversion scheme is applied to investigate the computation efficiency for single-layered and double-layered plate-like samples. The computation is shown to be 80 times faster than the unparallelized scheme. Conclusions: This research demonstrates a high-speed inversion technique for the characterization of full-field material properties based on a quantitative laser ultrasound visualization system. Significant gains in computation efficiency are shown, although the limit has not yet been reached; further improvement can be achieved by refining the parallel computation. With this full-field mechanical-property inspection technology, full-field mechanical properties can be obtained nondestructively, at high speed and with high precision, in both qualitative and quantitative form. The developed high-speed computation scheme is ready for applications where full-field mechanical properties are needed in a nondestructive and nearly real-time way.
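
An illustrative sketch of GPU-parallel full-field inversion, using CuPy as a drop-in for NumPy: a per-pixel grid search over candidate wave speeds against measured time-of-flight data. The forward model, sizes, and values are assumptions, not the authors' scheme.

```python
import numpy as np
try:
    import cupy as xp          # GPU path; falls back to NumPy if CuPy is absent
except ImportError:
    xp = np

# Per-pixel grid-search inversion: for each scan point, pick the wave speed
# whose predicted time of flight best matches the measurement (illustrative).
d = 0.05                                        # propagation distance (m), assumed
c = xp.linspace(1000.0, 4000.0, 1024)           # candidate speeds (m/s)
tof = xp.asarray(1e-5 + 4e-5 * np.random.rand(128, 128))   # placeholder ToF map (s)

err = xp.abs(tof[..., None] - d / c)            # (128, 128, 1024) misfit cube
speed_map = c[err.argmin(axis=-1)]              # full-field best-fit speed
print(speed_map.shape)                          # same code runs on CPU via NumPy
```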

Keywords: guided waves, material characterization, nondestructive evaluation, parallel processing

Procedia PDF Downloads 195
25833 GNSS-Aided Photogrammetry for Digital Mapping

Authors: Muhammad Usman Akram

Abstract:

This research work is based on GNSS-aided photogrammetry for digital mapping. It focuses on the topographic survey of an area or site which is to be used in future planning and development (P&D) or for further examination, exploration, research, and inspection. Survey and mapping in hard-to-access and hazardous areas are very difficult using traditional techniques and methodologies; they are also time-consuming, labor-intensive, and less precise, with limited data. In comparison, the advanced technique saves manpower and provides more precise output with a wide variety of multiple data sets. In this experimentation, an aerial photogrammetry technique is used in which a UAV flies over an area, captures geocoded images, and produces a three-dimensional model (3-D model). The UAV operates on a user-specified path or area with various parameters: flight altitude, ground sampling distance (GSD), image overlap, camera angle, etc. For ground control, a network of points on the ground is observed as ground control points (GCPs) using a Differential Global Positioning System (DGPS) in PPK or RTK mode. The raw data collected by the UAV and DGPS are then processed in digital image processing programs and computer-aided design software, from which we obtain a dense point cloud, a Digital Elevation Model (DEM), and an orthophoto. The imagery is converted into geospatial data by digitizing over the orthophoto; the DEM is further converted into a Digital Terrain Model (DTM) for contour generation or a digital surface. As a result, we get a digital map of the surveyed area. In conclusion, we compared the processed data with exact measurements taken on site; the error is accepted if it does not breach the survey accuracy limits set by the concerned institutions.
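
The GSD named among the flight parameters follows directly from the camera geometry and flight altitude. A small worked helper, with camera values that are typical of a small UAV camera rather than taken from the surveyed project:

```python
# Ground sampling distance for nadir imagery; camera values are illustrative,
# not from the surveyed project.
def gsd_cm(sensor_width_mm, image_width_px, focal_mm, altitude_m):
    return (sensor_width_mm * altitude_m * 100) / (focal_mm * image_width_px)

# e.g., 13.2 mm sensor, 5472 px wide, 8.8 mm lens, flying at 100 m
print(round(gsd_cm(13.2, 5472, 8.8, 100), 2), "cm/px")   # ~2.74 cm/px
```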

Keywords: photogrammetry, post processing kinematics, real time kinematics, manual data inquiry

Procedia PDF Downloads 15
25832 Problems and Challenges in Social Economic Research after COVID-19: The Case Study of Province Sindh

Authors: Waleed Baloch

Abstract:

This paper investigates the problems and challenges in social-economic research in the case study of the province of Sindh after the COVID-19 pandemic; the pandemic has significantly impacted various aspects of society and the economy, necessitating a thorough examination of the resulting implications. The study also investigates potential strategies and solutions to mitigate these challenges, ensuring the continuation of robust social and economic research in the region. Through an in-depth analysis of data and interviews with key stakeholders, the study reveals several significant findings. Firstly, researchers encountered difficulties in accessing primary data due to disruptions caused by the pandemic, leading to limitations in the scope and accuracy of their studies. Secondly, the study highlights the challenges faced in conducting fieldwork, such as restrictions on travel and face-to-face interactions, which impacted the ability to gather reliable data. Lastly, the research identifies the need for innovative research methodologies and digital tools to adapt to the new research landscape brought about by the pandemic. The study concludes by proposing recommendations to address these challenges, including utilizing remote data collection methods, leveraging digital technologies for data analysis, and establishing collaborations among researchers to overcome resource constraints. By addressing these issues, researchers in the social economic field can effectively navigate the post-COVID-19 research landscape, facilitating a deeper understanding of the socioeconomic impacts and facilitating evidence-based policy interventions.

Keywords: social economic, sociology, developing economies, COVID-19

Procedia PDF Downloads 56
25831 Smart Meter Incorporating UWB Technology

Authors: T. A. Khan, A. B. Khan, M. Babar, T. A. Taj, Imran Ijaz Imran

Abstract:

The smart meter is a key element in the evolving concept of the smart grid and plays an important role in the interaction between the consumer and the supplier. In general, a smart meter is an intelligent digital energy meter that measures the consumption of electrical energy and provides additional services compared to conventional energy meters. One of the important elements that makes a meter smart and different is its communication module. Smart meters usually have two-way, real-time communication between the consumer and the supplier, through which they transfer data and information. In this paper, Ultra Wide Band (UWB) is recommended as the communication platform because of its high data rate, and the physical layer, which could easily be incorporated into existing smart meters, is presented. The physical layer is simulated in MATLAB Simulink, and the results are provided.
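
The paper's physical layer is built in MATLAB Simulink; as a language-agnostic illustration of one common impulse-radio UWB element, the sketch below generates a Gaussian monocycle pulse and inspects its spectrum. The pulse choice and parameters are assumptions, not the paper's design.

```python
import numpy as np

# Gaussian monocycle, a common impulse-radio UWB pulse (illustrative choice).
fs = 20e9                          # sample rate (Hz)
t = np.arange(-2e-9, 2e-9, 1 / fs)
sigma = 0.15e-9                    # pulse width parameter (s), assumed
pulse = (-t / sigma**2) * np.exp(-t**2 / (2 * sigma**2))
pulse /= np.abs(pulse).max()

spec = np.abs(np.fft.rfft(pulse))
f = np.fft.rfftfreq(len(pulse), 1 / fs)
print(f[spec.argmax()] / 1e9, "GHz peak")   # energy spread over several GHz
```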

Keywords: Ultra Wide Band (UWB), Smart Meter, MATLAB, transfer data

Procedia PDF Downloads 511
25830 Qualitative Approaches to Mindfulness Meditation Practices in Higher Education

Authors: Patrizia Barroero, Saliha Yagoubi

Abstract:

Mindfulness meditation practices in the context of higher education are becoming more and more common. Some of the reported benefits of meditation interventions and workshops include improved focus, general well-being, diminished stress, and even increased resilience and grit. A series of workshops free to students, faculty, and staff was offered twice a week over two semesters at Hudson County Community College, New Jersey. The results of an exploratory study based on participants' subjective reactions to these workshops will be presented. A qualitative approach was used to collect and analyze the data, and a hermeneutic phenomenological perspective served as a framework for the research design and data collection and analysis. The data collected include three recorded videos of semi-structured interviews and several written surveys submitted by volunteer participants.

Keywords: mindfulness meditation practices, stress reduction, resilience, grit, higher education success, qualitative research

Procedia PDF Downloads 72