Search results for: time domain analysis
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 40011

38901 Human Digital Twin for Personal Conversation Automation Using Supervised Machine Learning Approaches

Authors: Aya Salama

Abstract:

Digital Twin is an emerging research topic that has attracted researchers in the last decade. It is used in many fields, such as smart manufacturing and smart healthcare, because it saves time and money. It is usually related to other technologies such as Data Mining, Artificial Intelligence, and Machine Learning. However, the Human Digital Twin (HDT), in particular, is still a novel idea that needs to prove its feasibility. HDT expands the idea of the Digital Twin to human beings, who are living beings and different from inanimate physical entities. The goal of this research was to create a Human Digital Twin responsible for automating real-time human replies by simulating human behavior. For this reason, clustering, supervised classification, topic extraction, and sentiment analysis were studied in this paper. The feasibility of the HDT for generating personal replies on social messaging applications was demonstrated in this work. The overall accuracy of the proposed approach was 63%, a promising result that can open the way for researchers to expand the idea of HDT. This was achieved by using Random Forest for clustering the question database and matching new questions. K-nearest neighbor was also applied for sentiment analysis.
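
A minimal sketch, assuming scikit-learn and a small hypothetical message corpus, of the kind of pipeline described above: TF-IDF features, a Random Forest to match an incoming question to a stored reply group, and a k-nearest-neighbor classifier for sentiment. The data, labels, and group names are illustrative placeholders, not the study's dataset.

```python
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.neighbors import KNeighborsClassifier

# Hypothetical question database: each question is labeled with the reply
# group it belongs to (a stand-in for the clustered question base).
questions = ["are you free tonight", "want to grab dinner later",
             "did you finish the report", "when is the project deadline"]
reply_group = ["social", "social", "work", "work"]

# Hypothetical sentiment-labeled snippets for the k-NN sentiment model.
snippets = ["that sounds great", "i am so happy", "this is terrible", "i hate waiting"]
sentiment = ["positive", "positive", "negative", "negative"]

# Random Forest matches a new question to an existing reply group.
q_vec = TfidfVectorizer().fit(questions)
matcher = RandomForestClassifier(n_estimators=100, random_state=0)
matcher.fit(q_vec.transform(questions), reply_group)

# k-NN assigns a sentiment label to an incoming message.
s_vec = TfidfVectorizer().fit(snippets)
sentiment_knn = KNeighborsClassifier(n_neighbors=3)
sentiment_knn.fit(s_vec.transform(snippets), sentiment)

new_question = "are you done with the report"
new_message = "that is great news"
print("matched reply group:", matcher.predict(q_vec.transform([new_question]))[0])
print("sentiment:", sentiment_knn.predict(s_vec.transform([new_message]))[0])
```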

Keywords: human digital twin, sentiment analysis, topic extraction, supervised machine learning, unsupervised machine learning, classification, clustering

Procedia PDF Downloads 76
38900 Hierarchical Piecewise Linear Representation of Time Series Data

Authors: Vineetha Bettaiah, Heggere S. Ranganath

Abstract:

This paper presents a Hierarchical Piecewise Linear Approximation (HPLA) for the representation of time series data in which the time series is treated as a curve in the time-amplitude image space. The curve is partitioned into segments by choosing perceptually important points as break points. Each segment between adjacent break points is recursively partitioned into two segments at the best point or midpoint until the error between the approximating line and the original curve becomes less than a pre-specified threshold. The HPLA representation achieves dimensionality reduction while preserving prominent local features and the general shape of the time series. The representation permits coarse-to-fine processing at different levels of detail, allows flexible definition of similarity based on mathematical measures or general time series shape, and supports time series data mining operations including query by content, clustering, and classification based on whole or subsequence similarity.
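
A compact NumPy sketch of the recursive splitting rule described above: a segment between two break points is approximated by the straight line joining its endpoints and is split at the point of maximum deviation until the error falls below a threshold. The perceptually important points and the hierarchy bookkeeping of the full HPLA scheme are simplified away here.

```python
import numpy as np

def hpla_breakpoints(t, y, threshold, i=0, j=None):
    """Recursively choose break points so that the line between adjacent
    break points stays within `threshold` of the original series."""
    if j is None:
        j = len(y) - 1
    # Straight line joining the two endpoints of the current segment.
    line = np.interp(t[i:j + 1], [t[i], t[j]], [y[i], y[j]])
    err = np.abs(y[i:j + 1] - line)
    if err.max() <= threshold or j - i < 2:
        return [i, j]
    k = i + int(np.argmax(err))              # "best point": largest deviation
    left = hpla_breakpoints(t, y, threshold, i, k)
    right = hpla_breakpoints(t, y, threshold, k, j)
    return left[:-1] + right                 # merge, avoiding a duplicated index

# Example: a noisy sine curve approximated within an absolute error of 0.1.
t = np.linspace(0, 10, 200)
y = np.sin(t) + 0.02 * np.random.randn(200)
bp = hpla_breakpoints(t, y, threshold=0.1)
print(f"{len(bp)} break points out of {len(y)} samples:", bp)
```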

Keywords: data mining, dimensionality reduction, piecewise linear representation, time series representation

Procedia PDF Downloads 265
38899 Heroin Withdrawal, Prison and Multiple Temporalities

Authors: Ian Walmsley

Abstract:

The aim of this paper is to explore the influence of time and temporality on the experience of coming off heroin in prison. The presentation draws on qualitative data collected during a small-scale pilot study of the role of self-care in the process of coming off drugs in prison. Time and temporality emerged as a key theme in the interview transcripts. Drug-dependent prisoners' experience of time in prison has not been recognized in the research literature. Instead, the literature on prison time typically views prisoners as a homogenous group or tends to focus on the influence of aging and gender on prison time. Furthermore, there is a tendency in the literature on prison drug treatment and recovery to conceptualize drug-dependent prisoners as passive recipients of prison healthcare rather than active agents. In addressing these gaps, this paper argues that drug-dependent prisoners experience multiple temporalities involving an interaction between the body-times of the drug-dependent prisoner and the economy of time in prison. One consequence of this interaction is the feeling that, at this point in their prison sentence, they are doing double prison time. The second part of the argument is that time and temporality were a means through which they governed their withdrawing bodies. In addition, this paper will comment on the challenges of prison research in England.

Keywords: heroin withdrawal, time and temporality, prison, body

Procedia PDF Downloads 264
38898 Social Change and Cultural Sustainability in the Wake of Digital Media Revolution in South Asia

Authors: Binod C. Agrawal

Abstract:

In modern history, industrial and media merchandising into South Asia from East Asia, Europe, the United States, and other countries of the West is over 200 years old. Hence, continued external technology and media exposure is not a new experience in multi-lingual and multi-religious South Asia, which has evolved cultural means to withstand structural change. In the post-World War II phase, media exposure, especially to telecommunication, film, the Internet, radio, print media, and television, has increased manifold. South Asia did not lose any time in acquiring and adopting digital media, accelerated by the chip revolution, computers, and satellite communication. The penetration and utilization of digital media are exceptionally high, though the spread is unequal in intensity, use, and effect. The author argues that industrial and media products are "cultural products" apart from being "technological products"; hence their influences are most felt in the cultural domain, which may lead to a blunting of unique cultural specifics in multi-cultural, multi-lingual, and multi-religious South Asia. Social scientists, political leaders, and parents have voiced concerns about "cultural domination", "digital media colonization", and "Westernization". Increased digital media access has also opened the doors to pornography and other harmful information, which has sparked fresh debates and discussions about serious negative, harmful, and undesirable social effects, especially among youth. Within a 'techno-social' perspective, and based on recent research studies, the paper aims to describe and analyse possible socio-economic change due to digital media penetration. Further, the analysis supports the view that, owing to their inner cultural strength, the ancient multi-lingual and multi-religious cultures of South Asia may sustain themselves without a process of irreversible structural change setting in.

Keywords: cultural sustainability, digital media effects, digital media impact in South Asia, social change in South Asia

Procedia PDF Downloads 338
38897 The Impacts of Digital Marketing Activities on Customers' Purchase Intention via Brand Reputation and Awareness: Empirical Study

Authors: Radwan Al Dwairi, Sara Melhem

Abstract:

Today, billions of individuals are linked together in real time using different types of social platforms. Despite the increasing importance of social media marketing activities in enhancing customers' intention to purchase online, the majority of research has concentrated on the impact of such tools on customer satisfaction or retention, neglecting their real role in enhancing brand reputation and awareness, which in turn impact customers' intention to purchase online. In response, this study aims to close this gap through an empirical study using a qualitative approach, collecting data from 216 respondents in this domain. Results of the study reveal the significant impact of word-of-mouth, interactions, and influencers on brand reputation, where the latter positively and significantly impacted customers' intention to purchase via social platforms. In addition, results show the significant impact of brand reputation on enhancing customers' purchase intention.

Keywords: brand awareness, brand reputation, EWOM, influencers, interaction

Procedia PDF Downloads 82
38896 Consensus Problem of High-Order Multi-Agent Systems under Predictor-Based Algorithm

Authors: Cheng-Lin Liu, Fei Liu

Abstract:

For multi-agent systems with agent dynamics described by a high-order integrator, a usual consensus algorithm composed of state coordination control parts is proposed. Under communication delay, the consensus algorithm in asynchronously-coupled form can only make the agents achieve a stationary consensus, and a sufficient consensus condition is obtained based on frequency-domain analysis. To recover the original consensus state of the high-order agents without communication delay, a predictor-based consensus algorithm is additionally constructed by multiplying the delayed neighboring agents' states by a delay-related compensation part, and a sufficient consensus condition is also obtained. Simulations illustrate the correctness of the results.
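
The sketch below is a simplified illustration, not the paper's model: it simulates delay-free double-integrator (second-order) consensus on an undirected ring, the kind of non-stationary consensus (common, non-zero velocity) that the predictor-based algorithm aims to recover. The delayed, asynchronously-coupled case and the delay-related compensation part are not reproduced here; gains, graph, and initial states are assumed illustrative values.

```python
import numpy as np

# Delay-free double-integrator consensus on an undirected ring of 4 agents:
# u_i = sum_j a_ij [ (x_j - x_i) + gamma * (v_j - v_i) ]
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)
L = np.diag(A.sum(axis=1)) - A          # graph Laplacian
gamma, dt, steps = 1.5, 0.005, 20000    # velocity gain, Euler step, horizon (t = 100 s)

x = np.array([1.0, -2.0, 3.0, 0.5])     # initial positions
v = np.array([0.4, -0.1, 0.2, 0.3])     # initial velocities
v0_mean = v.mean()

for _ in range(steps):
    u = -L @ x - gamma * (L @ v)        # consensus control input, x_i'' = u_i
    x, v = x + dt * v, v + dt * u       # explicit Euler integration

print("final positions :", np.round(x, 3))
print("final velocities:", np.round(v, 3), "(average of initial velocities:", v0_mean, ")")
```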

Keywords: high-order dynamic agents, communication delay, consensus, predictor-based algorithm

Procedia PDF Downloads 555
38895 A Hybrid Normalized Gradient Correlation Based Thermal Image Registration for Morphoea

Authors: L. I. Izhar, T. Stathaki, K. Howell

Abstract:

The analysis and interpretation of thermograms have been increasingly employed in the diagnosis and monitoring of diseases thanks to their non-invasive, non-harmful nature and low cost. In this paper, a novel system is proposed to improve the diagnosis and monitoring of the morphoea skin disorder based on integration with the published lines of Blaschko. In the proposed system, image registration based on global and local registration methods is found to be indispensable. For the global registration approach, a modified normalized gradient cross-correlation (NGC) method is presented to reduce large geometrical differences between two multimodal images that are represented by smooth gray edge maps. This method is improved further by incorporating an iterative normalized cross-correlation coefficient (NCC) method. It is found that by replacing the final registration part of the NGC method, where translational differences are solved in the spatial Fourier domain, with the NCC method performed in the spatial domain, the performance and robustness of the NGC method can be greatly improved. It is shown in this paper that the hybrid NGC method not only outperforms the phase correlation (PC) method but also reduces the translational misregistration suffered by the modified NGC method alone for thermograms with an ill-defined jawline. This also demonstrates that by using the gradients of the gray edge maps and a hybrid technique, the performance of the PC-based image registration method can be greatly improved.
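
A short NumPy sketch of the phase correlation (PC) idea that the hybrid method builds on and is compared against: the translation between two images is recovered from the peak of the inverse FFT of the normalized cross-power spectrum. The gradient edge maps, the NGC formulation, and the iterative spatial-domain NCC refinement of the paper are not reproduced here; the synthetic images and shifts are illustrative.

```python
import numpy as np

def phase_correlation_shift(ref, moved, eps=1e-8):
    """Estimate the integer (dy, dx) translation that maps `ref` onto `moved`."""
    F1, F2 = np.fft.fft2(ref), np.fft.fft2(moved)
    cross_power = np.conj(F1) * F2
    cross_power /= (np.abs(cross_power) + eps)     # keep only the phase
    corr = np.fft.ifft2(cross_power).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # wrap shifts larger than half the image size to negative values
    dy = dy - ref.shape[0] if dy > ref.shape[0] // 2 else dy
    dx = dx - ref.shape[1] if dx > ref.shape[1] // 2 else dx
    return dy, dx

# Synthetic test: shift a random "edge map" by (7, -12) pixels and recover it.
rng = np.random.default_rng(0)
ref = rng.random((128, 128))
moved = np.roll(ref, shift=(7, -12), axis=(0, 1))
print(phase_correlation_shift(ref, moved))   # expected: (7, -12)
```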

Keywords: Blaschko’s lines, image registration, morphoea, thermal imaging

Procedia PDF Downloads 296
38894 Music in Religion Culture of the Georgian Pentecostals

Authors: Nino Naneishvili

Abstract:

The study of religious minorities and their musical culture has attracted scant academic attention in Georgia. Within wider Georgian society, the focus of discourse to date has been on the traditional Orthodox religion and its musical expression, with other forms of religious expression regarded as intrinsically less valuable. The goal of this article is to study Georgia's diverse religious and musical picture, presented here through the example of the Pentecostals. The first signs of the Pentecostal movement originated at the end of the 19th century in the USA, and the movement appeared in Georgia as early as 1914. An ethnomusicological perspective allows the use of anthropological and sociological approaches. The basic methodology is an ethnographic method, involving attendance at religious services, observation, in-depth interviews, and analysis of musical material. This analysis, based on a combined use of various theoretical and methodological approaches, reveals that Georgian Pentecostals, apart from polyphonic singing, are characterised by "bi-musicality." This phenomenon, together with Georgian three-part polyphony, combines vocalisation within "social polyphony." The concepts of back stage and front stage are highlighted. Chanters also try to express national identity. In some cases, however, it has been observed that they abandon or conceal certain musical forms of expression that are considered central to Georgian identity. The famous hymn "Thou art a Vineyard" is a case in point. The reason given for this omission within the Georgian Pentecostal church is that, within Pentecostal doctrine, God alone is the object of worship; therefore, there is no veneration of Saints as representatives of the Divine. In some cases informants denied the existence of this hymn, while others explained that the meaning conveyed by the Vineyard is that of Jesus Christ and not the Virgin Mary. Others stated that they loved the Virgin Mary and were therefore free to sing this song outside church circles. The results of this study illustrate that one of the religious minorities in Georgia, the Pentecostals, is characterised by a deviation in musical thinking from Homo Polyphonicus. They actively change their form of musical worship towards secondary ethno-hearing, or bi-musicality. This outcome is determined by both new religious thinking and the process of globalization. A significant principle behind this form of worship is the use of forms that are acceptable and accessible to all, which naturally leads to the development of modern forms. The material obtained does not demonstrate a connection with traditional religious music in general; rather, it constitutes an independent domain.

Keywords: Georgia, globalization, music, pentecostal

Procedia PDF Downloads 313
38893 Medical Knowledge Management since the Integration of Heterogeneous Data until the Knowledge Exploitation in a Decision-Making System

Authors: Nadjat Zerf Boudjettou, Fahima Nader, Rachid Chalal

Abstract:

Knowledge management aims to acquire and represent knowledge relevant to a domain, a task, or a specific organization in order to facilitate access, reuse, and evolution. This usually means building, maintaining, and evolving an explicit representation of knowledge. The next step is to provide access to that knowledge, that is to say, to disseminate it in order to enable effective use. Knowledge management in the medical field aims to improve the performance of the medical organization by allowing individuals in the care facility (doctors, nurses, paramedics, etc.) to capture, share, and apply collective knowledge in order to make optimal decisions in real time. In this paper, we propose a knowledge management approach based on a technique for integrating heterogeneous medical data by creating a data warehouse, a technique for extracting knowledge from the medical data by choosing a data mining technique, and finally a technique for exploiting that knowledge in a case-based reasoning system.
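
A small sketch, assuming scikit-learn and purely illustrative case descriptors, of the last stage described above: a case-based reasoning step that retrieves the most similar past cases from the integrated knowledge, here reduced to a nearest-neighbour search over numeric features. The features, cases, and outcomes are placeholders, not the paper's data.

```python
import numpy as np
from sklearn.neighbors import NearestNeighbors
from sklearn.preprocessing import StandardScaler

# Illustrative case base: [age, systolic BP, glucose], with an outcome per case.
cases = np.array([[54, 150, 110],
                  [61, 160, 180],
                  [35, 120,  90],
                  [47, 135, 140]], dtype=float)
outcomes = ["treatment A", "treatment B", "no treatment", "treatment A"]

scaler = StandardScaler().fit(cases)
retriever = NearestNeighbors(n_neighbors=2).fit(scaler.transform(cases))

# New patient case: retrieve the two most similar past cases and their outcomes.
new_case = scaler.transform([[58, 155, 150]])
dist, idx = retriever.kneighbors(new_case)
for d, i in zip(dist[0], idx[0]):
    print(f"similar case {i} (distance {d:.2f}) -> {outcomes[i]}")
```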

Keywords: data warehouse, data mining, knowledge discovery in database, KDD, medical knowledge management, Bayesian networks

Procedia PDF Downloads 378
38892 Effect of Preloading on Long-Term Settlement of Closed Landfills: A Numerical Analysis

Authors: Mehrnaz Alibeikloo, Hajar Share Isfahani, Hadi Khabbaz

Abstract:

In recent years, with developing cities and increasing populations, reconstruction on closed landfill sites has become unavoidable in some regions. Long-term settlement is one of the major concerns associated with reconstruction on landfills after closure. The purpose of this research is to evaluate the effect of preloading, in various patterns of height and time, on the long-term settlement of closed landfills. In this regard, five scenarios of surcharge from 1 to 3 m high with 3, 4.5, and 6 months of preloading time have been modeled using the PLAXIS 2D software. Moreover, the numerical results have been compared to those obtained from analytical methods, and good agreement has been achieved. The findings indicate that there is a linear relationship between settlement and surcharge height. Although long-term settlement decreased by applying longer and higher preloading, the preloading time was found to be a more effective factor than the preloading height.

Keywords: preloading, long-term settlement, landfill, PLAXIS 2D

Procedia PDF Downloads 177
38891 Non-Newtonian Fluid Flow Simulation for a Vertical Plate and a Square Cylinder Pair

Authors: Anamika Paul, Sudipto Sarkar

Abstract:

The flow behaviour of non-Newtonian fluids is quite complicated, although both the pseudoplastic (n < 1, n being the power index) and dilatant (n > 1) fluids in this category are used immensely in the chemical and process industries. Limited research has been carried out on flow over a bluff body in a non-Newtonian flow environment. In the present numerical simulation, we control the vortices of a square cylinder by placing an upstream vertical splitter plate for pseudoplastic (n = 0.8), Newtonian (n = 1), and dilatant (n = 1.2) fluids. The position of the upstream plate is also varied to calculate the critical distance between the plate and cylinder, below which the cylinder vortex shedding is suppressed. Here the Reynolds number is taken as Re = 150 (Re = U∞a/ν, where U∞ is the free-stream velocity of the flow, a is the side of the cylinder, and ν is the maximum value of the kinematic viscosity of the fluid), which falls in the laminar periodic vortex shedding regime. The vertical plate has dimensions of 0.5a × 0.05a and is placed on the cylinder centre-line. Gambit 2.2.30 is used to construct the flow domain and to impose the boundary conditions. In detail, we imposed a velocity inlet (u = U∞), a pressure outlet (Neumann condition), and symmetry (free-slip boundary condition) at the upper and lower domain boundaries. A wall boundary condition (u = v = 0) is applied on both the cylinder and the splitter plate surfaces. The unsteady 2-D Navier-Stokes equations in fully conservative form are then discretized with second-order spatial and first-order temporal accuracy. The discretized equations are solved by Ansys Fluent 14.5 implementing the SIMPLE algorithm within a finite volume method. Fine meshing is used surrounding the plate and cylinder, and away from the cylinder the grid is slowly stretched out in all directions. To ensure mesh quality, a total of 297 × 208 grid points are used in the streamwise and flow-normal directions, respectively, for G/a = 3 (G being the gap between the plate and cylinder) after a grid-independence study. The computed mean flow quantities obtained for Newtonian flow agree well with the available literature. The results are depicted with the help of instantaneous and time-averaged flow fields. Noteworthy qualitative and quantitative differences are obtained in the flow field with changes in the rheology of the fluid. Also, aerodynamic forces and vortex shedding frequencies differ with the gap-ratio and the power index of the fluid. We can conclude from the present simulation that Fluent is capable of capturing the vortex dynamics of the unsteady laminar flow regime even in a non-Newtonian flow environment.
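
A small NumPy sketch of the power-law rheology referred to above and of the Reynolds number definition Re = U∞a/ν used in the abstract: the apparent viscosity μ = K·γ̇^(n−1) is evaluated for the three power indices, the maximum kinematic viscosity is taken as the ν in the definition, and the free-stream velocity needed to keep Re = 150 is back-calculated. The consistency index, density, shear-rate range, and cylinder size are assumed illustrative values, not the paper's setup.

```python
import numpy as np

K, rho, a, Re = 0.01, 1000.0, 0.02, 150.0      # consistency index, density, cylinder side, target Re
shear_rates = np.logspace(-1, 3, 200)           # representative shear-rate range (1/s)

for n in (0.8, 1.0, 1.2):                       # pseudoplastic, Newtonian, dilatant
    mu_app = K * shear_rates ** (n - 1.0)       # apparent viscosity of a power-law fluid
    nu_max = mu_app.max() / rho                 # maximum kinematic viscosity (the abstract's nu)
    U_inf = Re * nu_max / a                     # free-stream velocity giving Re = U*a/nu
    print(f"n = {n}: max kinematic viscosity = {nu_max:.3e} m^2/s, "
          f"U_inf for Re = 150 is {U_inf:.3f} m/s")
```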

Keywords: CFD, critical gap-ratio, splitter plate, wake-wake interactions, dilatant, pseudoplastic

Procedia PDF Downloads 105
38890 Education Quality Development for Excellence Performance with Higher Education by Using COBIT 5

Authors: Kemkanit Sanyanunthana

Abstract:

The purpose of this research is to study the information technology management systems that support education at five private universities in Thailand, based on case studies of institutions that have been developing their quality and standards of management and education through the provision of information technology services to support performance excellence. The concept of connecting information technology with a suitable system has been created by information technology administrators so that a system can be used throughout the organization to help obtain the utmost benefit from all resources. Hence, the researcher, as a person who has been performing these duties within higher education, was interested in conducting this research by selecting the Control Objectives for Information and Related Technology 5 (COBIT 5) framework together with the Malcolm Baldrige National Quality Award (MBNQA) of the United States, the national award which applies the concept of Total Quality Management (TQM) to organizational evaluation. Such evaluation, called the Education Criteria for Performance Excellence (EdPEx), focuses on studying and comparing education quality development for excellent performance using COBIT 5 in terms of information technology, in order to study the problems and obstacles of the investigation process for an information technology system, which is considered an instrument to drive organizations to reach performance excellence in information technology, and to serve as a model for evaluating and analysing whether the process is in accordance with the universities' strategic plans for information technology. This research is conducted as descriptive and survey research based on the case studies. Data collection was carried out using questionnaires completed by administrators working in the information technology field, with research documents related to change management as the main supporting material. The research concludes that performance based on the APO (Align, Plan and Organise) domain of the COBIT 5 framework, which emphasizes concordant governance and management of strategic plans for the organizations, reached only 95%. This might be because of some restrictions such as organizational culture; therefore, the researcher has studied and analysed the management of information technology in the universities as a whole, under their organizational structures, to reach performance in accordance with the overall APO domain, which would affect the determined strategic plans so that they can be developed on the basis of information technology performance excellence, and to apply a risk management system at the organizational level to every performance process, which would improve the effectiveness of information technology resource management and yield the utmost benefits.

Keywords: COBIT5, APO, EdPEx Criteria, MBNQA

Procedia PDF Downloads 313
38889 Quick Covering Machine for Grain Drying Pavement

Authors: Fatima S. Rodriguez, Victorino T. Taylan, Manolito C. Bulaong, Helen F. Gavino, Vitaliana U. Malamug

Abstract:

In sun drying, grain quality is greatly reduced when paddy grains are caught by the rain unsacked and unstored, resulting in reduced profit. The objectives of this study were to design and fabricate a quick covering machine for a grain drying pavement, and to test and evaluate the operating characteristics of the machine in terms of deployment speed, recovery speed, deployment time, recovery time, power consumption, and aesthetics of the laminated sack, as well as to conduct partial budget and cost curve analyses. The machine was able to cover the grains on a 12.8 m x 22.5 m grain drying pavement in an average time of 17.13 s. It consumed 0.53 W-hr for the deployment and recovery of the cover. The machine entailed an investment cost of $1,344.40 and an annual cost charge of $647.32. Moreover, the savings per year from using the quick covering machine were $101.83.

Keywords: quick, covering machine, grain, drying pavement

Procedia PDF Downloads 358
38888 Environmental Radioactivity Analysis by a Sequential Approach

Authors: G. Medkour Ishak-Boushaki, A. Taibi, M. Allab

Abstract:

Quantitative environmental radioactivity measurements are needed to determine the level of exposure of a population to ionizing radiation and for the assessment of the associated risks. Gamma spectrometry remains a very powerful tool for the analysis of radionuclides present in an environmental sample, but the basic problem in such measurements is the low rate of detected events. Using large environmental samples could help to get around this difficulty but, unfortunately, new issues are raised by gamma ray attenuation and self-absorption. Recently, a new method has been suggested to detect and identify, without quantification and in a short time, the gamma rays of a low-count source. This method does not require, as usually adopted in gamma spectrometry measurements, a pulse height spectrum acquisition. It is based on a chronological record of each detected photon through simultaneous measurement of its energy ε and its arrival time τ on the detector, the pair of parameters [ε,τ] defining an event mode sequence (EMS). The EMS series are analyzed sequentially by a Bayesian approach to detect the presence of a given radioactive source. The main object of the present work is to test the applicability of this sequential approach to the detection of radioactive environmental materials. Moreover, for an appropriate health oversight of the public and of the concerned workers, the analysis has been extended to obtain a reliable quantification of the radionuclides present in environmental samples. For illustration, we consider as an example the problem of detection and quantification of 238U. A Monte Carlo simulated experiment is carried out, consisting of the detection, by a Ge(HP) semiconductor junction, of 63 keV gamma rays emitted by 234Th (a progeny of 238U). The generated EMS series are analyzed by Bayesian inference. The application of the sequential Bayesian approach in environmental radioactivity analysis offers the possibility of reducing the measurement time without requiring large environmental samples, and consequently avoids the associated inconveniences. The work is still in progress.
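
A simplified NumPy sketch of the sequential idea described above: each detected photon's energy ε updates the posterior probability that a 63 keV line (234Th) is present on top of a flat background, using a likelihood ratio between a "background plus peak" and a "background only" spectral model. The peak width, background window, and source fraction are assumed illustrative values; arrival times and the full EMS formalism are omitted.

```python
import numpy as np

rng = np.random.default_rng(1)

E_LINE, SIGMA = 63.0, 1.0       # keV: assumed line energy and detector resolution
E_MIN, E_MAX = 20.0, 200.0      # keV: flat background window
SOURCE_FRACTION = 0.10          # assumed fraction of counts coming from the line

def pdf_background(e):
    return 1.0 / (E_MAX - E_MIN)

def pdf_with_source(e):
    peak = np.exp(-0.5 * ((e - E_LINE) / SIGMA) ** 2) / (SIGMA * np.sqrt(2 * np.pi))
    return SOURCE_FRACTION * peak + (1 - SOURCE_FRACTION) * pdf_background(e)

# Simulate an event-mode sequence of photon energies with the source present.
energies = np.concatenate([
    rng.normal(E_LINE, SIGMA, 30),        # 63 keV photons
    rng.uniform(E_MIN, E_MAX, 270),       # background photons
])
rng.shuffle(energies)

# Sequential Bayesian update of the log posterior odds "source present vs absent".
log_odds = 0.0                            # prior odds 1:1
for k, e in enumerate(energies, start=1):
    log_odds += np.log(pdf_with_source(e)) - np.log(pdf_background(e))
    if k % 100 == 0:
        p = 1.0 / (1.0 + np.exp(-log_odds))
        print(f"after {k:3d} photons: P(source present) = {p:.4f}")
```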

Keywords: Bayesian approach, event mode sequence, gamma spectrometry, Monte Carlo method

Procedia PDF Downloads 484
38887 Knowledge Based Behaviour Modelling and Execution in Service Robotics

Authors: Suraj Nair, Aravindkumar Vijayalingam, Alexander Perzylo, Alois Knoll

Abstract:

In the last decade, robotics research and development activities have grown rapidly, especially in the domain of service robotics. Integrating service robots into human-occupied spaces such as homes, offices, and hospitals has been increasingly worked upon. The primary motive is to ease the daily lives of humans by taking over some of the household and office chores. However, several challenges remain in systematically integrating such systems into human-shared workspaces. In addition to sensing and indoor-navigation challenges, the programmability of such systems is a major hurdle, because the potential user cannot be expected to have knowledge of robotics or similar mechatronic systems. In this paper, we propose a cognitive system for service robotics which allows non-expert users to easily model system behaviour in an underspecified manner through abstract tasks and the objects associated with them. The system uses domain knowledge expressed in the form of an ontology along with logical reasoning mechanisms to infer all the missing pieces of information required for executing the tasks. Furthermore, the system is also capable of recovering from failed tasks arising due to on-line disturbances by using the knowledge base and inferring alternate methods to execute the same tasks. The system is demonstrated through a coffee-fetching scenario in an office environment using a mobile robot equipped with sensors and software capabilities for autonomous navigation and human interaction through natural language.

Keywords: cognitive robotics, reasoning, service robotics, task based systems

Procedia PDF Downloads 225
38886 Studies on H2S Gas Sensing Performance of Al2O3-Doped ZnO Thick Films at Ppb Level

Authors: M. K. Deore

Abstract:

Thick films of undoped and Al2O3-doped ZnO were prepared by the screen printing technique. AR-grade (99.9% pure) zinc oxide powder was mixed mechanochemically in an acetone medium with aluminium chloride (AlCl3) in various weight percentages, namely 0.5, 1, 3, and 5 wt%, to obtain the Al2O3-ZnO composite. The prepared materials were sintered at 1000 °C for 12 h in air ambience and ball-milled to ensure a sufficiently fine particle size. The electrical, structural, and morphological properties of the films were investigated. X-ray diffraction analysis of pure and doped ZnO shows their polycrystalline nature. The surface morphology of the films was studied by SEM. The final composition of each film was determined by EDAX analysis. The gas response of the undoped and Al2O3-doped ZnO films was studied for different gases such as CO, H2, NH3, and H2S at operating temperatures ranging from 50 °C to 450 °C. The pure film shows a response to H2S gas (500 ppm) at 300 °C, while the film doped with 3 wt.% Al2O3 gives a good response to H2S gas at the ppb level at 350 °C. The selectivity, response, and recovery time of the sensor were measured and presented.

Keywords: thick films, ZnO-Al2O3, H2S gas, sensitivity, selectivity, response and recovery time

Procedia PDF Downloads 408
38885 Analysis in Mexico on Workers Performing Highly Repetitive Movements with Sensory Thermography in the Surface of the Wrist and Elbows

Authors: Sandra K. Enriquez, Claudia Camargo, Jesús E. Olguín, Juan A. López, German Galindo

Abstract:

Currently, companies face an increased number of cumulative trauma disorders (CTDs), which are rising significantly due to the highly repetitive movements (HRM) performed at workstations and cause economic losses to businesses through temporary and permanent disabilities of workers. This analysis focuses on the prevention of disorders caused by repetitiveness, duration, and effort, and on reducing cumulative trauma disorders as occupational diseases by using sensory thermography as a non-invasive method to evaluate the injuries workers could sustain from performing repetitive motions. Objectives: The aim is to define rest periods or job rotations before a CTD is generated, using sensory thermography to analyze changes in temperature patterns on the wrists and elbows while the worker performs HRM over a period of 2 hours and 30 minutes. Information on non-work variables such as wrist and elbow injuries, weight, gender, and age, among others, and on work variables such as workspace temperature, repetitiveness, and duration was also collected. Methodology: The analysis was conducted on 4 industrial designers in normal health, specifically 2 men and 2 women, at a business over a period of 12 days, using the following time ranges: on the first day, for every 90 minutes of continuous work the participants were asked to rest 5 minutes; on the second day, for every 90 minutes of continuous work they were asked to rest 10 minutes; and the same was done for 60 and 30 minutes of continuous work. Each worker was tested with 6 different ranges at least twice. The analysis was performed at a controlled room temperature between 20 and 25 °C, with a stabilization time of 20 minutes for the temperature of the wrists and elbows at the beginning and end of the analysis. Results: The maximum temperature (Tmax) of the wrists and elbows was registered in the range of 90 minutes of continuous work with a 5-minute rest. Tmax was 35.79 °C, with a difference of 2.79 °C between the initial and final temperatures of the left elbow, presented by individual 4 at minute 86 of the 90-minute work, 5-minute rest range. Conclusions: With sensory thermography as an alternative technology, it is possible to predict rotation or rest ranges for the prevention of CTDs in HRM work activities, thereby reducing occupational diseases and payments to health agencies and increasing workers' quality of life, making this technology cost-beneficial in the future.

Keywords: sensory thermography, temperature, cumulative trauma disorder (CTD), highly repetitive movement (HRM)

Procedia PDF Downloads 419
38884 Software Reliability Prediction Model Analysis

Authors: Lela Mirtskhulava, Mariam Khunjgurua, Nino Lomineishvili, Koba Bakuria

Abstract:

Software reliability prediction gives a great opportunity to measure the software failure rate at any point throughout system testing. A software reliability prediction model provides a technique for improving reliability. Software reliability is a very important factor in estimating overall system reliability, which depends on the individual component reliabilities. It differs from hardware reliability in that it reflects design perfection. The main reason for software reliability problems is the high complexity of software. Various approaches can be used to improve the reliability of software. We focus on a software reliability model in this article, assuming that there is time redundancy, the value of which (the number of repeated transmissions of basic blocks) can be an optimization parameter. We consider the given mathematical model under the assumption that the system may experience not only irreversible failures but also failures that can be treated as self-repairing and that significantly affect the reliability and accuracy of information transfer. The main task of this paper is to find the time distribution function (DF) of the transmission of an instruction sequence, which consists of a random number of basic blocks. We consider the system software unreliable; the time between adjacent failures has an exponential distribution.
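
A Monte Carlo sketch, under assumed illustrative parameters, of the quantity discussed above: the distribution function of the transmission time of an instruction sequence made up of a random number of basic blocks, where each block's transmission time is exponentially distributed and a block is retransmitted (time redundancy) when a self-repairing failure occurs. The rates, failure probability, and block-count model are placeholders, not the paper's analytical derivation.

```python
import numpy as np

rng = np.random.default_rng(0)

LAMBDA = 2.0         # rate of the exponential block-transmission time (assumed)
P_FAIL = 0.2         # probability a block transmission fails and is repeated (assumed)
MAX_REPEATS = 3      # time-redundancy parameter: allowed retransmissions per block
MEAN_BLOCKS = 10     # mean of the (Poisson) number of basic blocks in a sequence

def sequence_time():
    n_blocks = rng.poisson(MEAN_BLOCKS) + 1
    total = 0.0
    for _ in range(n_blocks):
        for _attempt in range(MAX_REPEATS + 1):
            total += rng.exponential(1.0 / LAMBDA)
            if rng.random() > P_FAIL:          # transmission of this block succeeded
                break
    return total

samples = np.array([sequence_time() for _ in range(20000)])
# Empirical distribution function of the sequence transmission time.
for t in (2.0, 5.0, 8.0, 12.0):
    print(f"P(T <= {t:4.1f}) = {np.mean(samples <= t):.3f}")
```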

Keywords: exponential distribution, conditional mean time to failure, distribution function, mathematical model, software reliability

Procedia PDF Downloads 449
38883 Investigating the Vehicle-Bicyclists Conflicts using LIDAR Sensor Technology at Signalized Intersections

Authors: Alireza Ansariyar, Mansoureh Jeihani

Abstract:

Light Detection and Ranging (LiDAR) sensors are capable of recording traffic data, including the number of passing vehicles and bicyclists, the speed of vehicles and bicyclists, and the number of conflicts between both road users. In order to collect real-time traffic data and investigate the safety of different road users, a LiDAR sensor was installed at the Cold Spring Ln – Hillen Rd intersection in Baltimore City. The frequency and severity of the collected real-time conflicts were analyzed, and the results highlighted that 122 conflicts were recorded over a 10-month interval from May 2022 to February 2023. Using an innovative image-processing algorithm, a new safety Measure of Effectiveness (MOE) was proposed to recognize the critical zones for bicyclists entering the intersection. Considering the trajectories of the conflicts, the results of the analysis demonstrated that conflicts in the northern approach (zone N) are more frequent and severe. Additionally, sunny weather is more likely to cause severe vehicle-bike conflicts.
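
A minimal sketch of the post-encroachment time (PET) logic named in the keywords: given the time the first road user leaves a conflict zone and the time the second road user enters it, the PET is their difference, and an interaction is flagged as a conflict when PET falls below a chosen threshold. The events and the 2-second threshold here are illustrative, not the sensor's actual output or the study's threshold.

```python
from dataclasses import dataclass

PET_THRESHOLD_S = 2.0   # assumed threshold; practitioners commonly use a few seconds

@dataclass
class ZoneEvent:
    zone: str              # conflict zone identifier, e.g. "N" for the northern approach
    first_exit_s: float    # time the first road user leaves the zone
    second_entry_s: float  # time the second road user enters the same zone

def pet(event: ZoneEvent) -> float:
    return event.second_entry_s - event.first_exit_s

events = [
    ZoneEvent("N", first_exit_s=12.4, second_entry_s=13.1),
    ZoneEvent("S", first_exit_s=40.0, second_entry_s=44.8),
    ZoneEvent("N", first_exit_s=75.2, second_entry_s=76.9),
]

for e in events:
    p = pet(e)
    status = "CONFLICT" if p < PET_THRESHOLD_S else "no conflict"
    print(f"zone {e.zone}: PET = {p:.1f} s -> {status}")
```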

Keywords: LiDAR sensor, post encroachment time threshold (PET), vehicle-bike conflicts, a measure of effectiveness (MOE), weather condition

Procedia PDF Downloads 210
38882 Failure Analysis Using Rtds for a Power System Equipped with Thyristor-Controlled Series Capacitor in Korea

Authors: Chur Hee Lee, Jae in Lee, Minh Chau Diah, Jong Su Yoon, Seung Wan Kim

Abstract:

This paper deals with a Real Time Digital Simulator (RTDS) analysis of the effects of transmission line failures in a power system equipped with a Thyristor-Controlled Series Capacitor (TCSC) in Korea. The TCSC is applied for the first time in Korea to compensate real power in the case of 765 kV line faults. Therefore, it is important to perform the analysis with a TCSC replica using RTDS. In this test, all parts of the Korean system other than those near the TCSC were reduced to a Thevenin equivalent. The replica was tested in the cases of a line failure near the TCSC, a generator failure, and a 765 kV line failure. The effects of a conventionally operated STATCOM, SVC, and TCSC were also analyzed. The test results will be used for the analysis of the actual TCSC's operational impact.

Keywords: failure analysis, power system, RTDS, TCSC

Procedia PDF Downloads 110
38881 The Predictive Implication of Executive Function and Language in Theory of Mind Development in Preschool Age Children

Authors: Michael Luc Andre, Célia Maintenant

Abstract:

Theory of mind is a milestone in child development which allows children to understand that others can have mental states different from their own. Understanding the developmental stages of theory of mind in children has led researchers to two connected research problems: on the one hand, the link between executive function and theory of mind, and on the other hand, the relationship between theory of mind and syntax processing. These two lines of research have produced a large literature full of important results, despite a certain level of disagreement among researchers. For a long time, these two research perspectives continued to develop separately, despite research conclusions suggesting that the three variables should implicate the same developmental period. Indeed, our goal was to study the relation between theory of mind, executive function, and language via a single research question: whether, between executive function and language, one of the two variables could play a critical role in the relationship between theory of mind and the other variable. Thus, 112 children aged between three and six years old were recruited to complete a receptive and an expressive vocabulary task, a syntax understanding task, a theory of mind task, and three executive function tasks (inhibition, cognitive flexibility, and working memory). The results showed significant correlations between performance on the theory of mind task and performance on the executive function tasks, except for the cognitive flexibility task. We also found significant correlations between success on the theory of mind task and performance on all language tasks. Multiple regression analysis identified only syntax and general language abilities as possible predictors of theory of mind performance in our sample of preschool-age children. The results are discussed from the perspective of a major role of language abilities in theory of mind development. We also discuss possible reasons that could explain the non-significance of the executive domains in predicting theory of mind performance, and the implications of our results for the literature.

Keywords: child development, executive function, general language, syntax, theory of mind

Procedia PDF Downloads 50
38880 Confirmatory Factor Analysis of Smartphone Addiction Inventory (SPAI) in the Yemeni Environment

Authors: Mohammed Al-Khadher

Abstract:

Currently, we are witnessing rapid advancements in the field of information and communications technology, forcing us, as psychologists, to address the psychological and social effects of such developments. It also drives us to continually work on the development and preparation of measurement tools compatible with the changes brought about by the digital revolution. In this context, the current study aimed to examine the factor structure of the Smartphone Addiction Inventory (SPAI) in the Republic of Yemen. The sample consisted of 1,920 university students (1,136 males and 784 females) who answered the inventory, and the data were analyzed using the statistical software AMOS V25. The factor analysis results showed a good fit of the five-factor model to the data, with indices of RMSEA = .052, CFI = .910, GFI = .931, AGFI = .915, TLI = .897, NFI = .895, RFI = .880, and RMR = .032, all within the acceptable range, supporting the fit of the scale's factor model. The confirmatory factor analysis results showed that 4 items loaded on Time Spent, 4 items on Compulsivity, 8 items on Daily Life Interference, 5 items on Craving, and 3 items on Sleep Interference; all standardized factor loadings were statistically significant (p < .001).

Keywords: smartphone addiction inventory (SPAI), confirmatory factor analysis (CFA), yemeni students, people at risk of smartphone addiction

Procedia PDF Downloads 72
38879 Synthesis of Zeolites from Bauxite and Kaolin: Effect of Synthesis Parameters on Competing Phases

Authors: Bright Kwakye-Awuah, Elizabeth Von-Kiti, Isaac Nkrumah, Baah Sefa-Ntiri, Craig D. Williams

Abstract:

Bauxite and kaolin from the Ghana Bauxite Company mine site were used to synthesize zeolites, with bauxite serving as the alumina source and kaolin as the silica source. The synthesis variations included variation of the aging time at constant crystallization time and variation of the crystallization time at constant aging time. Characterization techniques such as X-ray diffraction (XRD), scanning electron microscopy (SEM), energy dispersive X-ray analysis (EDX), and Fourier transform infrared spectroscopy (FTIR) were employed in the characterization of the raw samples as well as the synthesized samples. The results obtained showed that the transformations that occurred and the phase of the resulting products were governed by the aging time, crystallization time, alkaline concentration, and Si/Al ratio of the system. Zeolites A, X, Y, analcime, sodalite, and ZK-14 were some of the phases achieved. Zeolite LTA was obtained with short crystallization times of 3, 5, 18, and 24 hours and a maximum aging of 24 hours. Zeolite LSX was synthesized with 24 hr aging followed by 24 hr hydrothermal treatment, whilst zeolite Y crystallized after 48 hr of aging and 24 hr of crystallization. Prolonged crystallization times produced mixed-phase products. Prolonged aging times, on the other hand, did not yield any zeolite, as the sample remained amorphous. Increasing the alkaline content of the reaction mixture above 5 M introduced a sodalite phase into the final product. The properties of the final products were comparable to those of zeolites synthesized from pure chemical reagents.

Keywords: bauxite, kaolin, aging, crystallization, zeolites

Procedia PDF Downloads 206
38878 Entropy Generation Minimization in a Porous Pipe Heat Exchanger under Magnetohydrodynamics Using Cattaneo-Christov Heat Flux

Authors: Saima Ijaz, Muhammad Mushtaq, Sufian Munawar

Abstract:

This article is devoted to the second law analysis of the Cattaneo-Christov heat flux for a non-Newtonian fluid in a moving porous pipe under the intensification of a magnetic field and a heat source/sink. The non-Newtonian fluid is considered to have Maxwell-fluid characteristics. The Cattaneo-Christov model takes into account a specific relaxation time for heat transfer. The main causes responsible for entropy generation are viscous dissipation, heat transfer, and Joule heating. An analytical method, the Homotopy Analysis Method (HAM), is utilized to solve the non-linear governing equations of the underlying model. Mathematical results are shown with graphs and tables. In this work, the parameters responsible for an increase or decrease in entropy generation are identified: the porosity, magnetic field effects, and heat source/sink rate are in the former category, and the Cattaneo-Christov relaxation time is in the latter. These results are new contributions for the case of internal flow in a pipe and would be helpful for entropy generation reduction strategies.

Keywords: Cattaneo-Christov heat flux, entropy generation analysis, heat source / sink, joule heating, non-newtonian fluid, porous pipe

Procedia PDF Downloads 16
38877 Application of Lean Six Sigma Tools to Minimize Time and Cost in Furniture Packaging

Authors: Suleiman Obeidat, Nabeel Mandahawi

Abstract:

In this work, the packaging process for a house move is improved. The customers of such a move need their household goods to be moved from their current house to the new one with minimum damage, in an organized manner, on time, and at minimum cost. Our goal was to improve time efficiency by 10% to 20%, achieve a 90% reduction in damaged parts, and obtain an acceptable improvement in the cost of the total move process. The expected ROI was 833%. Many improvement techniques have been applied to the way the boxes are prepared, their preparation cost, packing the goods, labeling them, and moving them to a staging area for the move. The DMAIC technique is used in this work: a SIPOC diagram, a value stream map of the "As Is" process, a root cause analysis, maps of the "Future State" and the "Ideal State", and an improvement plan. A value of ROI = 624% is obtained, which is lower than the expected value of 833%. The work explains the improvement techniques and the deficiencies in the old process.

Keywords: packaging, lean tools, six sigma, DMAIC methodology, SIPOC

Procedia PDF Downloads 413
38876 Electromagnetic Wave Propagation Equations in 2D by Finite Difference Method

Authors: N. Fusun Oyman Serteller

Abstract:

In this paper, techniques to solve time-dependent electromagnetic wave propagation equations based on the Finite Difference Method (FDM) are proposed, and the results are compared with the Finite Element Method (FEM) in 2D while discussing some special simulation examples. Here, 2D dynamical wave equations for lossy media, even with a constant source, are discussed to establish symbolic manipulation of wave propagation problems. The main objective of this contribution is to introduce a comparative study of two suitable numerical methods and to show that both methods can be applied effectively and efficiently to all types of wave propagation problems, both linear and nonlinear cases, by using symbolic computation. However, the results show that the FDM is more appropriate for solving the nonlinear cases in the symbolic solution. Furthermore, some specific complex-domain examples of the comparison of electromagnetic wave equations are considered. Calculations are performed with the Mathematica software, making some useful contributions to the program and leveraging symbolic evaluations of FEM and FDM.
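
A compact NumPy sketch of the FDM side of the comparison: a second-order leapfrog update of the 2D scalar wave equation with a simple loss term and a constant source, using the standard five-point Laplacian. The grid size, loss coefficient, and source placement are illustrative; the symbolic Mathematica treatment and the FEM comparison of the paper are not reproduced.

```python
import numpy as np

# 2D lossy wave equation: u_tt = c^2 (u_xx + u_yy) - sigma * u_t + source
N, h = 101, 0.01                 # grid points per side, grid spacing
c, sigma = 1.0, 5.0              # wave speed and loss coefficient (illustrative)
dt = 0.005                       # satisfies the 2D CFL limit c*dt/h <= 1/sqrt(2)
steps = 400

u_prev = np.zeros((N, N))
u = np.zeros((N, N))
source = np.zeros((N, N))
source[N // 2, N // 2] = 100.0   # constant source at the centre of the domain

for _ in range(steps):
    lap = np.zeros_like(u)
    lap[1:-1, 1:-1] = (u[2:, 1:-1] + u[:-2, 1:-1] + u[1:-1, 2:] + u[1:-1, :-2]
                       - 4.0 * u[1:-1, 1:-1]) / h**2     # five-point Laplacian
    u_next = (2.0 * u - (1.0 - sigma * dt / 2.0) * u_prev
              + dt**2 * (c**2 * lap + source)) / (1.0 + sigma * dt / 2.0)
    u_next[0, :] = u_next[-1, :] = u_next[:, 0] = u_next[:, -1] = 0.0  # Dirichlet walls
    u_prev, u = u, u_next

print("peak field amplitude after", steps, "steps:", float(np.abs(u).max()))
```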

Keywords: finite difference method, finite element method, linear-nonlinear PDEs, symbolic computation, wave propagation equations

Procedia PDF Downloads 130
38875 Enhancer: An Effective Transformer Architecture for Single Image Super Resolution

Authors: Pitigalage Chamath Chandira Peiris

Abstract:

A widely researched domain in the field of image processing in recent times has been single image super-resolution, which tries to restore a high-resolution image from a single low-resolution image. Many single image super-resolution efforts have been completed utilizing both traditional and deep learning methodologies, as well as a variety of other approaches. Deep learning-based super-resolution methods, in particular, have received significant interest. As of now, the most advanced image restoration approaches are based on convolutional neural networks; nevertheless, only a few efforts have been made using Transformers, which have demonstrated excellent performance on high-level vision tasks. The effectiveness of CNN-based algorithms in image super-resolution has been impressive. However, these methods cannot completely capture the non-local features of the data. Enhancer is a simple yet powerful Transformer-based approach for enhancing the resolution of images. A method for single image super-resolution was developed in this study, which utilizes an efficient and effective transformer design. The proposed architecture makes use of a locally enhanced window transformer block, in which non-overlapping window-based self-attention alleviates the enormous computational load associated with global self-attention. Additionally, it incorporates depth-wise convolution in the feed-forward network to enhance its ability to capture local context. The study is assessed by comparing the results obtained on popular datasets to those obtained by other techniques in the domain.
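
A condensed PyTorch sketch of the two ingredients named above, written as a generic block rather than the paper's exact Enhancer architecture: self-attention restricted to non-overlapping windows (avoiding the quadratic cost of global attention) and a feed-forward network with a depth-wise convolution to inject local context. Dimensions, window size, and head count are illustrative.

```python
import torch
import torch.nn as nn

class WindowAttention(nn.Module):
    """Multi-head self-attention computed independently inside non-overlapping windows."""
    def __init__(self, dim, window, heads):
        super().__init__()
        self.window = window
        self.norm = nn.LayerNorm(dim)
        self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)

    def forward(self, x):                       # x: (B, C, H, W), H and W divisible by window
        B, C, H, W = x.shape
        w = self.window
        # partition the feature map into (B * num_windows, w*w, C) token sequences
        t = x.view(B, C, H // w, w, W // w, w).permute(0, 2, 4, 3, 5, 1).reshape(-1, w * w, C)
        t = self.norm(t)
        out, _ = self.attn(t, t, t)
        # merge the windows back into a (B, C, H, W) feature map
        out = out.reshape(B, H // w, W // w, w, w, C).permute(0, 5, 1, 3, 2, 4)
        return x + out.reshape(B, C, H, W)      # residual connection

class ConvFFN(nn.Module):
    """Feed-forward network with a depth-wise convolution for local context."""
    def __init__(self, dim, hidden):
        super().__init__()
        self.expand = nn.Conv2d(dim, hidden, 1)
        self.dwconv = nn.Conv2d(hidden, hidden, 3, padding=1, groups=hidden)
        self.project = nn.Conv2d(hidden, dim, 1)
        self.act = nn.GELU()

    def forward(self, x):
        return x + self.project(self.act(self.dwconv(self.act(self.expand(x)))))

block = nn.Sequential(WindowAttention(dim=64, window=8, heads=4), ConvFFN(dim=64, hidden=128))
feat = torch.randn(1, 64, 32, 32)               # feature map from a shallow conv stem
print(block(feat).shape)                        # torch.Size([1, 64, 32, 32])
```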

Keywords: single image super resolution, computer vision, vision transformers, image restoration

Procedia PDF Downloads 91
38874 Energy-Aware Scheduling in Real-Time Systems: An Analysis of Fair Share Scheduling and Priority-Driven Preemptive Scheduling

Authors: Su Xiaohan, Jin Chicheng, Liu Yijing, Burra Venkata Durga Kumar

Abstract:

Energy-aware scheduling in real-time systems aims to minimize energy consumption, but issues related to resource reservation and timing constraints remain challenges. This study focuses on analyzing two scheduling algorithms, Fair-Share Scheduling (FFS) and Priority-Driven Preemptive Scheduling (PDPS), for addressing these issues and achieving energy-aware scheduling in real-time systems. Based on research on both algorithms and on the processes of solving the two problems, it is found that Fair-Share Scheduling ensures fair allocation of resources but needs improvement under an imbalanced system load, while Priority-Driven Preemptive Scheduling prioritizes tasks based on criticality to meet timing constraints through preemption but relies heavily on task prioritization and may not be energy efficient. Therefore, improvements to both algorithms with energy-aware features are proposed. Future work should focus on developing hybrid scheduling techniques that minimize energy consumption through intelligent task prioritization, resource allocation, and the meeting of timing constraints.
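
A toy sketch of the PDPS behaviour described above: a fixed-priority preemptive scheduler simulated tick by tick, where a newly released higher-priority job always displaces the running one. Task periods, execution times, and priorities are illustrative, and no energy model is included.

```python
from dataclasses import dataclass

@dataclass
class Task:
    name: str
    period: int        # release period in ticks
    wcet: int          # worst-case execution time in ticks
    priority: int      # lower number = higher priority
    remaining: int = 0

tasks = [Task("sensor", period=5, wcet=1, priority=0),
         Task("control", period=10, wcet=3, priority=1),
         Task("logging", period=20, wcet=6, priority=2)]

timeline = []
for tick in range(20):
    for t in tasks:                       # release a new job at each task's period
        if tick % t.period == 0:
            t.remaining += t.wcet
    ready = [t for t in tasks if t.remaining > 0]
    if ready:
        running = min(ready, key=lambda t: t.priority)   # preemptive: highest priority wins
        running.remaining -= 1
        timeline.append(running.name)
    else:
        timeline.append("idle")

print(" ".join(timeline))
```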

Keywords: energy-aware scheduling, fair-share scheduling, priority-driven preemptive scheduling, real-time systems, optimization, resource reservation, timing constraints

Procedia PDF Downloads 107
38873 Connecting Students and Faculty Research Efforts through the Research and Projects Portal

Authors: Havish Nalapareddy, Mark V. Albert, Ranak Bansal, Avi Udash, Lin Lin

Abstract:

Students engage in many course projects during their degree programs. However, impactful projects often need a time frame longer than a single semester. Ideally, projects are documented and structured to be readily accessible to future students who may choose to continue the project, with features that emphasize the local community, university, or course structure. The Research and Project Portal (RAPP) is a place where students can post both their completed and ongoing projects with all the resources and tools used. This portal allows students to see what other students have done in the past, in the same university environment, related to their domain of interest. Computer science instructors or students selecting projects can use this portal to assign or choose an incomplete project. Additionally, this portal allows non-computer science faculty and industry collaborators to document their project ideas for students in courses to prototype directly, rather than directly soliciting the help of instructors in engaging students. RAPP serves as a platform linking students across classes and faculty both in and out of computer science courses on joint projects to encourage long-term project efforts across semesters or years.

Keywords: education, technology, research, academic portal

Procedia PDF Downloads 125
38872 AI-Enabled Smart Contracts for Reliable Traceability in the Industry 4.0

Authors: Harris Niavis, Dimitra Politaki

Abstract:

The manufacturing industry has been collecting vast amounts of data for monitoring product quality thanks to advances in the ICT sector, and dedicated IoT infrastructure is deployed to track and trace the production line. However, industries have not yet managed to unleash the full potential of these data due to defective data collection methods and untrusted data storage and sharing. Blockchain is gaining increasing ground as a key technology enabler for Industry 4.0 and the smart manufacturing domain, as it enables the secure storage and exchange of data between stakeholders. On the other hand, AI techniques are increasingly used to detect anomalies in batch and time-series data, enabling the identification of unusual behaviors. The proposed scheme is based on smart contracts to enable automation and transparency in the data exchange, coupled with anomaly detection algorithms to enable reliable data ingestion into the system. Before sensor measurements are fed to the blockchain component and the smart contracts, the anomaly detection mechanism combines artificial intelligence models to effectively detect unusual values such as outliers and extreme deviations in the incoming data. Specifically, autoregressive integrated moving average (ARIMA), long short-term memory (LSTM) and dense autoencoders, as well as generative adversarial network (GAN) models, are used to detect both point and collective anomalies. Towards the goal of preserving the privacy of industries' information, the smart contracts employ techniques to ensure that only anonymized pointers to the actual data are stored on the ledger, while sensitive information remains off-chain. In the same spirit, blockchain technology guarantees the security of the data storage through strong cryptography, as well as the integrity of the data through the decentralization of the network and the execution of the smart contracts by the majority of the blockchain network actors. The blockchain component of the Data Traceability Software is based on the Hyperledger Fabric framework, which lays the ground for the deployment of smart contracts and APIs that expose the functionality to the end users. The results of this work demonstrate that such a system can increase the quality of the end products and the trustworthiness of the monitoring process in the smart manufacturing domain. The proposed AI-enabled data traceability software can be employed by industries to accurately trace and verify records about quality throughout the entire production chain and to take advantage of the multitude of monitoring records in their databases.
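
A minimal Keras sketch of one of the detectors listed above, a dense autoencoder for point anomalies: the model is trained to reconstruct normal sensor windows, and readings whose reconstruction error exceeds a threshold are flagged before they would be passed on to the smart contracts. The synthetic data, window size, and threshold rule are illustrative assumptions, not the system's actual configuration.

```python
import numpy as np
from tensorflow import keras

rng = np.random.default_rng(0)

# Synthetic "normal" sensor windows: a slowly varying level plus small noise.
levels = rng.normal(loc=0.0, scale=1.0, size=(2000, 1))
normal = levels + rng.normal(scale=0.1, size=(2000, 8))

# Dense autoencoder: compress each 8-sample window to 3 values and reconstruct it.
autoencoder = keras.Sequential([
    keras.Input(shape=(8,)),
    keras.layers.Dense(16, activation="relu"),
    keras.layers.Dense(3, activation="relu"),
    keras.layers.Dense(16, activation="relu"),
    keras.layers.Dense(8),
])
autoencoder.compile(optimizer="adam", loss="mse")
autoencoder.fit(normal, normal, epochs=20, batch_size=64, verbose=0)

# Threshold set from the reconstruction error on normal data (99th percentile).
recon = autoencoder.predict(normal, verbose=0)
errors = np.mean((normal - recon) ** 2, axis=1)
threshold = np.percentile(errors, 99)

# A window with a spike should exceed the threshold and be flagged before ingestion.
anomalous = np.array([[0.2, 0.2, 0.1, 0.3, 5.0, 0.2, 0.2, 0.1]])
err = np.mean((anomalous - autoencoder.predict(anomalous, verbose=0)) ** 2)
print(f"threshold={threshold:.4f}, anomalous window error={err:.4f}, "
      f"flag={'ANOMALY' if err > threshold else 'ok'}")
```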

Keywords: blockchain, data quality, industry4.0, product quality

Procedia PDF Downloads 167