Search results for: real time digital simulator
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 22675

13645 Resource Sharing Issues of Distributed Systems Influences on Healthcare Sector Concurrent Environment

Authors: Soo Hong Da, Ng Zheng Yao, Burra Venkata Durga Kumar

Abstract:

The healthcare sector is a business that provides medical services, manufactures medical equipment and drugs, and offers medical insurance to the public. Most of the data stored in healthcare databases relates to patient information, which must be accurate whenever it is accessed by authorized stakeholders. In distributed systems, concurrency is a central issue, as it ensures that shared resources remain synchronized and consistent across multiple read and write operations by multiple clients. The concurrency problems in the healthcare sector are who gets access, and how shared data is kept synchronized and consistent when two or more stakeholders attempt to access it simultaneously. In this paper, a framework suited to a distributed healthcare concurrent environment is proposed. The framework defines four levels of database nodes: national center, regional center, referral center, and local center. Moreover, the synchronization scheme is not symmetrical: two synchronization techniques, complete and partial synchronization, are explained. Furthermore, for simultaneous access by multiple clients, synchronization types are discussed with cases at different levels and priorities to ensure data remains synchronized throughout the processes.
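The synchronization requirement described above can be illustrated with a minimal mutual-exclusion sketch (a generic illustration, not the paper's framework): multiple clients writing to a shared patient record serialize their updates through a lock, so concurrent reads always observe a consistent value.

```python
import threading

class PatientRecord:
    """Shared record guarded by a mutex; a generic illustration, not the paper's framework."""
    def __init__(self):
        self._lock = threading.Lock()
        self._visits = 0

    def read_visits(self):
        # Readers take the lock too, so they always observe a consistent value.
        with self._lock:
            return self._visits

    def add_visit(self):
        # Writers serialize; concurrent increments can never be lost.
        with self._lock:
            self._visits += 1

record = PatientRecord()
workers = [threading.Thread(target=lambda: [record.add_visit() for _ in range(1000)])
           for _ in range(4)]
for t in workers:
    t.start()
for t in workers:
    t.join()
print(record.read_visits())  # 4000: no updates lost despite 4 concurrent writers
```

A real multi-level deployment would replace the single in-process lock with distributed concurrency control, but the consistency goal is the same.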

Keywords: resources, healthcare, concurrency, synchronization, stakeholders, database

Procedia PDF Downloads 145
13644 Jan’s Life-History: Changing Faces of Managerial Masculinities and Consequences for Health

Authors: Susanne Gustafsson

Abstract:

Life-history research is an extraordinarily fruitful method for social analysis, and for gendered health analysis in particular. Its potential is illustrated through a case study drawn from a Swedish project. It reveals an old type of masculinity that faces difficulties when carrying out two sets of demands simultaneously, as a worker/manager and as a father/husband. The paper illuminates the historical transformation of masculinity and its consequences for health. We draw on the idea of the "changing faces of masculinity" to explore the dynamism and complexity of gendered health. An empirical case is used for its illustrative abilities. Jan, a middle-level manager and father employed in the energy sector in urban Sweden, is the subject of this paper. Jan's story is one of 32 semi-structured interviews included in an extended study focusing on well-being at work. The results reveal a face of masculinity conceived of in middle-level management as tacitly linked to the neoliberal doctrine. Over a couple of decades, the idea of "flexibility" was turned into a valuable characteristic that everyone was supposed to strive for. This resulted in increased workloads. Quite a few employees, and managers in particular, find themselves working both day and night. This may explain why not having enough time to spend with children and family members is a recurring theme in the data. Can this way of doing gender be linked to masculinity and health? The first author's research has revealed that the use of gender in health science is not sufficiently or critically questioned. This lack of critical questioning is a serious problem, especially since ways of doing gender affect health. We suggest that gender reproduction and gender transformation are interconnected, regardless of how they affect health. They are recognized as two sides of the same phenomenon, and minor movements in one direction or the other become crucial for understanding their relation to health.
At more or less the same time as Jan's masculinity was reproduced in response to workplace practices, Jan's family position was transformed: not totally, but by a degree or two, and these degrees became significant for the family's health and well-being. By moving back and forth between varied events in Jan's biographical history and his sociohistorical life span, it becomes possible to show that in a time of gender transformations, power relations can be renegotiated, with consequences for health.

Keywords: changing faces of masculinity, gendered health, life-history research method, subverter

Procedia PDF Downloads 106
13643 Taguchi-Based Optimization of Surface Roughness and Dimensional Accuracy in Wire EDM Process with S7 Heat Treated Steel

Authors: Joseph C. Chen, Joshua Cox

Abstract:

This research focuses on the use of the Taguchi method to reduce surface roughness and improve the dimensional accuracy of parts machined by wire electrical discharge machining (wire EDM) from S7 heat-treated steel. Due to its high impact toughness, the material is a candidate for a wide variety of tooling applications that require high dimensional precision and a desired surface roughness. This paper demonstrates that the Taguchi parameter design methodology can successfully optimize both dimensional accuracy and surface roughness by investigating seven controllable wire-EDM parameters: pulse on time (ON), pulse off time (OFF), servo voltage (SV), voltage (V), servo feed (SF), wire tension (WT), and wire speed (WS). The temperature of the water in the wire EDM process is investigated as the noise factor. Experimental design and analysis based on an L18 Taguchi orthogonal array are conducted. This paper demonstrates that the Taguchi-based approach enables the wire EDM process to produce (1) high-precision parts with an average dimension of 0.6601 inches against a desired dimension of 0.6600 inches, and (2) a surface roughness of 1.7322 microns, significantly improved from 2.8160 microns.
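Taguchi analyses of this kind rest on signal-to-noise (S/N) ratios computed per orthogonal-array run. As a hedged sketch (the replicate values below are hypothetical, not the paper's data), the two S/N formulations relevant here, smaller-the-better for roughness and nominal-the-best for a dimensional target, can be computed as:

```python
import math

def sn_smaller_is_better(values):
    """Taguchi smaller-the-better S/N ratio in dB (used for surface roughness)."""
    return -10.0 * math.log10(sum(v * v for v in values) / len(values))

def sn_nominal_is_best(values):
    """Taguchi nominal-the-best S/N ratio: 10*log10(mean^2 / variance),
    used for a dimensional target such as 0.6600 in."""
    n = len(values)
    mean = sum(values) / n
    var = sum((v - mean) ** 2 for v in values) / (n - 1)
    return 10.0 * math.log10(mean * mean / var)

# Hypothetical roughness replicates (microns) for one L18 run:
print(round(sn_smaller_is_better([1.7, 1.8, 1.75]), 2))  # -4.86
```

The factor levels whose average S/N is highest are then selected as the optimum setting.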

Keywords: Taguchi Parameter Design, surface roughness, Wire EDM, dimensional accuracy

Procedia PDF Downloads 368
13642 Analyzing How Working From Home Can Lead to Higher Job Satisfaction for Employees Who Have Care Responsibilities Using Structural Equation Modeling

Authors: Christian Louis Kühner, Florian Pfeffel, Valentin Nickolai

Abstract:

Taking care of children, dependents, or pets can be a difficult and time-consuming task. Especially for part- and full-time employees, it can feel exhausting and overwhelming to meet these obligations on top of a job. Thus, working mostly at home, without having to commute to the company, can save valuable time and stress. This study aims to show the influence that the work model has on the job satisfaction of employees with care responsibilities, in comparison to employees who do not have such obligations. Using structural equation modeling (SEM), the three work models, working from home, working from the office, and a hybrid model, were analyzed based on 13 constructs influencing job satisfaction. These 13 factors were further grouped into "classic influencing factors", "influencing factors changed by remote working", and "new remote working influencing factors". Based on the influencing factors on job satisfaction, an online survey was conducted with n = 684 employees from the service sector. Cronbach's alpha of the individual constructs was shown to be suitable. Furthermore, the construct validity of the constructs was confirmed by face validity, content validity, convergent validity (AVE > 0.5; CR > 0.7), and discriminant validity. In addition, confirmatory factor analysis (CFA) confirmed the model fit for the investigated sample (CMIN/DF: 2.567; CFI: 0.927; RMSEA: 0.048). The SEM analysis showed that the most significant influencing factor on job satisfaction is "identification with the work" with β = 0.540, followed by "appreciation" (β = 0.151), "compensation" (β = 0.124), "work-life balance" (β = 0.116), and "communication and exchange of information" (β = 0.105).
While the significance of each factor can vary depending on the work model, the SEM analysis shows that identification with the work is the most significant factor in all three work models and, in the case of the traditional office work model, the only significant influencing factor. The study shows that among employees with care responsibilities, the higher the proportion of working from home compared to working from the office, the more satisfied the employees are with their job. Since the work models that accommodate comprehensive care led to higher job satisfaction amongst employees with such obligations, adapting as a company to such private obligations of employees can be crucial to sustained success. Conversely, satisfaction with the office-based work model is higher among workers without caregiving responsibilities.
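The reliability check mentioned above, Cronbach's alpha per construct, can be sketched as follows; the item scores are invented Likert-style data, not the survey's:

```python
def cronbach_alpha(items):
    """Cronbach's alpha from a list of item-score lists (one list per item,
    same respondents in the same order); the data below are invented."""
    k = len(items)
    n = len(items[0])

    def var(xs):  # sample variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    totals = [sum(item[i] for item in items) for i in range(n)]
    return (k / (k - 1)) * (1 - sum(var(it) for it in items) / var(totals))

# Three hypothetical Likert items answered by four respondents:
print(round(cronbach_alpha([[4, 5, 3, 4], [4, 4, 3, 5], [5, 5, 2, 4]]), 2))  # 0.82
```

Values above roughly 0.7 are conventionally read as acceptable internal consistency for a construct.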

Keywords: care responsibilities, home office, job satisfaction, structural equation modeling

Procedia PDF Downloads 80
13641 Understanding How to Increase Restorativeness of Interiors: A Qualitative Exploratory Study on Attention Restoration Theory in Relation to Interior Design

Authors: Hande Burcu Deniz

Abstract:

People in the U.S. spend a considerable portion of their time indoors, which makes it crucial to provide environments that support well-being. Restorative environments help people recover the cognitive resources depleted by intensive use of directed attention. Spending time in nature and taking a nap are two of the best ways to restore these resources; however, neither is possible much of the time. Many studies have revealed how nature and time spent in natural contexts can boost restoration, but fewer studies have examined how cognitive resources can be restored in interior settings. This study aims to explore the question: which qualities of interiors increase the restorativeness of an interior setting, and how do they mediate that restorativeness? To do this, a phenomenological qualitative study was conducted, focused on the definition of attention restoration and experiences of the phenomenon. As the themes emerged, they were matched with the Attention Restoration Theory (ART) components (being away, extent, fascination, compatibility) to examine how interior design elements mediate the restorativeness of an interior. The data were gathered from semi-structured interviews with international residents of Minnesota. The interviewees represent young professionals who work in Minnesota and often experience mental fatigue. They also have fewer emotional connections with places in Minnesota, which grounded the data in the physical qualities of spaces rather than emotional connections. In the interviews, participants were asked where they prefer to be when they experience mental fatigue. Next, they were asked to describe the physical qualities of the places they prefer, with reasons. Four themes were derived from the analysis of the interviews, listed here in order of frequency.
The first and most common theme was "connection to the outside". The analysis showed that people need to be either physically or visually connected to the outside to recover from mental fatigue. A direct connection to nature was reported as preferable, whereas urban settings were a secondary preference, along with interiors. The second theme to emerge from the analysis was "the presence of artwork", which was experienced differently by the interviewees. The third theme was "amenities": interviews pointed out that people prefer to have amenities that support the desired activity during recovery from mental fatigue. The last theme was "aesthetics": interviewees stated that they prefer places that are pleasing to their eyes, and that they could not shake the feeling of being worn out in places that are not well designed. When the themes were matched with the four ART components (being away, extent, fascination, compatibility), some interior qualities overlapped, since they were experienced differently by the interviewees. In conclusion, this study shows that interior settings have restorative potential and that their experience is multidimensional.

Keywords: attention restoration, fatigue, interior design, qualitative study, restorative environments

Procedia PDF Downloads 252
13640 Investigation of an Alkanethiol Modified Au Electrode as Sensor for the Antioxidant Activity of Plant Compounds

Authors: Dana A. Thal, Heike Kahlert, Fritz Scholz

Abstract:

Thiol molecules are known to readily form self-assembled monolayers (SAMs) on Au surfaces. Depending on the thiol's structure, surface modification via SAMs can be used for electrode sensor development. In the presented work, 1-decanethiol-coated polycrystalline Au electrodes were applied to indirectly assess the radical scavenging potential of plant compounds and extracts. Different plant compounds with reported antioxidant properties, as well as an extract from the plant Gynostemma pentaphyllum, were tested for their effectiveness in preventing SAM degradation on the sensor electrodes by photolytically generated radicals in aqueous media. The SAM degradation was monitored over time by differential pulse voltammetry (DPV) measurements, and the results were compared to established antioxidant assays. The obtained data showed an exposure-time- and concentration-dependent degradation of the SAM at the electrode surfaces. The tested substances differed in their capacity to prevent SAM degradation, and the calculated radical scavenging activities of the tested plant compounds differed between assays. The presented method offers a simple system for radical scavenging evaluation and, given the importance of the test system in antioxidant activity evaluation, might serve as a bridge between in-vivo and in-vitro antioxidant assays, in order to obtain more biologically relevant results in antioxidant research.

Keywords: alkanethiol SAM, plant antioxidant, polycrystalline Au, radical scavenger

Procedia PDF Downloads 295
13639 The Effect of Isokinetic Fatigue of Ankle, Knee, and Hip Muscles on the Dynamic Postural Stability Index

Authors: Masoumeh Shojaei, Natalie Gedayloo, Amir Sarshin

Abstract:

The purpose of the present study was to investigate the effect of isokinetic fatigue of the muscles around the ankle, knee, and hip on indicators of dynamic postural stability. Fifteen female university students (age 19.7 ± 0.6 years, weight 54.6 ± 9.4 kg, and height 163.9 ± 5.6 cm) participated in a within-subjects design over five different days. In the first session, the postural stability indices (time to stabilization after jump-landing) were assessed without fatigue using a force plate; in each subsequent session, one of the lower-limb muscle groups (the muscles around the ankles, knees, or hips) was randomly exhausted with a Biodex isokinetic dynamometer, and the indices were assessed immediately after fatiguing each muscle group. The method involved landing on a force plate from a dynamic state and transitioning balance into a static state. Results of repeated-measures ANOVA indicated no significant difference between time to stabilization (TTS) before and after isokinetic fatigue of the muscles around the ankle, knee, and hip in the medial-lateral direction (p > 0.05), but in the anterior-posterior (AP) direction the difference was statistically significant (p < 0.05). Least Significant Difference (LSD) post hoc test results also showed a significant difference between TTS before and after isokinetic fatigue of the knee and hip muscles in the AP direction. In other words, the knee and hip muscle groups were affected by isokinetic fatigue only in the AP direction (p < 0.05).
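Time to stabilization is typically extracted from the force-plate signal as the first instant after which the ground reaction force stays within a tolerance band around its baseline. A minimal sketch (the threshold, baseline, and sampling interval are illustrative assumptions, not the study's exact criteria):

```python
def time_to_stabilization(forces, baseline, tol, dt):
    """TTS: time just after the last sample whose ground-reaction force
    deviates from baseline by more than tol (illustrative criterion,
    not the study's exact one)."""
    last_unstable = -1
    for i, f in enumerate(forces):
        if abs(f - baseline) > tol:
            last_unstable = i
    return (last_unstable + 1) * dt  # 0.0 if the signal was stable throughout

# A decaying AP force trace sampled at 100 Hz stabilizes after the second sample:
print(time_to_stabilization([10.0, 5.0, 1.0, 0.5, 0.2], baseline=0.0, tol=1.0, dt=0.01))  # 0.02
```

Shorter TTS then indicates better dynamic postural stability in that direction.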

Keywords: dynamic balance, fatigue, lower limb muscles, postural control

Procedia PDF Downloads 231
13638 Overview of Multi-Chip Alternatives for 2.5D and 3D Integrated Circuit Packaging

Authors: Ching-Feng Chen, Ching-Chih Tsai

Abstract:

With transistor sizes gradually approaching the physical limit, the continuation of Moore's law is challenged by the cost and complexity of high numerical aperture (high-NA) lithography equipment and by issues such as short-channel effects. In the context of the ever-increasing technical requirements of portable devices and high-performance computing, relying on the continuation of the law to increase chip density will no longer support the prospects of the electronics industry. Weighing a chip's power consumption-performance-area-cost-cycle time to market (PPACC) is an updated benchmark driving the evolution of advanced nanometer (nm) wafer nodes. The advent of two-and-a-half- and three-dimensional (2.5D and 3D) Very-Large-Scale Integration (VLSI) packaging based on Through-Silicon Via (TSV) technology has updated traditional die assembly methods and provided a solution. This overview surveys up-to-date, cutting-edge packaging technologies for 2.5D and 3D integrated circuits (ICs) based on current transistor structures and technology nodes. The authors conclude that multi-chip solutions for 2.5D and 3D IC packaging are feasible ways to prolong Moore's law.

Keywords: moore’s law, high numerical aperture, power consumption-performance-area-cost-cycle time to market, 2.5 and 3D- very-large-scale integration, packaging, through silicon via

Procedia PDF Downloads 112
13637 Unmet English Needs of the Non-Engineering Staff: The Case of Algerian Hydrocarbon Industry

Authors: N. Khiati

Abstract:

The present paper reports findings that emerged from a larger doctoral study of the English language needs of a renowned Algerian hydrocarbon company. From a multifaceted English for specific purposes (ESP) research perspective, the paper considers the English needs of the finance/legal department staff amid conflicting needs perspectives, involving both objective needs indicators (i.e., the pressure of globalised business) and generally negative attitudes towards English among the administrative staff, mainly jurists, who favour a non-adaptation strategy. The researcher's unearthing of the latter's needs is an endeavour to concretise concepts such as unmet, or unconscious, needs. These initially hidden needs are detailed by examining educational background, namely the previous language of instruction; training experiences and expectations; and actual communicative practices, derived from retrospective interviews and preliminary quantitative questionnaire data. Based on these clues to real needs, the researcher tentatively proposes implications for pre-service and in-service training organisers as well as educational policy makers, in favour of a course in legal English for jurists, from pre-graduate phases through in-service training.

Keywords: English for specific purposes (ESP), legal and finance staff, needs analysis, unmet/unconscious needs, training implications

Procedia PDF Downloads 142
13636 Speaker Identification by Atomic Decomposition of Learned Features Using Computational Auditory Scene Analysis Principles in Noisy Environments

Authors: Thomas Bryan, Veton Kepuska, Ivica Kostanic

Abstract:

Speaker recognition is performed in high Additive White Gaussian Noise (AWGN) environments using principles of Computational Auditory Scene Analysis (CASA). CASA methods often classify sounds from images in the time-frequency (T-F) plane, using spectrograms or cochleagrams as the image. In this paper, atomic decomposition implemented by matching pursuit performs a transform from time-series speech signals to the T-F plane. The atomic decomposition creates a sparsely populated T-F vector in "weight space", where each populated T-F position contains an amplitude weight. The weight-space vector, along with the atomic dictionary, represents a denoised, compressed version of the original signal. The arrangement of the atomic indices in the T-F vector is used for classification. Unsupervised feature learning implemented by a sparse autoencoder learns a single dictionary of basis features from a collection of envelope samples from all speakers. The approach is demonstrated using pairs of speakers from the TIMIT data set. Pairs of speakers are selected randomly from a single district. Each speaker has 10 sentences: two are used for training and eight for testing. Atomic index probabilities are created for each training sentence and for each test sentence. Classification is performed by finding the lowest Euclidean distance between the probabilities from the training sentences and the test sentences. Training is done at a 30 dB Signal-to-Noise Ratio (SNR). Testing is performed at SNRs of 0 dB, 5 dB, 10 dB, and 30 dB. The algorithm has a baseline classification accuracy of ~93% averaged over 10 pairs of speakers from the TIMIT data set. The baseline accuracy is attributable to the short sequences of training and test data as well as the overall simplicity of the classification algorithm. The accuracy is not affected by AWGN and remains ~93% at 0 dB SNR.
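The matching-pursuit transform at the heart of the method can be sketched as a greedy loop over a dictionary of unit-norm atoms (a toy orthonormal dictionary here, not the Gabor atoms learned by the sparse autoencoder):

```python
def matching_pursuit(signal, dictionary, n_atoms):
    """Greedy sparse decomposition: repeatedly pick the unit-norm atom most
    correlated with the residual and subtract its projection. The toy
    dictionary used below is an assumption for illustration."""
    residual = list(signal)
    decomposition = []  # (atom index, weight) pairs
    for _ in range(n_atoms):
        best_idx, best_w = None, 0.0
        for idx, atom in enumerate(dictionary):
            w = sum(r * a for r, a in zip(residual, atom))  # inner product
            if abs(w) > abs(best_w):
                best_idx, best_w = idx, w
        if best_idx is None:
            break
        decomposition.append((best_idx, best_w))
        residual = [r - best_w * a for r, a in zip(residual, dictionary[best_idx])]
    return decomposition, residual

# With an orthonormal dictionary the residual vanishes after two picks:
decomp, res = matching_pursuit([3.0, 4.0], [[1.0, 0.0], [0.0, 1.0]], 2)
print(decomp, res)
```

The sequence of chosen atom indices is exactly the kind of arrangement the paper feeds into its index-probability classifier.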

Keywords: time-frequency plane, atomic decomposition, envelope sampling, Gabor atoms, matching pursuit, sparse dictionary learning, sparse autoencoder

Procedia PDF Downloads 287
13635 The Impact of Distributed Epistemologies on Software Engineering

Authors: Thomas Smith

Abstract:

Many hackers worldwide would agree that, had it not been for linear-time theory, the refinement of Byzantine fault tolerance might never have occurred. After years of significant research into extreme programming, we validate the refinement of simulated annealing. Maw, our new framework for unstable theory, is the solution to all of these issues.

Keywords: distributed, software engineering, DNS, DHCP

Procedia PDF Downloads 349
13634 Big Data Applications for Transportation Planning

Authors: Antonella Falanga, Armando Cartenì

Abstract:

"Big data" refers to extremely vast and complex sets of data, encompassing extraordinarily large and intricate datasets that require specific tools for meaningful analysis and processing. These datasets can stem from diverse origins like sensors, mobile devices, online transactions, social media platforms, and more. The utilization of big data is pivotal, offering the chance to leverage vast information for substantial advantages across diverse fields, thereby enhancing comprehension, decision-making, efficiency, and fostering innovation in various domains. Big data, distinguished by its remarkable attributes of enormous volume, high velocity, diverse variety, and significant value, represent a transformative force reshaping the industry worldwide. Their pervasive impact continues to unlock new possibilities, driving innovation and advancements in technology, decision-making processes, and societal progress in an increasingly data-centric world. The use of these technologies is becoming more widespread, facilitating and accelerating operations that were once much more complicated. In particular, big data impacts across multiple sectors such as business and commerce, healthcare and science, finance, education, geography, agriculture, media and entertainment and also mobility and logistics. Within the transportation sector, which is the focus of this study, big data applications encompass a wide variety, spanning across optimization in vehicle routing, real-time traffic management and monitoring, logistics efficiency, reduction of travel times and congestion, enhancement of the overall transportation systems, but also mitigation of pollutant emissions contributing to environmental sustainability. Meanwhile, in public administration and the development of smart cities, big data aids in improving public services, urban planning, and decision-making processes, leading to more efficient and sustainable urban environments. 
Access to vast data reservoirs enables deeper insights, revealing hidden patterns and facilitating more precise and timely decision-making. Additionally, advancements in cloud computing and artificial intelligence (AI) have further amplified the potential of big data, enabling more sophisticated and comprehensive analyses. Certainly, utilizing big data presents various advantages but also entails several challenges: data privacy and security, ensuring data quality, managing and storing large volumes of data effectively, integrating data from diverse sources, the need for specialized skills to interpret analysis results, ethical considerations in data use, and evaluating costs against benefits. Addressing these difficulties requires well-structured strategies and policies that balance the benefits of big data with privacy, security, and efficient data management concerns. Building upon these premises, the current research investigates the efficacy and influence of big data by conducting an overview of the primary and recent implementations of big data in transportation systems. Overall, this research allows us to conclude that big data can better support rational decision-making on mobility choices and are imperative for adeptly planning and allocating investments in transportation infrastructure and services.

Keywords: big data, public transport, sustainable mobility, transport demand, transportation planning

Procedia PDF Downloads 55
13633 Self-Tuning Dead-Beat PD Controller for Pitch Angle Control of a Bench-Top Helicopter

Authors: H. Mansor, S.B. Mohd-Noor, N. I. Othman, N. Tazali, R. I. Boby

Abstract:

This paper presents an improved robust Proportional-Derivative (PD) controller for a 3-Degree-of-Freedom (3-DOF) bench-top helicopter using an adaptive methodology. A bench-top helicopter is a laboratory-scale helicopter used for experimental purposes, widely used in teaching laboratories and research. A PD controller has been developed for the 3-DOF bench-top helicopter by Quanser. Experiments showed that the transient response of the designed PD controller has a very large steady-state error, i.e., 50%, which is very serious. The objective of this research is to improve the performance of the existing pitch angle PD control on the bench-top helicopter by integrating the PD controller with an adaptive controller. A standard adaptive controller will usually produce zero steady-state error; however, the response time to reach the desired set point is large. Therefore, this paper proposes an adaptive deadbeat algorithm to overcome these limitations. An output response that is fast, robust, and updated online is expected. Performance comparisons have been made between the proposed self-tuning deadbeat PD controller and the standard PD controller. The efficiency of the self-tuning deadbeat controller has been proven by the test results in terms of faster settling time, zero steady-state error, and the capability of the controller to be updated online.
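The steady-state-error limitation of a plain PD loop, which motivates the adaptive deadbeat extension, can be reproduced on a toy first-order plant (the plant model and gains below are illustrative assumptions, not the Quanser helicopter dynamics):

```python
def simulate_pd(kp, kd, setpoint, steps=1000, dt=0.02, a=1.0, b=2.0):
    """Discrete PD loop on a toy first-order plant y' = -a*y + b*u
    (an illustrative assumption, not the helicopter's pitch dynamics)."""
    y, prev_err = 0.0, setpoint
    for _ in range(steps):
        err = setpoint - y
        u = kp * err + kd * (err - prev_err) / dt  # PD law, no integral term
        prev_err = err
        y += dt * (-a * y + b * u)                 # Euler step of the plant
    return y

# Without integral action the loop settles at b*kp/(a + b*kp) = 0.8 of the set point:
print(round(simulate_pd(kp=2.0, kd=0.05, setpoint=1.0), 3))  # 0.8, i.e. 20% steady-state error
```

The persistent offset is exactly the kind of steady-state error the abstract reports for the plain PD controller; deadbeat or adaptive augmentation is one way to drive it to zero.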

Keywords: adaptive control, deadbeat control, bench-top helicopter, self-tuning control

Procedia PDF Downloads 318
13632 A Multi-Modal Virtual Walkthrough of the Virtual Past and Present Based on Panoramic View, Crowd Simulation and Acoustic Heritage on Mobile Platform

Authors: Lim Chen Kim, Tan Kian Lam, Chan Yi Chee

Abstract:

This research presents a multi-modal simulation for the reconstruction of the past and the representation of the present in digital cultural heritage on a mobile platform. To bring present-day life into the simulation, the virtual environment is generated through a presented scheme for the rapid and efficient construction of 360° panoramic views. Then, an acoustic heritage model and a crowd model are presented and incorporated into the 360° panoramic view. For the reconstruction of past life, a crowd is simulated and rendered in an old trading port. The keystone of this research, however, is a virtual walkthrough that shows virtual present life in 2D and virtual past life in 3D, both in virtual heritage sites in George Town, through a mobile device. Firstly, the 2D crowd is modelled and simulated using OpenGL ES 1.1 on the mobile platform. The 2D crowd is used to portray present life in a 360° panoramic view of a virtual heritage environment, based on an extension of Newtonian laws. Secondly, the 2D crowd is animated and rendered into 3D with improved variety and incorporated into virtual past life using the Unity3D game engine. The behaviours of the 3D models are then simulated based on an enhancement of the classical Boid algorithm. Finally, a demonstration system was developed and integrated with the models, techniques, and algorithms of this research. The virtual walkthrough was demonstrated to a group of respondents and evaluated through user-centred evaluation by navigating around the demonstration system. The results of the questionnaire-based evaluation show that the presented virtual walkthrough was successfully deployed through multi-modal simulation, and that such a virtual walkthrough would be particularly useful in virtual tour and virtual museum applications.
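The classical Boid model that the 3D behaviours extend combines three steering rules: separation, alignment, and cohesion. A minimal 2-D sketch with all-to-all neighbourhoods (the weights are illustrative, and this is not the paper's enhanced variant):

```python
def boid_step(pos, vel, sep_r=1.0, w_sep=0.05, w_ali=0.05, w_coh=0.01, dt=1.0):
    """One synchronous update of Reynolds' three rules in 2-D, with an
    all-to-all neighbourhood (illustrative weights; assumes >= 2 boids)."""
    n = len(pos)
    new_pos, new_vel = [], []
    for i in range(n):
        # Cohesion: steer toward the centroid of the other boids.
        cx = sum(p[0] for j, p in enumerate(pos) if j != i) / (n - 1)
        cy = sum(p[1] for j, p in enumerate(pos) if j != i) / (n - 1)
        # Alignment: steer toward the others' average velocity.
        ax = sum(v[0] for j, v in enumerate(vel) if j != i) / (n - 1)
        ay = sum(v[1] for j, v in enumerate(vel) if j != i) / (n - 1)
        # Separation: push away from boids closer than sep_r.
        sx = sy = 0.0
        for j in range(n):
            if j == i:
                continue
            dx, dy = pos[i][0] - pos[j][0], pos[i][1] - pos[j][1]
            if (dx * dx + dy * dy) ** 0.5 < sep_r:
                sx, sy = sx + dx, sy + dy
        vx = vel[i][0] + w_coh * (cx - pos[i][0]) + w_ali * (ax - vel[i][0]) + w_sep * sx
        vy = vel[i][1] + w_coh * (cy - pos[i][1]) + w_ali * (ay - vel[i][1]) + w_sep * sy
        new_vel.append((vx, vy))
        new_pos.append((pos[i][0] + vx * dt, pos[i][1] + vy * dt))
    return new_pos, new_vel

# Two distant, stationary boids drift toward each other under cohesion:
p, v = boid_step([(0.0, 0.0), (10.0, 0.0)], [(0.0, 0.0), (0.0, 0.0)])
```

A production crowd simulator would add spatial partitioning and limited perception radii, but the three steering terms are the core.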

Keywords: Boid Algorithm, Crowd Simulation, Mobile Platform, Newtonian Laws, Virtual Heritage

Procedia PDF Downloads 273
13631 Evaluation of the Cities Specific Characteristics in the Formation of the Safavid Period Mints

Authors: Mahmood Seyyed, Akram Salehi Heykoei, Hamidreza Safakish Kashani

Abstract:

Among the resources remaining from the past, coins are considered authentic documents and are among the most important documentary sources. Coins were minted in a place called a mint. The number and location of the mints in each period reflect the degree of economic power, political security, and commercial growth, and their standing always fluctuated with changing political and economic conditions. Considering that trade grew more during the Safavid period than in previous ones, the mints also took on greater importance. It seems that, on the one hand, the economic growth of the Safavid period is directly linked to the number and locations of the mints at that time and, on the other hand, mints were established in certain places because of the specific characteristics of those cities and regions. The increase in the number of mints in the north of the country due to the growth of the silk trade, and in the west and northwest due to political and commercial relations with the Ottoman Empire, as well as characteristics such as the existence of mines and location on the Silk Road and communication routes, are all findings of this investigation. Accordingly, in this article the researcher tries to examine the characteristics that gave a city priority for having a mint. Considering that in the various historical periods the mints were based in the most politically and socially important cities, this article examines the cities' specific characteristics in the formation of the mints in the Safavid period.

Keywords: documentary sources, coins, mint, city, Safavid

Procedia PDF Downloads 260
13630 A Feasibility Study of Producing Biofuels from Textile Sludge by Torrefaction Technology

Authors: Hua-Shan Tai, Yu-Ting Zeng

Abstract:

In modern industrial society, enormous amounts of sludge from various industries are constantly produced; currently, most of this sludge is treated by landfill and incineration. However, neither treatment is ideal, because of the limited land available for landfill and the secondary pollution caused by incineration. Consequently, treating industrial sludge appropriately has become an urgent environmental protection issue. To address the problem of this massive sludge stream, this study uses textile sludge, the major source of waste sludge in Taiwan, as the raw material for torrefaction treatments. To investigate the feasibility of producing biofuels from textile sludge by torrefaction, experiments were conducted at temperatures of 150, 200, 250, 300, and 350°C, with heating rates of 15, 20, 25, and 30°C/min, and with residence times of 30 and 60 minutes. The results revealed that the mass yields after torrefaction were approximately in the range of 54.9 to 93.4%. The energy densification ratios were approximately in the range of 0.84 to 1.10, and the energy yields were approximately in the range of 45.9 to 98.3%. The volumetric densities were approximately in the range of 0.78 to 1.14, and the volumetric energy densities were approximately in the range of 0.65 to 1.18. In summary, the optimum energy yield (98.3%) was reached with a terminal temperature of 150°C, a heating rate of 20°C/min, and a residence time of 30 minutes; the corresponding mass yield, energy densification ratio, and volumetric energy density were 92.2%, 1.07, and 1.15, respectively. These results indicate that the solid products after torrefaction are easy to preserve, which not only enhances product quality but also serves the goal of developing the material into fuel.
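The reported quantities are linked by a simple identity: energy yield equals mass yield times the energy densification ratio (the ratio of product to raw heating value). Using the abstract's rounded optimum figures reproduces the reported energy yield to within rounding:

```python
def energy_yield(mass_yield, energy_densification_ratio):
    """Energy yield = mass yield x (HHV_product / HHV_raw)."""
    return mass_yield * energy_densification_ratio

# The abstract's optimum run: mass yield 92.2%, densification ratio 1.07.
print(round(energy_yield(0.922, 1.07), 3))  # 0.987, close to the reported ~98.3%
```

The small gap from 98.3% reflects rounding in the published figures, not a different formula.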

Keywords: biofuel, biomass energy, textile sludge, torrefaction

Procedia PDF Downloads 317
13629 Production and Characterization of Ce3+: Si2N2O Phosphors for White Light-Emitting Diodes

Authors: Alparslan A. Balta, Hilmi Yurdakul, Orkun Tunckan, Servet Turan, Arife Yurdakul

Abstract:

Si2N2O (sinoite) is an inorganic oxynitride material and a promising phosphor candidate for white light-emitting diodes (WLEDs). However, only limited knowledge is currently available on the synthesis of Si2N2O for this purpose. Here, to the best of the authors' knowledge, we report for the first time the production of Si2N2O-based phosphors from CeO2, SiO2, and Si3N4 as the main starting powders, with a Li2O sintering additive, through the spark plasma sintering (SPS) route. The processing parameters, e.g., pressure, temperature, and sintering time, were optimized to obtain samples containing monophase Si2N2O. The lattice parameter, crystallite size, and amounts of the phases formed were characterized in detail by X-ray diffraction (XRD). Grain morphology, particle size, and distribution were analyzed by scanning and transmission electron microscopy (SEM and TEM). Cathodoluminescence (CL) in the SEM and photoluminescence (PL) analyses were conducted on the samples to determine the excitation and emission characteristics of Ce3+-activated Si2N2O. Results showed that a maximum Si2N2O phase ratio of 90% was obtained by sintering for 15 minutes at 1650°C under 30 MPa pressure. Based on the SEM-CL and PL measurements, the Ce3+: Si2N2O phosphor shows a broad emission band between 400 and 700 nm that corresponds to white light. The present research was supported by TUBITAK under project number 217M667.

Keywords: cerium, oxynitride, phosphors, sinoite, Si₂N₂O

Procedia PDF Downloads 103
13628 Optimization of the Fabrication Process for Particleboards Made from Oil Palm Fronds Blended with Empty Fruit Bunch Using Response Surface Methodology

Authors: Ghazi Faisal Najmuldeen, Wahida Amat-Fadzil, Zulkafli Hassan, Jinan B. Al-Dabbagh

Abstract:

The objective of this study was to evaluate the optimum fabrication process variables for producing particleboards from oil palm frond (OPF) particles and empty fruit bunch fiber (EFB). Response surface methodology was employed to analyse the effects of hot press temperature (150–190°C), press time (3–7 minutes), and EFB blending ratio (0–40%) on the particleboards' modulus of rupture, modulus of elasticity, internal bonding, water absorption, and thickness swelling. A Box-Behnken experimental design was carried out to develop the statistical models used for the optimisation of the fabrication process variables. All factors were found to have statistically significant effects on the particleboard properties, and the statistical analysis indicated that all models showed a significant fit with the experimental results. The optimum particleboard properties were obtained at the optimal fabrication conditions: press temperature of 186°C, press time of 5.7 min, and EFB/OPF ratio of 30.4%. Incorporating oil palm fronds and empty fruit bunches improved the particleboard properties, and the OPF-EFB particleboards fabricated under the optimized conditions satisfied the ANSI A208.1-1999 specification for general purpose particleboards.

Keywords: empty fruit bunch fiber, oil palm fronds, particleboards, response surface methodology

Procedia PDF Downloads 220
13627 Reinforcement of Local Law into Government Policy to Address Conflict of Utilization of Sea among Small Fishermen

Authors: Ema Septaria, Muhammad Yamani, N. S. B. Ambarini

Abstract:

The problem began with the imposition of fines by Ipuh small fishermen on fishing vessels encroaching on the customary catchment area of Ipuh, a village in Muko-Muko, Bengkulu, Indonesia. There are two main reasons for this: fishermen from outside Ipuh came and fished in Ipuh waters using trawls as their gear, and the number of fish has decreased over time as a result of irresponsible fishing practices. This conflict has persisted for a long time. Indonesia's governing laws do not regulate the utilization of sea territory by small fishermen, so when such a conflict appears there is a legal vacuum (rechtsvacuum) on how to resolve it, and this leads to chaos in society. In Ipuh itself, there has long been a local law on fisheries to which the community still adheres, because they believe that holding to this law will keep the fish stock sustainable. This is empirical legal research with a socio-legal approach. The results of this study show that even though the laws do not regulate the utilization of sea territory by small fishermen in detail, there is an article in the Fisheries Act stating that fisheries activities have to take account of local law and community participation. Furthermore, the Constitution provides that the land, the waters, and the natural resources within shall be under the powers of the State and shall be used to the greatest benefit of the people. With this power, the Government has to make a policy that reinforces what has been laid down in Ipuh local law. In addition, the Governor of Bengkulu has to involve the Ipuh community directly in managing their fisheries to ensure fisheries sustainability therein.

Keywords: local law, reinforcement, conflict, sea utilization, small fishermen

Procedia PDF Downloads 307
13626 Mixed Effects Models for Short-Term Load Forecasting for the Spanish Regions: Castilla-Leon, Castilla-La Mancha and Andalucia

Authors: C. Senabre, S. Valero, M. Lopez, E. Velasco, M. Sanchez

Abstract:

This paper focuses on an application of linear mixed models to short-term load forecasting. The challenge of this research is to improve a currently working model at the Spanish Transport System Operator, programmed by us and based on linear autoregressive techniques and neural networks. The forecasting system currently forecasts each of the regions within the Spanish grid separately, even though the behavior of the load in each region is affected by the same factors in a similar way. The load forecasting system has been verified in this work using real data from a utility. In this research, several regions are integrated into a linear mixed model as a starting point, so that information from other regions can be exploited: firstly, the system learns the general behavior present in all regions, and secondly, it identifies the individual deviations in each region. The technique can be especially useful when modeling the effect of special days with scarce information from the past. The three most relevant regions of the system have been used to test the model, focusing on special days and improving on both currently working models used as benchmarks. A range of comparisons with different forecasting models has been conducted, and the forecasting results demonstrate the superiority of the proposed methodology.
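
The two-step idea described above (a fixed effect shared by all regions plus per-region deviations) can be illustrated numerically. Below is a minimal sketch on synthetic data, assuming a random-intercept structure; a full treatment would use a dedicated mixed-model routine such as statsmodels' MixedLM. All region names and coefficients are invented:

```python
import numpy as np

# Hypothetical loads for three regions sharing a common temperature
# sensitivity (fixed effect) but with region-specific base loads.
rng = np.random.default_rng(0)
temp = rng.uniform(5, 30, size=(3, 200))     # temperature samples
base = np.array([100.0, 120.0, 90.0])        # per-region intercepts
slope = -1.5                                 # shared fixed effect
load = base[:, None] + slope * temp + rng.normal(0, 2, size=(3, 200))

# Step 1: estimate the shared slope by pooling region-centred data,
# which removes the region-specific intercepts.
tc = temp - temp.mean(axis=1, keepdims=True)
lc = load - load.mean(axis=1, keepdims=True)
slope_hat = (tc * lc).sum() / (tc ** 2).sum()

# Step 2: recover the per-region intercept deviations.
intercepts = load.mean(axis=1) - slope_hat * temp.mean(axis=1)
print(round(float(slope_hat), 2), np.round(intercepts, 1))
```

Pooling the centred data is what lets a region with scarce history (e.g. few special-day observations) borrow the common slope estimated from all regions.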

Keywords: short-term load forecasting, mixed effects models, neural networks

Procedia PDF Downloads 186
13625 National Plans for Recovery and Resilience between National Recovery and EU Cohesion Objectives: Insights from European Countries

Authors: Arbolino Roberta, Boffardi Raffaele

Abstract:

Achieving the highest effectiveness for the National Plans for Recovery and Resilience (NPRR) while strengthening the objectives of cohesion and the reduction of intra-EU imbalances is only possible by means of strategic, coordinated, and coherent policy planning. Therefore, the present research aims at assessing and quantifying the potential impact of NPRRs across the twenty-seven European Member States in terms of economic convergence, considering disaggregated data on the industrial, construction, and service sectors. The first step of the research involves a performance analysis of the main macroeconomic indicators describing the trends of the twenty-seven EU economies before the pandemic outbreak. Subsequently, in order to define the potential effect of the resources allocated, we perform an impact analysis of previous similar EU investment policies, estimating national-level sectoral elasticities associated with the expenditure of the 2007-2013 and 2014-2020 Cohesion programme funds. These coefficients are then exploited to construct adjustment scenarios. Finally, a convergence analysis is performed on the data used for constructing the scenarios in order to understand whether the expenditure of funds might be useful to foster economic convergence besides driving recovery. The results of our analysis show that the allocation of resources largely mirrors the aims of the policy framework underlying the NPRR, thus reporting the largest investments both in the sectors most affected by the economic shock (services) and in those considered fundamental for the digital and green transition. Notwithstanding an overall positive effect, large differences exist among European countries, while no convergence process seems to be activated or fostered by these interventions.
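
The scenario-construction step described above (applying elasticities estimated from past Cohesion spending to new allocations) can be sketched as follows; all sector names, elasticity values, and monetary figures are hypothetical placeholders, not the study's estimates:

```python
# Illustrative adjustment-scenario step: apply a sectoral elasticity
# (output response per 1% of gross value added spent) to new NPRR
# allocations. All numbers are invented for illustration.
elasticity = {"industry": 0.08, "construction": 0.12, "services": 0.05}
baseline_gva = {"industry": 500.0, "construction": 120.0, "services": 900.0}
funds_pct_of_gva = {"industry": 2.0, "construction": 3.5, "services": 4.0}

# Projected sectoral output under the spending scenario.
scenario = {
    s: baseline_gva[s] * (1 + elasticity[s] * funds_pct_of_gva[s] / 100)
    for s in baseline_gva
}
print({s: round(v, 1) for s, v in scenario.items()})
```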

Keywords: NPRR, policy evaluation, cohesion policy, scenario analysis

Procedia PDF Downloads 78
13624 Chatbots vs. Websites: A Comparative Analysis Measuring User Experience and Emotions in Mobile Commerce

Authors: Stephan Boehm, Julia Engel, Judith Eisser

Abstract:

During the last decade, communication on the Internet transformed from a broadcast to a conversational model by supporting more interactive features, enabling user-generated content, and introducing social media networks. Another important trend with a significant impact on electronic commerce is a massive usage shift from desktop to mobile devices. However, a presentation of product- or service-related information accumulated on websites, micro pages, or portals often remains the pivot and focal point of a customer journey. A more recent change in user behavior, especially in younger user groups and in Asia, is going along with the increasing adoption of messaging applications supporting almost real-time but asynchronous communication on mobile devices. Mobile apps of this type can not only provide an alternative to traditional one-to-one communication on mobile devices, such as voice calls or the short messaging service; they can also be used in mobile commerce as a new marketing and sales channel, e.g., for product promotions and direct marketing activities. This requires a new way of customer interaction compared to traditional mobile commerce activities and functionalities provided via mobile websites. One option better aligned with the customer interaction in messaging apps is the so-called chatbot. Chatbots are conversational programs or dialog systems simulating a text- or voice-based human interaction. They can be introduced in mobile messaging and social media apps using rule-based or artificial intelligence-based implementations. In this context, a comparative analysis is conducted to examine the impact of using traditional websites or chatbots for promoting a product in an impulse purchase situation. The aim of this study is to measure the impact on the customers' user experience and emotions. The study is based on a random sample of about 60 smartphone users aged 20 to 30. Participants are randomly assigned to two groups and participate in either a traditional website or an innovative chatbot-based mobile commerce scenario. The chatbot-based scenario is implemented using a Wizard-of-Oz experimental approach for reasons of simplicity and to allow for more flexibility when simulating simple rule-based and more advanced artificial intelligence-based chatbot setups. A specific set of metrics is defined to measure and compare the user experience in both scenarios. It can be assumed that users get more emotionally involved when interacting with a system simulating human communication behavior instead of browsing a mobile commerce website. For this reason, innovative face-tracking and analysis technology is used to derive feedback on the emotional status of the study participants while interacting with the website or the chatbot. This study is a work in progress. The results will provide first insights into the effects of chatbot usage on user experience and emotions in mobile commerce environments. Based on the study findings, basic requirements for a user-centered design and implementation of chatbot solutions for mobile commerce can be derived. Moreover, first indications of situations where chatbots might be favorable in comparison to traditional website-based mobile commerce can be identified.

Keywords: chatbots, emotions, mobile commerce, user experience, Wizard-of-Oz prototyping

Procedia PDF Downloads 454
13623 Determinants of Aggregate Electricity Consumption in Ghana: A Multivariate Time Series Analysis

Authors: Renata Konadu

Abstract:

In Ghana, electricity has become the main form of energy on which all sectors of the economy rely for their business. Therefore, as the economy grows, the demand for and consumption of electricity also grow due to this heavy dependence. However, since the supply of electricity has not increased to match the demand, there have been frequent power outages and load shedding affecting business performance. To solve this problem and advance policies to secure electricity in Ghana, it is imperative that the factors that cause consumption to increase be analysed, considering the three classes of consumers: residential, industrial, and non-residential. The main argument, however, is that exports of electricity to neighbouring countries should be included in the electricity consumption model and considered one of the significant factors which can decrease or increase consumption. The author made use of multivariate time series data from 1980-2010 and econometric models such as Ordinary Least Squares (OLS) and the Vector Error Correction Model (VECM). Findings show that GDP growth, urban population growth, electricity exports, and industry value added to GDP were cointegrated. The results also showed that, in the long run, there is unidirectional causality running from electricity exports, GDP growth, and industry value added to GDP towards electricity consumption. In the short run, however, causality was found among all the variables and electricity consumption. The results have useful implications for energy policy makers, especially with regard to electricity consumption, demand, and supply.
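
The error-correction logic underlying the VECM mentioned above can be illustrated with a two-step Engle-Granger sketch on synthetic data. The series and coefficients below are invented for illustration; the study's actual variables and lag structure differ:

```python
import numpy as np

# Hypothetical cointegrated pair: consumption tracks a trending driver
# (think GDP) in the long run, with stationary deviations.
rng = np.random.default_rng(1)
n = 300
gdp = np.cumsum(rng.normal(0.5, 1.0, n))      # integrated (trending) series
cons = 2.0 * gdp + rng.normal(0, 1.0, n)      # long-run relation + noise

# Step 1: estimate the long-run (cointegrating) relation by OLS.
beta = np.polyfit(gdp, cons, 1)[0]
ecm_term = cons - beta * gdp                  # equilibrium error

# Step 2: regress short-run changes on the lagged equilibrium error.
# A negative coefficient means deviations are "corrected" over time.
d_cons = np.diff(cons)
z_lag = ecm_term[:-1]
alpha = np.polyfit(z_lag, d_cons, 1)[0]
print(round(float(beta), 2), round(float(alpha), 2))
```

The negative adjustment coefficient (alpha) is the hallmark of error correction: when consumption drifts above its long-run relation with the driver, subsequent changes pull it back.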

Keywords: electricity consumption, energy policy, GDP growth, vector error correction model

Procedia PDF Downloads 430
13622 Tempo-Spatial Pattern of Progress and Disparity in Child Health in Uttar Pradesh, India

Authors: Gudakesh Yadav

Abstract:

Uttar Pradesh is one of the poorest-performing states of India in terms of child health. Using data from three rounds of the NFHS and two rounds of the DLHS, this paper attempts to examine tempo-spatial changes in child health and care practices in Uttar Pradesh and its regions. Rate-ratio, CI, multivariate, and decomposition analyses have been used for the study. Findings demonstrate that child health care practices have improved over time in all regions of the state; however, the western and southern regions registered the lowest progress in child immunization. Moreover, there was no decline in the prevalence of diarrhea and ARI over the period, and it remains critically high in the western and southern regions. These regions also performed poorly in providing ORS and in treating diarrhea and ARI. Public health services are the least preferred for diarrhea and ARI treatment. Results from the decomposition analysis reveal that rural residence, mothers' illiteracy, and wealth contributed most to the low utilization of child health care practices consistently over the period. The study calls for targeted interventions for vulnerable children to accelerate the utilization of child health care services. Poorly performing regions should be targeted and routinely monitored on poor child health indicators.

Keywords: Acute Respiratory Infection (ARI), decomposition, diarrhea, inequality, immunization

Procedia PDF Downloads 296
13621 Optimization of Two Quality Characteristics in Injection Molding Processes via Taguchi Methodology

Authors: Joseph C. Chen, Venkata Karthik Jakka

Abstract:

The main objective of this research is to optimize tensile strength and dimensional accuracy in injection molding processes using Taguchi parameter design. An L16 orthogonal array (OA) is used in the Taguchi experimental design, with five control factors at four levels each and with vibration as a non-controllable (noise) factor. A total of 32 experiments were designed to obtain the optimal parameter settings for the process. The optimal parameters identified for shrinkage are a shot volume of 1.7 cubic inches (A4); a mold temperature of 130 ºF (B1); a hold pressure of 3200 psi (C4); an injection speed of 0.61 inch³/sec (D2); and a hold time of 14 seconds (E2). The optimal parameters identified for tensile strength are a shot volume of 1.7 cubic inches (A4); a mold temperature of 160 ºF (B4); a hold pressure of 3100 psi (C3); an injection speed of 0.69 inch³/sec (D4); and a hold time of 14 seconds (E2). The Taguchi-based optimization framework was systematically and successfully implemented to obtain an adjusted optimal setting in this research. The mean shrinkage of the confirmation runs is 0.0031%, and the tensile strength was found to be 3148.1 psi. Both outcomes are far better than the baseline, and defects have been further reduced in the injection molding processes.
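
Taguchi parameter design ranks factor levels by signal-to-noise (S/N) ratios: larger-the-better for a response like tensile strength, smaller-the-better for one like shrinkage. A minimal sketch of the two standard criteria (the replicate readings are illustrative, not the study's data):

```python
import numpy as np

# Standard Taguchi S/N criteria; the data values below are invented
# replicate readings, not measurements from the study.
def sn_larger_the_better(y):
    """Used for responses to maximize, e.g. tensile strength."""
    y = np.asarray(y, dtype=float)
    return -10 * np.log10(np.mean(1.0 / y**2))

def sn_smaller_the_better(y):
    """Used for responses to minimize, e.g. shrinkage."""
    y = np.asarray(y, dtype=float)
    return -10 * np.log10(np.mean(y**2))

# Two replicate tensile readings at two hypothetical factor levels:
level_1 = [3000, 3050]
level_2 = [3140, 3155]
# The level with the higher S/N ratio is preferred.
print(sn_larger_the_better(level_1) < sn_larger_the_better(level_2))  # True
```

In a full L16 study, these ratios are averaged per factor level, and the level with the highest mean S/N is selected for each factor.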

Keywords: injection molding processes, Taguchi parameter design, tensile strength, high-density polyethylene (HDPE)

Procedia PDF Downloads 192
13620 Scheduling of Cross-Docking Center: An Auction-Based Algorithm

Authors: Eldho Paul, Brijesh Paul

Abstract:

This work proposes an auction-mechanism-based solution methodology for the optimum scheduling of trucks in a cross-docking centre. The cross-docking centre is an important element of a lean supply chain: it reduces storage and transportation costs in the distribution system compared to an ordinary warehouse, and better scheduling of trucks in a cross-docking centre is the best way to reduce these costs further. Auction mechanisms are commonly used for the allocation of limited resources in various real-life applications. Here, we schedule inbound trucks by integrating an auction mechanism with the functioning of a cross-docking centre. A mathematical model is developed for the optimal scheduling of inbound trucks based on the auction methodology. The determination of an exact solution for problems involving a large number of trucks was found to be computationally difficult, and hence a genetic algorithm-based heuristic methodology is proposed in this work. A comparative study of the exact and heuristic solutions is done using five classes of data sets. It is observed from the study that the auction-based mechanism is capable of providing good solutions to the scheduling problem in cross-docking centres.
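
The genetic-algorithm heuristic can be illustrated on a stripped-down version of the problem: sequencing inbound trucks at a single dock door to minimize total completion time. Everything below (unload times, operators, GA settings) is a hypothetical sketch, not the paper's model, which also embeds the auction step:

```python
import random

# Toy GA for sequencing inbound trucks at one dock door;
# fitness = total completion time across trucks (to minimize).
random.seed(42)
proc = [4, 2, 7, 3, 5, 1]                 # hypothetical unload times
n = len(proc)

def total_completion(order):
    t, total = 0, 0
    for truck in order:
        t += proc[truck]
        total += t
    return total

def crossover(a, b):                      # order crossover (OX)
    i, j = sorted(random.sample(range(n), 2))
    child = [-1] * n
    child[i:j] = a[i:j]
    fill = [g for g in b if g not in child]
    for k in range(n):
        if child[k] == -1:
            child[k] = fill.pop(0)
    return child

pop = [random.sample(range(n), n) for _ in range(30)]
for _ in range(100):
    pop.sort(key=total_completion)        # elitist selection
    parents = pop[:10]
    children = [crossover(random.choice(parents), random.choice(parents))
                for _ in range(20)]
    for c in children:                    # swap mutation
        if random.random() < 0.2:
            i, j = random.sample(range(n), 2)
            c[i], c[j] = c[j], c[i]
    pop = parents + children

best = min(pop, key=total_completion)
# For this objective, shortest-processing-time-first order is optimal
# (total 57 here), so the GA result should land at or near that value.
print(total_completion(best))
```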

Keywords: auction mechanism, cross-docking centre, genetic algorithm, scheduling of trucks

Procedia PDF Downloads 409
13619 A 1H NMR-Linked PCR Modelling Strategy for Tracking the Fatty Acid Sources of Aldehydic Lipid Oxidation Products in Culinary Oils Exposed to Simulated Shallow-Frying Episodes

Authors: Martin Grootveld, Benita Percival, Sarah Moumtaz, Kerry L. Grootveld

Abstract:

Objectives/Hypotheses: The adverse health effect potential of dietary lipid oxidation products (LOPs) has evoked much clinical interest. Therefore, we employed a 1H NMR-linked principal component regression (PCR) chemometrics modelling strategy to explore relationships between data matrices comprising (1) aldehydic LOP concentrations generated in culinary oils/fats when exposed to laboratory-simulated shallow-frying practices, and (2) the prior saturated (SFA), monounsaturated (MUFA), and polyunsaturated (PUFA) fatty acid contents of such frying media (FM), together with their heating time-points at a standard frying temperature (180 °C). Methods: Corn, sunflower, extra virgin olive, rapeseed, linseed, canola, coconut, and MUFA-rich algae frying oils, together with butter and lard, were heated according to laboratory-simulated shallow-frying episodes at 180 °C, and FM samples were collected at time-points of 0, 5, 10, 20, 30, 60, and 90 min (n = 6 replicates per sample). Aldehydes were determined by 1H NMR analysis (Bruker AV 400 MHz spectrometer). The first (dependent output variable) PCR data matrix comprised aldehyde concentration scores vectors (PC1* and PC2*), whilst the second (predictor) one incorporated those from the fatty acid content/heating time variables (PC1-PC4) and their first-order interactions. Results: Structurally complex trans,trans- and cis,trans-alka-2,4-dienals, 4,5-epoxy-trans-2-alkenals, and 4-hydroxy-/4-hydroperoxy-trans-2-alkenals (group I aldehydes, predominantly arising from PUFA peroxidation) loaded strongly and positively on PC1*, whereas n-alkanals and trans-2-alkenals (group II aldehydes, derived from both MUFA and PUFA hydroperoxides) loaded strongly and positively on PC2*. PCR analysis of these scores vectors (SVs) demonstrated that PCs 1 (positively-loaded linoleoylglycerols and [linoleoylglycerol]:[SFA] content ratio), 2 (positively-loaded oleoylglycerols and negatively-loaded SFAs), 3 (positively-loaded linolenoylglycerols and [PUFA]:[SFA] content ratios), and 4 (exclusively orthogonal sampling time-points) all contributed powerfully to the aldehydic PC1* SVs (p values from 10⁻³ to < 10⁻⁹), as did all PC1-3 x PC4 interactions (p values from 10⁻⁵ to < 10⁻⁹). PC2* was also markedly dependent on all the above PC SVs (PC2 > PC1 and PC3), and on the interactions of PC1 and PC2 with PC4 (p < 10⁻⁹ in each case), but not on the PC3 x PC4 contribution. Conclusions: NMR-linked PCR analysis is a valuable strategy for (1) modelling the generation of aldehydic LOPs in heated cooking oils and other FM, and (2) tracking their unsaturated fatty acid (UFA) triacylglycerol sources therein.
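
Principal component regression as used above can be illustrated in miniature: compress a predictor matrix into its leading PCs, then regress the response on the PC scores. The data below are synthetic stand-ins (two latent "composition" factors driving six predictors), not the study's NMR matrices:

```python
import numpy as np

# Minimal principal component regression (PCR) sketch on synthetic
# data: two latent factors generate six correlated predictors, and
# the response depends only on the latent factors.
rng = np.random.default_rng(7)
latent = rng.normal(size=(60, 2))                 # 2 latent factors
loadings = rng.normal(size=(2, 6))
X = latent @ loadings + rng.normal(0, 0.05, (60, 6))
y = latent @ np.array([2.0, -1.0]) + rng.normal(0, 0.1, 60)

Xc = X - X.mean(axis=0)                           # centre predictors
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
k = 2                                             # keep leading 2 PCs
scores = Xc @ Vt[:k].T                            # PC scores (n x k)

# Ordinary least squares of the centred response on the PC scores.
coef, *_ = np.linalg.lstsq(scores, y - y.mean(), rcond=None)
y_hat = scores @ coef + y.mean()
r2 = 1 - np.sum((y - y_hat) ** 2) / np.sum((y - y.mean()) ** 2)
print(round(float(r2), 3))
```

Because the response lives in the latent subspace captured by the leading PCs, two components recover nearly all of the explainable variance; this mirrors how the study regresses aldehyde scores vectors on predictor PCs rather than on the raw collinear composition variables.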

Keywords: frying oils, lipid oxidation products, frying episodes, chemometrics, principal component regression, NMR analysis, cytotoxic/genotoxic aldehydes

Procedia PDF Downloads 166
13618 Segmentation of Liver Using Random Forest Classifier

Authors: Gajendra Kumar Mourya, Dinesh Bhatia, Akash Handique, Sunita Warjri, Syed Achaab Amir

Abstract:

Nowadays, medical imaging has become an integral part of modern healthcare. Abdominal CT images are an invaluable means for abdominal organ investigation and have been widely studied in recent years. Diagnosis of liver pathologies is one of the major areas of current interest in the field of medical image processing and is still an open problem. To study and diagnose the liver in depth, segmentation is performed to identify which part of the liver is most affected. Manual segmentation of the liver in CT images is time-consuming and suffers from inter- and intra-observer differences. However, automatic or semi-automatic computer-aided segmentation of the liver is a challenging task due to inter-patient variability in liver shape and size. In this paper, we present a technique for automatically segmenting the liver from CT images using a random forest classifier. Random forests, or random decision forests, are an ensemble learning method for classification that operates by constructing a multitude of decision trees at training time and outputting the class that is the mode of the classes of the individual trees. After comparison with various other techniques, it was found that the random forest classifier provides better segmentation results with respect to accuracy and speed. We validated our results using various techniques, and the method shows above 89% accuracy in all cases.
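
The "mode of the individual trees" idea can be shown with a toy voxel-labelling example: each "tree" below is a depth-1 threshold stump trained on a bootstrap sample of (intensity, label) pairs, and the forest majority-votes. The intensity ranges are invented; a real pipeline would train full trees on many image features:

```python
import random

# Toy illustration of the random-forest voting idea for voxel
# labelling. Intensities are hypothetical HU-like values:
# "liver" ~ 55-70, "background" ~ 0-30.
random.seed(3)
data = [(random.uniform(55, 70), 1) for _ in range(100)] + \
       [(random.uniform(0, 30), 0) for _ in range(100)]

def train_stump(sample):
    # Depth-1 "tree": threshold midway between the class means
    # of this bootstrap sample.
    liver = [x for x, lbl in sample if lbl == 1]
    back = [x for x, lbl in sample if lbl == 0]
    return (sum(liver) / len(liver) + sum(back) / len(back)) / 2

def forest_predict(stumps, x):
    # Mode (majority vote) of the individual stump predictions.
    votes = sum(1 for t in stumps if x > t)
    return 1 if votes > len(stumps) / 2 else 0

# Each stump sees a different bootstrap resample of the data.
stumps = [train_stump(random.choices(data, k=len(data))) for _ in range(25)]
print(forest_predict(stumps, 62), forest_predict(stumps, 15))  # 1 0
```

Bootstrap resampling decorrelates the stumps, so the majority vote is more stable than any single threshold; the same principle, with deep trees and richer features, drives the segmentation accuracy reported above.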

Keywords: CT images, image validation, random forest, segmentation

Procedia PDF Downloads 308
13617 Estimation and Comparison of Delay at Signalized Intersections Based on Existing Methods

Authors: Arpita Saha, Satish Chandra, Indrajit Ghosh

Abstract:

Delay represents the time loss experienced by a traveler while crossing an intersection, and the efficiency of traffic operation at signalized intersections is assessed in terms of the delay caused to an individual vehicle. The Highway Capacity Manual (HCM) method and Webster's method are the most widely used in India for delay estimation. However, traffic in India is highly heterogeneous in nature, with extremely poor lane discipline. Therefore, to identify the best delay estimation technique for Indian conditions, a comparison was made. In this study, seven signalized intersections from three different cities were chosen, and data were collected during both morning and evening peak hours. Only undersaturated cycles were considered. Delay was estimated from the field data: with the help of Simpson's 1/3 rule, the delay of undersaturated cycles was estimated by measuring the area under the queue length versus cycle time curve. Moreover, the field-observed delay was compared with the delay estimated using the HCM, Webster, probabilistic, Taylor's expansion, and regression methods. The drawbacks of the existing delay estimation methods for use under Indian heterogeneous traffic conditions were identified, and the best method was proposed. It was observed that direct estimation of delay from field-measured data is more accurate than the existing conventional and modified methods.
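
The field-delay computation described above amounts to numerically integrating queue length over the cycle: the area under the queue-length curve, in vehicle-seconds, is the total delay for that cycle. A minimal sketch with made-up queue counts (the sampling interval and values are illustrative):

```python
# Delay for one undersaturated cycle as the area under the
# queue-length vs. time curve, via Simpson's 1/3 rule.
def simpson_area(y, h):
    """Simpson's 1/3 rule for equally spaced samples y at spacing h."""
    if (len(y) - 1) % 2 != 0:
        raise ValueError("Simpson's 1/3 rule needs an even number of intervals")
    s = y[0] + y[-1]
    s += 4 * sum(y[1:-1:2]) + 2 * sum(y[2:-1:2])
    return h * s / 3

queue = [0, 3, 6, 8, 7, 4, 0]           # vehicles in queue every 10 s
total_delay = simpson_area(queue, 10)   # vehicle-seconds of delay
print(round(total_delay, 1))
```

Dividing the cycle's total vehicle-seconds by the number of vehicles served gives the average delay per vehicle, the quantity compared against the HCM and Webster estimates.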

Keywords: delay estimation technique, field delay, heterogeneous traffic, signalised intersection

Procedia PDF Downloads 296
13616 Deformation Severity Prediction in Sewer Pipelines

Authors: Khalid Kaddoura, Ahmed Assad, Tarek Zayed

Abstract:

Sewer pipelines are prone to deterioration over time, and their deterioration does not follow a fixed downward pattern; this is due to the defects that propagate through their service life. Sewer pipeline defects are categorized into distinct groups, the two main ones being structural and operational defects. By definition, structural defects affect the structural integrity of the sewer pipeline, such as deformation, cracks, fractures, holes, etc., whereas operational defects are those that affect the flow of the sewer medium in the pipeline, such as roots, debris, attached deposits, infiltration, etc. The process by which each defect emerges follows a cause-and-effect relationship. Deformation, i.e., the change of the sewer pipeline's geometry, is one influencing defect found in many sewer pipelines due to many surrounding factors, and it could lead to collapse if the percentage exceeds 15%. It is therefore essential to predict the deformation percentage before confronting such a situation. Accordingly, this study predicts the percentage of the deformation defect in sewer pipelines using multiple regression analysis. Several factors expected to influence the severity of the deformation defect are considered in establishing the model. In addition, this study constructs a time-based curve to understand how the defect evolves over time. Thus, this study is expected to be an asset for decision-makers, as it provides informative conclusions about deformation defect severity. As a result, inspections will be minimized, and so will the associated budgets.
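
A multiple-regression severity model of the kind described can be sketched as follows. The predictor variables and coefficients are hypothetical placeholders fitted on synthetic data; only the 15% collapse threshold comes from the text above:

```python
import numpy as np

# Sketch of a multiple-regression deformation model: predict the
# deformation percentage from pipe attributes, then flag the 15%
# collapse threshold. Factors and coefficients are invented.
rng = np.random.default_rng(5)
n = 120
age = rng.uniform(0, 60, n)            # years in service
depth = rng.uniform(1, 6, n)           # burial depth, m
diameter = rng.uniform(150, 600, n)    # pipe diameter, mm
deform = 0.2 * age + 1.5 * depth - 0.01 * diameter + rng.normal(0, 0.5, n)

# Fit by ordinary least squares (design matrix with an intercept).
X = np.column_stack([np.ones(n), age, depth, diameter])
beta, *_ = np.linalg.lstsq(X, deform, rcond=None)

def predict(age_, depth_, dia_):
    """Predicted deformation percentage for one pipe segment."""
    return float(beta @ np.array([1.0, age_, depth_, dia_]))

p = predict(50, 4, 300)
print(round(p, 1), p > 15.0)           # flag the 15% collapse threshold
```

Sweeping the age variable in `predict` while holding the other attributes fixed yields the kind of time-based severity curve the abstract proposes.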

Keywords: deformation, prediction, regression analysis, sewer pipelines

Procedia PDF Downloads 182