Search results for: sustainability performance assessment
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 19152

2682 Linux Security Management: Research and Discussion on Problems Caused by Different Aspects

Authors: Ma Yuzhe, Burra Venkata Durga Kumar

Abstract:

The computer is a great invention, and as personal computers have become ubiquitous, hardware performance has risen steadily to handle increasingly complex workloads. Operating system choices, however, were long constrained. The high price of UNIX (Uniplexed Information and Computing System) put it out of reach for batch after batch of personal computer owners; the Disk Operating System (DOS) was too simple to support innovation; and MacOS is a special operating system for Apple computers that cannot be widely used on personal computers generally. In this environment, Linux, based on the UNIX system, was born. Linux combines the advantages of earlier operating systems: its kernel architecture is relatively powerful, it supports all major Internet protocols and therefore offers very good network functionality, it supports multiple users whose files are isolated from one another, and it multitasks, running different programs independently at the same time. Linux is also a completely open-source operating system whose source code users can obtain and modify for free. These advantages have attracted a large number of users and programmers, and the system is constantly upgraded and improved, with many versions issued for both community and commercial use. Linux has good baseline security because it relies on a file permission and partition system. However, as vulnerabilities and threats constantly evolve, the security of the operating system in use needs continued attention. This article focuses on the analysis and discussion of Linux security issues.
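The abstract's point about the file permission system can be made concrete. The sketch below, not taken from the paper, shows a minimal audit of the "other"-writable bit that underpins Linux's multi-user file isolation; the function name and the example permission modes are illustrative choices.

```python
import os
import stat
import tempfile

def world_writable(path):
    """Return True if 'other' users may write to the file at path."""
    mode = os.stat(path).st_mode
    return bool(mode & stat.S_IWOTH)

# Demonstrate on a temporary file with deliberately loose permissions.
with tempfile.NamedTemporaryFile(delete=False) as f:
    path = f.name
os.chmod(path, 0o646)        # rw-r--rw- : world-writable, a common misconfiguration
print(world_writable(path))
os.chmod(path, 0o640)        # rw-r----- : restricted to owner and group
print(world_writable(path))
os.remove(path)
```

A real audit would walk whole directory trees and also check setuid/setgid bits, but the mode-bit test is the core of it.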

Keywords: Linux, operating system, system management, security

Procedia PDF Downloads 105
2681 Project Time and Quality Management during Construction

Authors: Nahed Al-Hajeri

Abstract:

Time and cost are integral parts of every construction plan and can affect each party's contractual obligations; the performance of both is usually important to the client and the contractor during the project. Almost all construction projects experience time overrun, and these overruns are always expensive for both client and contractor. Construction of any project inside the gathering centers involves complex management of workforce, materials, plant, machinery, new technologies, etc. It also involves many interdependent agencies, such as vendors and structural and functional designers, including various types of specialized engineers, with the support of contractors and specialized subcontractors. This paper highlights the types of construction delays that cause projects to suffer time and cost overruns, and discusses the delay causes and factors that contribute to construction sequence delays in oil and gas projects. Construction delay is one of the most recurrent problems in construction projects and has an adverse effect on project success in terms of time, cost, and quality. Some effective methods to minimize delays in construction projects are identified: 1. site management and supervision; 2. effective strategic planning; 3. clear information and communication channels. Our research paper studies the types of delay with real examples and statistical results and suggests solutions to overcome this problem.

Keywords: non-compensable delay, delays caused by force majeure, compensable delay, delays caused by the owner or the owner’s representative, non-excusable delay, delay caused by the contractor or the contractor’s representative, concurrent delay, delays resulting from two separate causes at the same time

Procedia PDF Downloads 238
2680 Statistical Modeling of Local Area Fading Channels Based on Triply Stochastic Filtered Marked Poisson Point Processes

Authors: Jihad Daba, Jean-Pierre Dubois

Abstract:

Multi path fading noise degrades the performance of cellular communication, most notably in femto- and pico-cells in 3G and 4G systems. When the wireless channel consists of a small number of scattering paths, the statistics of fading noise is not analytically tractable and poses a serious challenge to developing closed canonical forms that can be analysed and used in the design of efficient and optimal receivers. In this context, noise is multiplicative and is referred to as stochastically local fading. In many analytical investigation of multiplicative noise, the exponential or Gamma statistics are invoked. More recent advances by the author of this paper have utilized a Poisson modulated and weighted generalized Laguerre polynomials with controlling parameters and uncorrelated noise assumptions. In this paper, we investigate the statistics of multi-diversity stochastically local area fading channel when the channel consists of randomly distributed Rayleigh and Rician scattering centers with a coherent specular Nakagami-distributed line of sight component and an underlying doubly stochastic Poisson process driven by a lognormal intensity. These combined statistics form a unifying triply stochastic filtered marked Poisson point process model.
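The model described here has no simple closed form, but its ingredients can be sampled directly. The Monte Carlo sketch below is a simplified illustration, not the paper's model: a lognormal intensity drives a Poisson number of Rayleigh (complex Gaussian) scatterers, with an optional constant specular term standing in for the line-of-sight component; all parameter values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(7)

def local_fading_samples(n, mean_paths=5.0, sigma_db=4.0, los_amp=0.0):
    """Draw n envelope samples from a doubly stochastic scatterer model:
    a lognormal-modulated Poisson path count, each path a complex
    Gaussian (Rayleigh) contribution, plus a specular LOS amplitude."""
    # Lognormal intensity makes the Poisson count doubly stochastic.
    intensity = mean_paths * 10 ** (sigma_db * rng.standard_normal(n) / 10)
    counts = rng.poisson(intensity)
    env = np.empty(n)
    for i, k in enumerate(counts):
        # Sum of k independent complex Gaussian scatterer contributions.
        paths = (rng.standard_normal(k) + 1j * rng.standard_normal(k)) / np.sqrt(2)
        env[i] = abs(los_amp + paths.sum())
    return env

env = local_fading_samples(20000)
print(round(float(env.mean()), 2))
```

Replacing the Rayleigh draws with Rician or Nakagami-marked ones would move the sketch closer to the triply stochastic model the abstract develops analytically.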

Keywords: cellular communication, femto and pico-cells, stochastically local area fading channel, triply stochastic filtered marked Poisson point process

Procedia PDF Downloads 445
2679 Simulation and Controller Tuning in a Photo-Bioreactor Applying the Taguchi Method

Authors: Hosein Ghahremani, MohammadReza Khoshchehre, Pejman Hakemi

Abstract:

This study involves numerical simulation of a vertical plate-type photo-bioreactor cultivating the microalga Spirulina, together with control and optimization of the digital controller's parameters by the Taguchi method, carried out in MATLAB and Qualitek-4. Because biological processes involve not only parameters such as temperature, dissolved carbon dioxide, and biomass, but also physical parameters such as light intensity and physiological conditions such as photosynthetic efficiency and light inhibition, control faces many challenges. Photo-bioreactors are efficient systems not only for the commercial production of microalgae as aquaculture feed and food supplements, but also as a possible platform for producing active molecules such as antibiotics or innovative anti-tumor agents, for carbon dioxide removal, and for removing heavy metals from wastewater. A digital controller was designed to regulate the light supplied to the bioreactor, and the microalgal growth rate and carbon dioxide concentration inside the bioreactor were investigated. The optimal controller parameters obtained from the S/N and ANOVA analyses in Qualitek-4 were compared with those from the reaction-curve, Cohen-Coon, and Ziegler-Nichols methods. Based on the sum of squared errors obtained for each of these control methods, the Taguchi method was selected as the best method for controlling the light intensity of the photo-bioreactor, showing higher stability and a shorter response time than the other methods listed.
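Two of the ingredients named here are standard formulas and can be sketched compactly. The snippet below, an illustration rather than the paper's implementation, shows the classic closed-loop Ziegler-Nichols PID settings and the Taguchi smaller-the-better S/N ratio used to rank tunings by squared error; the example gain and period values are arbitrary.

```python
import math

def ziegler_nichols_pid(k_u, t_u):
    """Classic Ziegler-Nichols PID settings from the ultimate gain k_u
    and the ultimate oscillation period t_u."""
    kp = 0.6 * k_u
    ti = 0.5 * t_u                  # integral time
    td = 0.125 * t_u                # derivative time
    return kp, kp / ti, kp * td     # (Kp, Ki, Kd) in parallel form

def sn_smaller_better(errors):
    """Taguchi smaller-the-better signal-to-noise ratio (dB): larger S/N
    means smaller, more robust error across the observed runs."""
    return -10 * math.log10(sum(e * e for e in errors) / len(errors))

print(ziegler_nichols_pid(4.0, 10.0))
print(round(sn_smaller_better([0.2, 0.3, 0.25]), 2))
```

Comparing S/N across candidate tunings is the ranking step the abstract describes; the tuning with the highest S/N (lowest aggregate squared error) wins.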

Keywords: photo-bioreactor, control and optimization, light intensity, Taguchi method

Procedia PDF Downloads 390
2678 Valuing Cultural Ecosystem Services of Natural Treatment Systems Using Crowdsourced Data

Authors: Andrea Ghermandi

Abstract:

Natural treatment systems such as constructed wetlands and waste stabilization ponds are increasingly used to treat water and wastewater from a variety of sources, including stormwater and polluted surface water. The provision of ancillary benefits in the form of cultural ecosystem services makes these systems unique among water and wastewater treatment technologies and greatly contributes to determine their potential role in promoting sustainable water management practices. A quantitative analysis of these benefits, however, has been lacking in the literature. Here, a critical assessment of the recreational and educational benefits in natural treatment systems is provided, which combines observed public use from a survey of managers and operators with estimated public use as obtained using geotagged photos from social media as a proxy for visitation rates. Geographic Information Systems (GIS) are used to characterize the spatial boundaries of 273 natural treatment systems worldwide. Such boundaries are used as input for the Application Program Interfaces (APIs) of two popular photo-sharing websites (Flickr and Panoramio) in order to derive the number of photo-user-days, i.e., the number of yearly visits by individual photo users in each site. The adequateness and predictive power of four univariate calibration models using the crowdsourced data as a proxy for visitation are evaluated. A high correlation is found between photo-user-days and observed annual visitors (Pearson's r = 0.811; p-value < 0.001; N = 62). Standardized Major Axis (SMA) regression is found to outperform Ordinary Least Squares regression and count data models in terms of predictive power insofar as standard verification statistics – such as the root mean square error of prediction (RMSEP), the mean absolute error of prediction (MAEP), the reduction of error (RE), and the coefficient of efficiency (CE) – are concerned. 
The SMA regression model is used to estimate the intensity of public use in all 273 natural treatment systems. System type, influent water quality, and area are found to statistically affect public use, consistently with a priori expectations. Publicly available information regarding the home location of the sampled visitors is derived from their social media profiles and used to infer the distance they are willing to travel to visit the natural treatment systems in the database. Such information is analyzed using the travel cost method to derive monetary estimates of the recreational benefits of the investigated natural treatment systems. Overall, the findings confirm the opportunities arising from an integrated design and management of natural treatment systems, which combines the objectives of water quality enhancement and provision of cultural ecosystem services through public use in a multi-functional approach and compatibly with the need to protect public health.
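The calibration step can be illustrated in a few lines. Standardized major axis regression has a closed form: the slope is sign(r) times the ratio of standard deviations, which treats error in both variables symmetrically, unlike OLS. The sketch below is not the paper's data or code; the toy photo-user-day and visitor counts are invented for illustration.

```python
import numpy as np

def sma_fit(x, y):
    """Standardized major axis regression: slope = sign(r) * sd(y)/sd(x),
    intercept chosen so the line passes through the centroid."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    r = np.corrcoef(x, y)[0, 1]
    slope = np.sign(r) * y.std(ddof=1) / x.std(ddof=1)
    return slope, y.mean() - slope * x.mean()

# Toy calibration: yearly photo-user-days vs observed annual visitors.
pud = np.array([2, 5, 9, 14, 20], float)
visitors = np.array([40, 110, 180, 300, 420], float)
slope, intercept = sma_fit(pud, visitors)
print(round(slope, 2), round(intercept, 2))
```

Because SMA minimizes orthogonal-like deviations in standardized units, it is often preferred over OLS when, as here, both the proxy (photo-user-days) and the target (visitor counts) are measured with error.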

Keywords: constructed wetlands, cultural ecosystem services, ecological engineering, waste stabilization ponds

Procedia PDF Downloads 176
2677 Modal Analysis of Functionally Graded Material Plates Using the Finite Element Method

Authors: S. J. Shahidzadeh Tabatabaei, A. M. Fattahi

Abstract:

Modal analysis of an FGM plate composed of an Al2O3 ceramic phase and a 304 stainless steel metal phase was performed in this paper using ABAQUS, under the assumption that the material behavior is elastic and the mechanical properties (Young's modulus and density) vary in the thickness direction of the plate. A subroutine was therefore written in the FORTRAN programming language and linked with ABAQUS. For the modal analysis, a finite element analysis was carried out following the models of other researchers, and the accuracy of the results was evaluated by comparison. The agreement of natural frequencies and mode shapes confirmed the correct operation of the FORTRAN subroutine as well as the high accuracy of the finite element model used in this research. After validation of the results, the effect of the material gradation parameter n on the natural frequency was evaluated by finite element analyses for different values of n in the simply supported configuration. It was observed that the natural frequency decreased as n increased: increasing n reduces the share of the ceramic phase in the FGM plate and increases the share of the steel phase, which lowers the stiffness of the plate and thereby the natural frequency, since the Young's modulus of Al2O3 ceramic is 380 GPa while that of SUS304 steel is 207 GPa.
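The through-thickness gradation the subroutine implements is usually a power-law rule of mixtures. The sketch below is an illustration of that rule, not the authors' FORTRAN code: it assumes the common form V_ceramic = (z/h + 1/2)^n, and uses the two moduli quoted in the abstract. It shows numerically why larger n means less ceramic and a lower average stiffness.

```python
import numpy as np

E_CERAMIC = 380e9   # Al2O3 Young's modulus (Pa), from the abstract
E_METAL = 207e9     # SUS304 Young's modulus (Pa), from the abstract

def youngs_modulus(z_over_h, n):
    """Power-law FGM rule of mixtures through the thickness:
    V_ceramic = (z/h + 1/2)**n with z/h in [-1/2, 1/2]."""
    v_c = (z_over_h + 0.5) ** n
    return v_c * E_CERAMIC + (1 - v_c) * E_METAL

# Average modulus drops as n grows: less ceramic, lower stiffness,
# hence the lower natural frequencies reported in the paper.
z = np.linspace(-0.5, 0.5, 1001)
for n in (0.5, 1, 5):
    print(n, round(float(youngs_modulus(z, n).mean()) / 1e9, 1))
```

Since the mean of (z/h + 1/2)^n over the thickness is 1/(n+1), the ceramic volume fraction decays from 2/3 at n=0.5 to 1/6 at n=5, matching the stiffness trend the abstract explains.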

Keywords: FGM plates, modal analysis, natural frequency, finite element method

Procedia PDF Downloads 388
2676 Bilingualism Contributes to Cognitive Reserve in Parkinson's Disease

Authors: Arrate Barrenechea Garro

Abstract:

Background: Bilingualism has been shown to enhance cognitive reserve and potentially delay the onset of dementia symptoms. This study investigates the impact of bilingualism on cognitive reserve and the age of diagnosis in Parkinson's Disease (PD). Methodology: The study involves 16 non-demented monolingual PD patients and 12 non-demented bilingual PD patients, matched for age, sex, and years of education. All participants are native Spanish speakers, with Spanish as their first language (L1). Cognitive performance is assessed through a neuropsychological examination covering all cognitive domains. Cognitive reserve is measured using the Cognitive Reserve Index Questionnaire (CRIq), while language proficiency is evaluated using the Bilingual Language Profile (BLP). The age at diagnosis is recorded for both monolingual and bilingual patients. Results: Bilingual PD patients demonstrate higher scores on the CRIq compared to monolingual PD patients, with significant differences between the groups. Furthermore, there is a positive correlation between cognitive reserve (CRIq) and the utilization of the second language (L2) as indicated by the BLP. Bilingual PD patients are diagnosed, on average, three years later than monolingual PD patients. Conclusion: Bilingual PD patients exhibit higher levels of cognitive reserve compared to monolingual PD patients, as indicated by the CRIq scores. The utilization of the second language (L2) is positively correlated with cognitive reserve. Bilingual PD patients are diagnosed with PD, on average, three years later than monolingual PD patients. These findings suggest that bilingualism may contribute to cognitive reserve and potentially delay the onset of clinical symptoms associated with PD. This study adds to the existing literature supporting the relationship between bilingualism and cognitive reserve. 
Further research in this area could provide valuable insights into the potential protective effects of bilingualism in neurodegenerative disorders.

Keywords: bilingualism, cognitive reserve, diagnosis, Parkinson's disease

Procedia PDF Downloads 97
2675 The Digital Transformation of Life Insurance Sales in Iran with the Emergence of Personal Financial Planning Robots: Opportunities and Challenges

Authors: Pedram Saadati, Zahra Nazari

Abstract:

One goal of this research is to anticipate and identify the future opportunities and challenges that the emergence of personal financial planning knowledge and technologies poses for insurance industry practitioners, and to provide practical solutions. To this end, a futures research approach based on eliciting opinions from the main players of the insurance industry was used. The research proceeded in four stages: (1) a survey of the specialist life insurance sales force to identify the variables; (2) ranking of the variables by selected experts via a researcher-made questionnaire; (3) an expert panel aimed at understanding the mutual effects of the variables; and (4) statistical analysis of the cross-impact matrix in MICMAC software. The integrated analysis of the variables influencing the future was done with structural analysis, an efficient and innovative futures research method. A list of opportunities and challenges was identified through a survey of best-selling life insurance representatives selected by snowball sampling. To prioritize and identify the most important issues, all the issues raised were sent, via a researcher-made questionnaire, to experts selected through theoretical sampling. The respondents scored the importance of 36 variables so that the opportunity and challenge variables could be prioritized. Eight of the variables identified in the first stage were removed by the selected experts, leaving 28 variables for examination in the third stage; to facilitate the examination, these were divided into six categories: organization and management (11 variables), marketing and sales (7), social and cultural (6), technological (2), rebranding (1), and insurance (1).
The reliability of the researcher-made questionnaire was confirmed with a Cronbach's alpha of 0.96. In the third stage, a panel of five insurance industry experts reached consensus on the influence of the factors on one another, and their rankings were entered into a matrix of the interrelationships of the 28 variables, which was investigated using the structural analysis method. Analysis of the matrix in MICMAC software indicates that the most important influencing challenges are correct training in the use of the software, the weakness of insurance companies' technology in personalizing products, adoption of a customer-equipping approach, and honesty in telling a customer when no insurance is needed. The most important influencing opportunities are the sales-force-equipping approach, product personalization based on customer needs assessment, the customer's pleasant experience of being advised by consulting robots, business improvement of the insurance company due to the use of these tools, increased efficiency of the issuance process, and optimal customer purchase.
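The core computation behind a MICMAC-style structural analysis is simple: in a cross-impact matrix where rows influence columns, row sums give each variable's driving power (influence) and column sums its dependence. The sketch below uses a small hypothetical matrix with invented labels and scores; it is not the study's 28-variable data.

```python
import numpy as np

# Hypothetical 4-variable cross-impact matrix (rows influence columns),
# scored 0-3 as in a MICMAC-style expert panel. Labels are illustrative.
labels = ["training", "personalization", "customer experience", "efficiency"]
M = np.array([
    [0, 2, 3, 1],
    [1, 0, 3, 2],
    [0, 1, 0, 3],
    [1, 0, 1, 0],
])

influence = M.sum(axis=1)    # driving power: how strongly a variable acts on others
dependence = M.sum(axis=0)   # how strongly a variable is acted upon by others

for name, inf, dep in zip(labels, influence, dependence):
    print(f"{name}: influence={inf}, dependence={dep}")
```

Plotting influence against dependence yields the familiar four-quadrant map (driving, linkage, dependent, autonomous variables) from which the "most important influencers" in the abstract are read off.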

Keywords: personal financial planning, wealth management, advisor robots, life insurance, digital transformation

Procedia PDF Downloads 44
2674 Classifier for Liver Ultrasound Images

Authors: Soumya Sajjan

Abstract:

Liver cancer is among the most common cancers worldwide in men and women and is one of the few cancers still on the rise; liver disease is the fourth leading cause of death. According to new NHS (National Health Service) figures, deaths from liver diseases have reached record levels, rising by 25% in less than a decade; heavy drinking, obesity, and hepatitis are believed to be behind the rise. In this study, we focus on the development of a diagnostic classifier for ultrasound liver lesions. Ultrasound (US) sonography is an easy-to-use and widely popular imaging modality because of its ability to visualize many human soft tissues and organs without any harmful effect. This paper provides an overview of the underlying concepts, along with algorithms for processing liver ultrasound images. Ultrasound liver lesion images naturally contain considerable speckle noise, so developing a classifier for them is a challenging task. We approach it as a fully automatic machine learning system. First, we segment the liver image and calculate textural features from the co-occurrence matrix and the run-length method. For classification, a Support Vector Machine (SVM) is used, based on the risk bounds of statistical learning theory. The textural features from the different feature methods are given as input to the SVM individually, and performance analysis on training and test datasets is carried out separately using the SVM model. Whenever an ultrasonic liver lesion image is given to the SVM classifier system, its features are calculated and it is classified as a normal or diseased liver lesion. We hope the result will help physicians identify liver cancer non-invasively.
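The texture-feature step can be illustrated without any imaging library. The sketch below, an illustration rather than the paper's pipeline, builds a gray-level co-occurrence matrix for horizontally adjacent pixels and computes the Haralick contrast feature; the tiny 4-level "patch" is invented, and a real pipeline would first quantize the ultrasound image and pool several distances and angles before feeding features to the SVM.

```python
import numpy as np

def glcm(img, levels=4):
    """Gray-level co-occurrence matrix for horizontally adjacent pixel
    pairs (distance 1, angle 0 degrees), normalized to probabilities."""
    m = np.zeros((levels, levels))
    for a, b in zip(img[:, :-1].ravel(), img[:, 1:].ravel()):
        m[a, b] += 1
    return m / m.sum()

def contrast(p):
    """Haralick contrast: co-occurrence mass weighted by the squared
    gray-level gap, high for coarse or noisy texture."""
    i, j = np.indices(p.shape)
    return float(((i - j) ** 2 * p).sum())

# Toy 4-gray-level patch standing in for a quantized lesion region.
patch = np.array([[0, 0, 1, 1],
                  [0, 1, 1, 2],
                  [2, 2, 3, 3],
                  [2, 3, 3, 3]])
p = glcm(patch)
print(round(contrast(p), 3))
```

Features such as contrast, energy, and homogeneity computed this way form the per-image vectors the abstract's SVM consumes.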

Keywords: segmentation, support vector machine, ultrasound liver lesion, co-occurrence matrix

Procedia PDF Downloads 405
2673 Modified Side Plate Design to Suppress Lateral Torsional Buckling of H-Beam for Seismic Application

Authors: Erwin, Cheng-Cheng Chen, Charles J. Salim

Abstract:

One method to solve the lateral torsional buckling (LTB) problem is to use side plates to increase the buckling resistance of the beam. Some modifications in designing the side plates are made in this study to simplify construction in the field and reduce cost. At certain regions, side plates are not added: (1) at the beam end, to preserve space for bolt installation, where the beam is instead strengthened by adding cover plates at both flanges, and (2) at the middle span of the beam, where the moment is smaller. Three small-scale full-span beam specimens were tested under cyclic loading to investigate the LTB resistance and ductility of the proposed design method. Test results show that LTB deformation can be effectively suppressed and a very high ductility level can be achieved. Following the tests, a finite element analysis (FEA) model was established and verified against the test results. An intensive parametric study conducted with the established FEA model reveals that the length of the side plates is the most important parameter determining the performance of the beam, and that the required side plate length is governed by (1) the beam depth-to-flange-width ratio, (2) the beam slenderness ratio, (3) the strength and thickness of the side plates, (4) the compactness of the beam web and flange, and (5) the beam yield strength. At the end of the paper, a design formula to calculate the required side plate length is suggested.

Keywords: cover plate, earthquake resistant design, lateral torsional buckling, side plate, steel structure

Procedia PDF Downloads 172
2672 Assessment of Very Low Birth Weight Neonatal Tracking and a High-Risk Approach to Minimize Neonatal Mortality in Bihar, India

Authors: Aritra Das, Tanmay Mahapatra, Prabir Maharana, Sridhar Srikantiah

Abstract:

In the absence of adequate well-equipped neonatal-care facilities serving rural Bihar, India, the practice of essential home-based newborn care remains critically important for the reduction of neonatal and infant mortality, especially among pre-term and small-for-gestational-age (low-birth-weight) newborns. To improve child health parameters in Bihar, CARE India has been conducting the 'Very-Low-Birth-Weight (vLBW) Tracking' intervention since 2015, targeting public facility-delivered newborns weighing ≤2000 g at birth, to improve their identification and the provision of immediate post-natal care. To assess the effectiveness of the intervention, 200 public health facilities were randomly selected from all functional public-sector delivery points in Bihar, and various outcomes were tracked among the neonates born there. Thus far, one pre-intervention (Feb-Apr 2015-born neonates) and three post-intervention (Sep-Oct 2015, Sep-Oct 2016, and Sep-Oct 2017-born children) follow-up studies have been conducted. In each round, interviews were conducted with the mothers/caregivers of successfully tracked children to understand outcomes, service coverage, and care-seeking during the neonatal period. Data from 171 matched facilities common across all rounds were analyzed using SAS 9.4. Identification of neonates with birth weight ≤2000 g improved from 2% at baseline to 3.3%-4% during post-intervention. All indicators pertaining to post-natal home visits by frontline workers (FLWs) improved. Significant improvements between baseline and post-intervention rounds were also noted in mothers being informed about a 'weak' child, both at the facility (R1 = 25% to R4 = 50%) and at home by an FLW (R1 = 19% to R4 = 30%). The practice of 'Kangaroo Mother Care (KMC)', an important component of essential newborn care, showed significant improvement in the post-intervention period compared to baseline, both at the facility (R1 = 15% to R4 = 31%) and at home (R1 = 10% to R4 = 29%).
Detection and birth-weight recording of extremely low-birth-weight newborns (<1500 g) also showed an increasing trend. Moreover, there was a downward trend in mortality across rounds in each birth-weight stratum (<1500 g, 1500-1799 g, and ≥1800 g). After adjustment for the differential distribution of birth weights, mortality was found to decline significantly from R1 (22.11%) to R4 (11.87%). Significantly declining trends were also observed for both early and late neonatal mortality and for morbidities. Multiple regression analysis identified birth during the immediate post-intervention phase as well as during the maintenance phase, birth weight >1500 g, children of low-parity mothers, receiving a visit from an FLW in the first week, and/or receiving advice on extra care from an FLW as predictors of survival during the neonatal period among vLBW newborns. vLBW tracking was found to be a successful and sustainable intervention and has already been handed over to the Government.

Keywords: weak newborn tracking, very low birth weight babies, newborn care, community response

Procedia PDF Downloads 157
2671 A Comparative Study on Deep Learning Models for Pneumonia Detection

Authors: Hichem Sassi

Abstract:

Pneumonia, being a respiratory infection, has garnered global attention due to its rapid transmission and relatively high mortality rates. Timely detection and treatment play a crucial role in significantly reducing mortality associated with pneumonia. Presently, X-ray diagnosis stands out as a reasonably effective method. However, the manual scrutiny of a patient's X-ray chest radiograph by a proficient practitioner usually requires 5 to 15 minutes. In situations where cases are concentrated, this places immense pressure on clinicians for timely diagnosis. Relying solely on the visual acumen of imaging doctors proves to be inefficient, particularly given the low speed of manual analysis. Therefore, the integration of artificial intelligence into the clinical image diagnosis of pneumonia becomes imperative. Additionally, AI recognition is notably rapid, with convolutional neural networks (CNNs) demonstrating superior performance compared to human counterparts in image identification tasks. To conduct our study, we utilized a dataset comprising chest X-ray images obtained from Kaggle, encompassing a total of 5216 training images and 624 test images, categorized into two classes: normal and pneumonia. Employing five mainstream network algorithms, we undertook a comprehensive analysis to classify these diseases within the dataset, subsequently comparing the results. The integration of artificial intelligence, particularly through improved network architectures, stands as a transformative step towards more efficient and accurate clinical diagnoses across various medical domains.
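The convolutional operation at the heart of the CNNs compared here can be shown in a few lines. The sketch below is a didactic illustration, not one of the five benchmarked networks: a plain NumPy "valid" cross-correlation followed by a ReLU, applied with a hand-crafted vertical-edge kernel of the kind early CNN layers learn from chest radiographs. The toy image and kernel are invented.

```python
import numpy as np

def conv2d_valid(img, kernel):
    """'Valid' 2D cross-correlation, the core operation of a CNN layer."""
    kh, kw = kernel.shape
    h = img.shape[0] - kh + 1
    w = img.shape[1] - kw + 1
    out = np.empty((h, w))
    for i in range(h):
        for j in range(w):
            out[i, j] = (img[i:i + kh, j:j + kw] * kernel).sum()
    return out

# A dark-to-bright vertical edge detector; real networks learn thousands
# of such kernels from the training radiographs.
edge = np.array([[-1.0, 1.0]])
img = np.array([[0., 0., 1., 1.],
                [0., 0., 1., 1.]])
feature_map = np.maximum(conv2d_valid(img, edge), 0)  # ReLU activation
print(feature_map)
```

Stacking many such convolution-activation layers, with pooling and a final classification head, yields the architectures whose pneumonia-detection accuracy the paper compares.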

Keywords: deep learning, computer vision, pneumonia, models, comparative study

Procedia PDF Downloads 59
2670 Simulation of Utility Accrual Scheduling and Recovery Algorithm in Multiprocessor Environment

Authors: A. Idawaty, O. Mohamed, A. Z. Zuriati

Abstract:

This paper presents the development of an event-based discrete event simulation (DES) for a recovery algorithm known as Backward Recovery Global Preemptive Utility Accrual Scheduling (BR_GPUAS). This algorithm implements the backward recovery (BR) mechanism as a fault recovery solution within the existing Time/Utility Function / Utility Accrual (TUF/UA) scheduling domain for a multiprocessor environment. The BR mechanism attempts to return a faulty task to its initial safe state and then re-executes the affected section of the task to enable recovery. Considering that faults may occur in the components of any system, a fault tolerance mechanism that can nullify their erroneous effects is necessary. Current TUF/UA scheduling algorithms use an abortion recovery mechanism that simply aborts the erroneous task as the fault recovery solution. No existing algorithm in the TUF/UA multiprocessor scheduling domain has considered transient faults and implemented the BR mechanism as a fault recovery mechanism to nullify the erroneous effects and solve the recovery problem in this domain. The developed BR_GPUAS simulator derives its set of parameters, events, and performance metrics from a detailed analysis of the base model. Simulation results reveal that the BR_GPUAS algorithm can save almost 20-30% of the accrued utility, making it reliable and efficient for real-time applications in the multiprocessor scheduling environment.
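The contrast between backward recovery and abortion recovery can be sketched with a minimal event-driven loop. The toy simulator below is an illustration under strong simplifying assumptions (one processor, constant time/utility functions, at most one transient fault), not the BR_GPUAS model: a fault rolls the running task back to its start and re-executes it, so the task's utility is kept at the cost of delay, whereas abortion would forfeit it. Task tuples and times are invented.

```python
import heapq

def simulate(tasks, fault_at=None):
    """Tiny DES sketch of utility accrual scheduling with backward
    recovery. Each task is (release, exec_time, utility); a transient
    fault at time fault_at restarts the running task from its initial
    safe state instead of aborting it."""
    heap = list(tasks)
    heapq.heapify(heap)                 # pop tasks in release-time order
    clock, accrued = 0.0, 0.0
    while heap:
        release, exec_t, util = heapq.heappop(heap)
        clock = max(clock, release)
        finish = clock + exec_t
        if fault_at is not None and clock <= fault_at < finish:
            finish = fault_at + exec_t  # roll back, then re-execute fully
            fault_at = None
        clock = finish
        accrued += util                 # constant TUF: full utility on completion
    return accrued, clock

tasks = [(0.0, 2.0, 10.0), (1.0, 1.0, 5.0), (3.0, 2.0, 8.0)]
print(simulate(tasks))                  # fault-free run
print(simulate(tasks, fault_at=1.0))    # fault delays completion, utility kept
```

Under abortion recovery the faulty task's utility would be dropped from the accrued total, which is the 20-30% gap the paper's simulations quantify.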

Keywords: real-time system (RTS), time utility function/ utility accrual (TUF/UA) scheduling, backward recovery mechanism, multiprocessor, discrete event simulation (DES)

Procedia PDF Downloads 303
2669 Child Sexual Abuse Prevention: Evaluation of the Program “Sharing Mouth to Mouth: My Body, Nobody Can Touch It”

Authors: Faride Peña, Teresita Castillo, Concepción Campo

Abstract:

Sexual violence, and particularly child sexual abuse, is a serious problem all over the world, México included. Given its importance, there are several preventive and care programs run by the government and civil society across the country, but most of them are developed in urban areas even though these problems are especially serious in rural areas. Yucatán, a state in southern México, occupies one of the first places in child sexual abuse. Considering the above, the University Unit of Clinical Research and Victimological Attention (UNIVICT) of the Autonomous University of Yucatan designed, implemented, and is currently evaluating the program named "Sharing Mouth to Mouth: My Body, Nobody Can Touch It", a program to prevent child sexual abuse in rural communities of Yucatán, México. Its aim was to develop skills for the detection of risk situations, providing protection strategies and mechanisms for prevention through culturally relevant psycho-educative strategies to increase personal resources in children, in collaboration with parents, teachers, police, and municipal authorities. The diagnosis identified children between 4 and 10 years old as a particularly vulnerable population. The program ran during 2015 in primary schools of the municipality, whose inhabitants are mostly Mayan. The aim of this paper is to present its evaluation in terms of effectiveness and efficiency. The evaluation included documentary analysis of the fieldwork, psycho-educational and recreational activities with children, evaluation of participating children's knowledge, and interviews with parents and teachers. The results show high effectiveness in fulfilling the tasks and achieving the primary objectives. The efficiency results are satisfactory but also reveal areas of opportunity that can be addressed with minor adjustments to the program.
The results also show the importance of including culturally relevant strategies and activities; otherwise, possible achievements are minimized. Another highlight is the importance of participatory action research (PAR) in preventive approaches to child sexual abuse: by becoming aware of the importance of the subject, people participate more actively, and culturally appropriate strategies and measures can be designed so that the proposal is not distant from the people. The discussion emphasizes the methodological implications of prevention programs: the convenience of using PAR; the importance of monitoring and mediation during implementation; the development of detection-skill tools in creative ways using interactive psycho-educational techniques; and assessment carried out by the participants themselves. It is also important to consider the holistic character this type of program should have, incorporating socially and culturally relevant characteristics according to the community's individuality and uniqueness, and considering the type of communication to be used and children's language skills, which vary strongly with the specific cultural context.

Keywords: child sexual abuse, evaluation, PAR, prevention

Procedia PDF Downloads 293
2668 Association of Sociodemographic Factors and Loneliness of Adolescents in China

Authors: Zihan Geng, Yifan Hou

Abstract:

Background: Loneliness is the feeling of being isolated, which is becoming increasingly common among adolescents. A cross-sectional study was performed to determine the association between loneliness and different demographics. Methods: To identify the presence of loneliness, the UCLA Loneliness Scale (Version 3) was employed. The Chinese version of "Questionnaire Star", an online survey on the official website, was used to distribute self-rating questionnaires to students in Beijing from Grade 7 to Grade 12. The questionnaire included sociodemographic items and the UCLA Loneliness Scale. Results: Almost all of the participants exhibited “caseness” for loneliness, as defined by the UCLA scale. Out of 266 questionnaires, 2.6% (7 of 266) of students fulfilled the criteria for a low degree of loneliness, and 29.7% (79 of 266) met the criteria for a moderate degree of loneliness. Moreover, 62.8% (167 of 266) and 4.9% (13 of 266) of students fulfilled the criteria for a moderately high and a high degree of loneliness, respectively. In the Pearson χ2 test, there were significant associations between loneliness and several demographic factors, including grade (P<0.001), the number of adults in the family (P=0.001), evaluation of appearance (P=0.034), evaluation of self-satisfaction (P<0.001), love in the family (P<0.001), academic performance (P=0.001) and emotional support from friends (P<0.001). In the multivariate logistic analysis, the number of adults (2 vs. ≤1, OR=0.319, P=0.015), time spent on social media (≥4h vs. ≤1h, OR=4.862, P=0.029), and emotional support from friends (more satisfied vs. dissatisfied, OR=0.363, P=0.027) were associated with loneliness. Conclusions: Our results suggest a relationship between loneliness and several sociodemographic factors, which raises the possibility of reducing loneliness among adolescents. Therefore, the companionship of family, encouragement from friends, and regulating the time spent on social media may decrease loneliness in adolescents.
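The association tests reported above can be sketched in a few lines. The contingency table below is illustrative, not the study's data; the adult-count categories and the counts are assumptions:

```python
import numpy as np
from scipy.stats import chi2_contingency

# Hypothetical 3x2 table: rows = number of adults in the family
# (<=1, 2, >=3); columns = (lower loneliness, higher loneliness).
# Counts are made up for illustration only.
table = np.array([
    [20, 45],
    [50, 60],
    [16, 75],
])

# Pearson chi-square test of association, as reported in the abstract
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.4f}")
```

A multivariate logistic regression with loneliness caseness as the outcome would then adjust each factor for the others, which is how the odds ratios above were obtained.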

Keywords: loneliness, adolescents, demographic factors, UCLA loneliness scale

Procedia PDF Downloads 73
2667 Times2D: A Time-Frequency Method for Time Series Forecasting

Authors: Reza Nematirad, Anil Pahwa, Balasubramaniam Natarajan

Abstract:

Time series data consist of successive data points collected over a period of time. Accurate prediction of future values is essential for informed decision-making in several real-world applications, including electricity load demand forecasting, lifetime estimation of industrial machinery, traffic planning, weather prediction, and the stock market. Due to their critical relevance and wide application, there has been considerable interest in time series forecasting in recent years. However, the proliferation of sensors and IoT devices, real-time monitoring systems, and high-frequency trading data introduces significant intricate temporal variations, rapid changes, noise, and non-linearities, making time series forecasting more challenging. Classical methods such as the Autoregressive Integrated Moving Average (ARIMA) and exponential smoothing aim to extract pre-defined temporal variations, such as trends and seasonality. While these methods are effective for capturing well-defined seasonal patterns and trends, they often struggle with the more complex, non-linear patterns present in real-world time series data. In recent years, deep learning has made significant contributions to time series forecasting. Recurrent Neural Networks (RNNs) and their variants, such as Long Short-Term Memory (LSTM) and Gated Recurrent Unit (GRU) networks, have been widely adopted for modeling sequential data. However, they often suffer from locality, making it difficult to capture local trends and rapid fluctuations. Convolutional Neural Networks (CNNs), particularly Temporal Convolutional Networks (TCNs), leverage convolutional layers to capture temporal dependencies by applying convolutional filters along the temporal dimension. Despite their advantages, TCNs struggle to capture relationships between distant time points due to the locality of one-dimensional convolution kernels. Transformers have revolutionized time series forecasting with their powerful attention mechanisms, effectively capturing long-term dependencies and relationships between distant time points. However, the attention mechanism may struggle to discern dependencies directly from scattered time points due to intricate temporal patterns. Lastly, Multi-Layer Perceptrons (MLPs) have also been employed, with models like N-BEATS and LightTS demonstrating success. Despite this, MLPs often face high volatility and computational complexity challenges in long-horizon forecasting. To address intricate temporal variations in time series data, this study introduces Times2D, a novel framework that integrates 2D spectrogram and derivative heatmap techniques in parallel. The spectrogram focuses on the frequency domain, capturing periodicity, while the derivative patterns emphasize the time domain, highlighting sharp fluctuations and turning points. This 2D transformation enables the use of powerful computer vision techniques to capture various intricate temporal variations. To evaluate the performance of Times2D, extensive experiments were conducted on standard time series datasets and compared with various state-of-the-art algorithms, including DLinear (2023), TimesNet (2023), Non-stationary Transformer (2022), PatchTST (2023), N-HiTS (2023), Crossformer (2023), MICN (2023), LightTS (2022), FEDformer (2022), FiLM (2022), SCINet (2022a), Autoformer (2021), and Informer (2021), under the same modeling conditions. The initial results demonstrate that Times2D achieves consistent state-of-the-art performance in both short-term and long-term forecasting tasks. Furthermore, the generality of the Times2D framework allows it to be applied to various tasks such as time series imputation, clustering, classification, and anomaly detection, offering potential benefits in any domain that involves sequential data analysis.
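The core 1D-to-2D transformation described above can be sketched as follows. The window length and the simple framed FFT are assumptions chosen for illustration, not the authors' exact implementation:

```python
import numpy as np

def to_2d_representations(x, win=16):
    """Transform a 1D series into 2D views in the spirit of Times2D:
    a magnitude spectrogram (frequency view, capturing periodicity)
    and derivative heatmaps (time view, highlighting sharp changes
    and turning points). `win` is an assumed hyperparameter."""
    n = len(x) // win * win
    frames = x[:n].reshape(-1, win)                    # (num_windows, win)
    spectrogram = np.abs(np.fft.rfft(frames, axis=1))  # periodicity per window
    d1 = np.diff(frames, axis=1)                       # 1st derivative: sharp fluctuations
    d2 = np.diff(frames, n=2, axis=1)                  # 2nd derivative: turning points
    return spectrogram, d1, d2

t = np.linspace(0, 8 * np.pi, 256)
series = np.sin(t) + 0.1 * np.random.randn(256)
spec, d1, d2 = to_2d_representations(series)
print(spec.shape, d1.shape, d2.shape)  # (16, 9) (16, 15) (16, 14)
```

The resulting 2D arrays can then be fed to standard computer vision backbones, which is the idea the abstract describes.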

Keywords: derivative patterns, spectrogram, time series forecasting, times2D, 2D representation

Procedia PDF Downloads 38
2666 Modeling Pronunciations of Arab Broca’s Aphasics Using Mosstalk Words Technique

Authors: Sadeq Al Yaari, Fayza Alhammadi, Ayman Al Yaari, Montaha Al Yaari, Aayah Al Yaari, Adham Al Yaari, Sajedah Al Yaari, Saleh Al Yami

Abstract:

Background: There has been a debate in the literature over the years as to whether or not the MossTalk Words program fits Arab Broca's aphasics (BAs), due to language differences and the fact that the technique has not yet been used for aphasics with semantic dementia (SD aphasics). Aims: To slightly oversimplify the above-mentioned debate for purposes of exposition, the purpose of the present study is to investigate the usability of this program, as well as pictures and community involvement, as therapeutic techniques for both Arab BAs and SD aphasics. Method: The subjects of this study are two Saudi aphasics (53 and 57 years old, respectively). The former suffers from Broca's aphasia due to a stroke, while the latter suffers from semantic dementia. Both aphasics can speak English and have used the MossTalk Words program, in addition to intensive picture-naming therapeutic sessions, for two years. They were tested by one of the researchers four times (once every six months). The families of the two subjects, in addition to their relatives and friends, played a major part in all therapeutic sessions. Conclusion: Results show that, on average across the entire course of therapeutic sessions, the MossTalk Words program was clearly more effective in modeling the BAs' pronunciation than that of the SD aphasic. Furthermore, intensive picture-naming exercises, in addition to the positive role of community members, played a major role in the progress of the two subjects' performance.

Keywords: MossTalk Words, program, technique, Broca’s aphasia, semantic dementia, subjects, picture, community

Procedia PDF Downloads 40
2665 An Information-Based Approach for Preference Method in Multi-Attribute Decision Making

Authors: Serhat Tuzun, Tufan Demirel

Abstract:

Multi-Criteria Decision Making (MCDM) is the modelling of real-life problems in order to solve them; it is a discipline that aids decision makers who are faced with conflicting alternatives in making an optimal decision. MCDM problems can be classified into two main categories, Multi-Attribute Decision Making (MADM) and Multi-Objective Decision Making (MODM), based on their different purposes and data types. Although various MADM techniques have been developed for the problems encountered, their methodology is limited in modelling real life. Moreover, objective results are hard to obtain, and the findings are generally derived from subjective data. New and modified techniques have been developed by introducing approaches such as fuzzy logic; however, comprehensive techniques, even though they are better at modelling real life, have not found a place in real-world applications because they are hard to apply due to their complex structure. These constraints restrict the development of MADM. This study aims to conduct a comprehensive analysis of preference methods in MADM and to propose an approach based on information. For this purpose, a detailed literature review has been conducted, and current approaches, with their advantages and disadvantages, have been analyzed. Then, the approach is introduced. In this approach, performance values of the criteria are calculated in two steps: first, by determining the distribution of each attribute and standardizing the values; then, by calculating the information of each attribute as informational energy.
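The two steps in the last sentence can be sketched as follows. The histogram-based density estimate and the bin count are assumptions, since the abstract does not specify how the distribution of each attribute is determined:

```python
import numpy as np

def informational_energy(values, bins=10):
    """Onicescu-style informational energy of one attribute: standardize
    the values (step 1), estimate their distribution with a histogram,
    and sum the squared probabilities (step 2). Higher energy means a
    more concentrated attribute."""
    z = (values - np.mean(values)) / np.std(values)   # step 1: standardize
    counts, _ = np.histogram(z, bins=bins)
    p = counts / counts.sum()
    return float(np.sum(p ** 2))                      # step 2: energy

# Illustrative attribute columns for two hypothetical criteria
rng = np.random.default_rng(0)
e_normal = informational_energy(rng.normal(100, 15, 200))   # peaked distribution
e_uniform = informational_energy(rng.uniform(0, 1, 200))    # flat distribution
print(round(e_normal, 3), round(e_uniform, 3))
```

As expected, the peaked (normal) attribute yields a higher energy than the flat (uniform) one, reflecting the concentration of its distribution.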

Keywords: literature review, multi-attribute decision making, operations research, preference method, informational energy

Procedia PDF Downloads 221
2664 Creative Element Analysis of Machinery Creativity Contest Works

Authors: Chin-Pin Chen, Shi-Chi Shiao, Ting-Hao Lin

Abstract:

Industry today faces the rapid development of new technology worldwide and fierce changes in the economic environment, so the industry development trend gradually moves away from labor and instead leads industry and academia with innovation and creativity. The machinery industry shows the same trend. Based on the aims of the Creativity White Paper, the Ministry of Education in Taiwan promotes and develops various creativity contests to keep pace with the industry trend. Domestic students and enterprises have performed well in domestic and international creativity contests in recent years. Among so many entries, award-winning works must contain important creative elements. A literature review and in-depth interviews with five instructors of creativity-contest award winners were first conducted to derive 15 machinery creative elements, which were then compared with the creative elements of award-winning machinery works from the past five years to understand the relationship between awarded works and creative elements. The statistical analysis shows that IDEA (Industrial Design Excellence Award) involves the most creative elements among the four major international creativity contests; that is, its creativity review covers comparatively more, and stricter, creative elements. Concerning the groups participating in creativity contests, enterprises consider more creative elements in their works than the other two groups. Among the contest works, the creative elements of “replacement or improvement”, “convenience”, and “modeling” show higher significance. It is expected that these findings can provide domestic colleges and universities with a reference for participating in creativity-related contests in the future.

Keywords: machinery, creative elements, creativity contest, creativity works

Procedia PDF Downloads 439
2663 Architectural Design Strategies: Enhance Train Station Performance as the Catalyst of Transit Oriented Development in Jakarta, Case Study of Beos Commuter Line Station

Authors: Shinta Ardiana Sari, Dini Puti Angelia

Abstract:

The large urban population of Jakarta has made mobility strategy a substantial issue. Transit Oriented Development (TOD) is one strategy to improve community livability through the design of transit places and the system of their context. The TOD principle tries to win pedestrians over from the habit of motorization, so that people would rather transit and travel more than use a private vehicle. The train station takes the main role as the catalyst of TOD; in Jakarta, this role will be taken by the Commuter Line and the future MRT. To advance this development, an architectural design perspective is needed to evaluate and seek strategies that combine accessibility between transportation modes with convenience and safety, in order to increase human behavioral intention. This paper develops design strategies for transit places appropriate to Jakarta's conditions, using the basic theories of liminal space and the goals of transit-oriented development. The paper uses an evidence-based approach with a typology method to analyze the present condition of Commuter Line stations in Jakarta and precedents from Asian cities, Tokyo and Seoul, as secondary sources, together with a number of valid questionnaires. The result aims at the emergence of a transit-oriented community by giving design requirements and suggesting transportation policies in preparation for the operation of the MRT in future Jakarta and other similar cities.

Keywords: station design, transit place, transit-oriented development, urban

Procedia PDF Downloads 217
2662 Sorption Properties of Biological Waste for Lead Ions from Aqueous Solutions

Authors: Lucia Rozumová, Ivo Šafařík, Jana Seidlerová, Pavel Kůs

Abstract:

Biosorption by biological waste materials from the agriculture industry could be a cost-effective technique for removing metal ions from wastewater. The performance of new biosorbent systems, consisting of waste matrixes magnetically modified by iron oxide nanoparticles, was tested for the removal of lead ions from aqueous solution. The use of low-cost and eco-friendly adsorbents has been investigated as an ideal alternative to current expensive methods. This article deals with the removal of metal ions from aqueous solutions by modified waste products: orange peels, sawdust, peanut husks, used tea leaves and ground coffee sediment. The waste materials were suspended in methanol, and ferrofluid (magnetic iron oxide nanoparticles) was then added. This modification process opens the way to smart materials with new properties. The prepared material was characterized using scanning electron microscopy and a specific surface area and pore size analyzer. The studies focused on sorption and desorption properties. Changes in the iron content of the magnetically modified materials after treatment were observed as well. The adsorption process was modelled by adsorption isotherms. The results show that the magnetically modified materials remain stable during dynamic sorption and desorption at a high adsorbed amount of lead ions. The results of this study indicate that these biological waste materials, as sorbents with new properties, are highly effective for the treatment of wastewater.
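The isotherm modelling mentioned above can be sketched with a Langmuir fit; the equilibrium data below are illustrative values, not the study's measurements:

```python
import numpy as np
from scipy.optimize import curve_fit

def langmuir(Ce, qmax, KL):
    """Langmuir isotherm: adsorbed amount qe (mg/g) as a function of
    the equilibrium concentration Ce (mg/L), with maximum capacity
    qmax and affinity constant KL."""
    return qmax * KL * Ce / (1 + KL * Ce)

# Illustrative equilibrium data for Pb(II) sorption (assumed values)
Ce = np.array([5, 10, 20, 40, 80, 160.0])   # mg/L
qe = np.array([12, 20, 30, 38, 44, 47.0])   # mg/g

(qmax, KL), _ = curve_fit(langmuir, Ce, qe, p0=(50, 0.05))
print(f"qmax = {qmax:.1f} mg/g, KL = {KL:.3f} L/mg")
```

Comparing the fitted curve against Freundlich or other isotherms on the same data is the usual way to decide which model describes the sorbent best.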

Keywords: biological waste, sorption, metal ions, ferrofluid

Procedia PDF Downloads 138
2661 Study of Aging Behavior of Parallel-Series Connection Batteries

Authors: David Chao, John Lai, Alvin Wu, Carl Wang

Abstract:

For lithium-ion batteries with multiple cell configurations, some use scenarios can cause uneven aging of the cells within the battery because of uneven current distribution. Hence, the focus of this study is to explore the aging effects on batteries with different construction designs. In order to systematically study the influence of various factors in key battery configurations, a detailed analysis of three battery construction factors is conducted: (1) terminal position; (2) cell alignment matrix; and (3) interconnect resistance between cells. In this study, the 2S2P circuitry is used as a model multi-cell battery to set up different battery samples, and the aging behavior is studied by a cycling test to analyze the current distribution and recoverable capacity. According to the outcome of the aging tests, the key findings are: (I) different cell alignment matrices can have an impact on the cycle life of the battery; (II) a symmetrical structure has been identified as a critical factor influencing battery cycle life, and unbalanced resistance can lead to inconsistent cell aging; (III) the terminal position has been found to contribute to uneven current distribution, which can accelerate battery aging; and (IV) an increase in internal connection resistance can actually result in a cycle life increase; however, it is noteworthy that this increase in cycle life is accompanied by a decline in battery performance. In summary, the key findings from this study help to identify the key aging factors of multi-cell batteries and can be useful in improving the accuracy of battery capacity predictions.
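The link between unbalanced interconnect resistance and uneven current distribution can be sketched with a simple resistive model of a 2S2P pack (two parallel strings, each of two series cells). All resistance and current values below are illustrative assumptions:

```python
# Minimal sketch of uneven current sharing in a 2S2P pack.
r_cell = 0.050                   # ohm, assumed internal resistance per cell
r_int_a, r_int_b = 0.005, 0.020  # ohm, unbalanced interconnect resistances
i_load = 10.0                    # A, assumed total pack current

r_a = 2 * r_cell + r_int_a       # total resistance of string A
r_b = 2 * r_cell + r_int_b       # total resistance of string B

# Parallel strings share the same terminal voltage, so the load current
# splits inversely with string resistance (current divider rule).
i_a = i_load * r_b / (r_a + r_b)
i_b = i_load * r_a / (r_a + r_b)
print(f"string A: {i_a:.2f} A, string B: {i_b:.2f} A")
```

The lower-resistance string carries the larger current, so its cells cycle harder and age faster, which is the mechanism behind findings (II) and (III).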

Keywords: multiple cells battery, current distribution, battery aging, cell connection

Procedia PDF Downloads 75
2660 Chemical Fingerprinting of Complex Samples With the Aid of Parallel Outlet Flow Chromatography

Authors: Xavier A. Conlan

Abstract:

Speed of analysis is a significant limitation of current high-performance liquid chromatography/mass spectrometry (HPLC/MS) and ultra-high-pressure liquid chromatography (UHPLC)/MS systems, both of which are used in many forensic investigations. The flow rate limitations of MS detection require a compromise in the chromatographic flow rate, which in turn reduces throughput and, when using modern columns, separation efficiency. Commonly, this restriction is combated through post-column splitting of flow prior to entry into the mass spectrometer. However, this results in a loss of sensitivity and a loss of efficiency due to the post-column extra dead volume. A new chromatographic column format known as 'parallel segmented flow' involves the splitting of eluent flow within the column outlet end fitting; in this study, we present its application in interrogating the provenance of methamphetamine samples with mass spectrometry detection. Using parallel segmented flow, column flow rates as high as 3 mL/min were employed in the analysis of amino acids without post-column splitting to the mass spectrometer. Furthermore, when parallel segmented flow chromatography columns were employed, the sensitivity was more than twice that of conventional systems with post-column splitting when the same volume of mobile phase was passed through the detector. These findings suggest that this type of column technology will particularly enhance the capabilities of modern LC/MS, enabling both high throughput and sensitive mass spectral detection.

Keywords: chromatography, mass spectrometry, methamphetamine, parallel segmented outlet flow column, forensic sciences

Procedia PDF Downloads 484
2659 Design of Replication System for Computer-Generated Hologram in Optical Component Application

Authors: Chih-Hung Chen, Yih-Shyang Cheng, Yu-Hsin Tu

Abstract:

Holographic optical elements (HOEs) have recently become some of the most suitable components in optoelectronic technology, owing to the requirement for product systems with compact size. Computer-generated holography (CGH) is a well-known technology for HOE production. In some cases, a well-designed diffractive optical element with multifunctional components is also an important issue and is needed for advanced optoelectronic systems. The spatial light modulator (SLM) is one of the key components with great capability to display CGH patterns and is widely used in various applications, such as image projection systems. As for multifunctional components, such as phase and amplitude modulation of light, a high-resolution hologram produced with a multiple-exposure procedure is also a suitable candidate. However, with holographic recording under multiple exposures, the diffraction efficiency of the final hologram is inevitably lower than with a single-exposure process. In this study, a two-step holographic recording method, including master hologram fabrication and replicated hologram production, is designed. Since the diffraction efficiency in multiple-exposure holograms is reduced by a factor of M² (for M exposures), single exposure is more efficient for hologram replication. In the second step of holographic replication, a stable optical system with one-shot copying is introduced. For commercial application, one may utilize this concept of holographic copying to obtain duplicates of HOEs with higher optical performance.
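The M² reduction factor mentioned above can be illustrated numerically; the single-exposure efficiency value is an assumption for illustration:

```python
def multiplexed_efficiency(eta_single, m):
    """Approximate per-hologram diffraction efficiency after m multiplexed
    exposures, using the 1/m^2 reduction factor cited in the abstract."""
    return eta_single / m**2

eta_single = 0.80  # assumed single-exposure diffraction efficiency
for m in (1, 2, 4):
    eta = multiplexed_efficiency(eta_single, m)
    print(f"M = {m}: efficiency per hologram ~ {eta:.3f}")
```

Even two exposures quarter the per-hologram efficiency, which is why the replication step favors a single-exposure, one-shot copy of the master hologram.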

Keywords: holographic replication, holography, one-shot copying, optical element

Procedia PDF Downloads 151
2658 Comparison of Methods of Estimation for Use in Goodness of Fit Tests for Binary Multilevel Models

Authors: I. V. Pinto, M. R. Sooriyarachchi

Abstract:

It is frequently observed that data arising in our environment have a hierarchical or nested structure. Multilevel modelling is a modern approach to handling this kind of data. When multilevel modelling is combined with a binary response, the estimation methods become complex in nature, and the usual techniques are derived from the quasi-likelihood method. The estimation methods compared in this study are marginal quasi-likelihood of orders 1 and 2 (MQL1, MQL2) and penalized quasi-likelihood of orders 1 and 2 (PQL1, PQL2). A statistical model is of no use if it does not reflect the given dataset; therefore, checking the adequacy of the fitted model through a goodness-of-fit (GOF) test is an essential stage in any modelling procedure. However, prior to usage, it is equally important to confirm that the GOF test performs well and is suitable for the given model. This study assesses the suitability of the GOF test developed for binary-response multilevel models with respect to the method used in model estimation. An extensive set of simulations was conducted using MLwiN (v 2.19) with varying numbers of clusters, cluster sizes and intra-cluster correlations. The test maintained the desirable Type-I error for models estimated using PQL2, and it failed for almost all the combinations of MQL. The power of the test was adequate for most of the combinations under all estimation methods except MQL1. Moreover, models were fitted to a real-life dataset using the four methods, and the performance of the test was compared for each model.
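The Type-I error check described above can be sketched as follows. A plain chi-square GOF test on multinomial draws stands in for the binary-multilevel GOF test, and all simulation settings are assumptions:

```python
import numpy as np
from scipy.stats import chisquare

def type1_error_rate(n_sim=2000, n=200, alpha=0.05, seed=1):
    """Sketch of a Type-I error check: repeatedly simulate data under
    the null (the model is correct), run a GOF test, and count how
    often it rejects. A well-calibrated test rejects ~alpha of the time."""
    rng = np.random.default_rng(seed)
    p_true = np.array([0.25, 0.25, 0.25, 0.25])  # null distribution
    rejections = 0
    for _ in range(n_sim):
        counts = rng.multinomial(n, p_true)       # data under the null
        _, p = chisquare(counts, f_exp=n * p_true)
        rejections += p < alpha
    return rejections / n_sim

rate = type1_error_rate()
print(f"empirical Type-I error: {rate:.3f} (nominal 0.05)")
```

In the study, the same logic is applied with data generated from binary multilevel models fitted by MQL1/MQL2/PQL1/PQL2; a method "fails" when the empirical rejection rate drifts far from the nominal level.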

Keywords: goodness-of-fit test, marginal quasi-likelihood, multilevel modelling, penalized quasi-likelihood, power, quasi-likelihood, type-I error

Procedia PDF Downloads 138
2657 Impact of Stress and Protein Malnutrition on the Potential Role of Epigallocatechin-3-Gallate in Providing Protection from Nephrotoxicity and Hepatotoxicity Induced by Aluminum in Rats

Authors: Azza A. Ali, Mona G. Khalil, Hemat A. Elariny, Shereen S. El Shaer

Abstract:

Background: Aluminium (Al) is a very abundant metal in the earth’s crust. It is a constituent of cooking utensils, medicines, cosmetics, some foods and food additives. Salts of Al are widely used in the treatment of drinking water for purification purposes. Excessive and prolonged exposure to Al causes oxidative stress and impairment of many physiological functions. Its accumulation in the liver and kidney causes hepatotoxicity and nephrotoxicity. Social isolation (SI) or protein malnutrition (PM) also increases oxidative stress and may enhance the toxicity of Al as well as the degeneration of the liver and kidney. Epigallocatechin-3-gallate (EGCG) is the most abundant catechin in green tea; it has strong antioxidant and anti-inflammatory activities and can protect against oxidative stress-induced degeneration. Objective: To study the influence of stress or PM on Al-induced nephrotoxicity and hepatotoxicity in rats, as well as on the potential role of EGCG in providing protection. Methods: Rats received daily AlCl3 (70 mg/kg, IP) for three weeks (Al-toxicity groups), except for one normal control group, which received saline. The Al-toxicity groups were divided into four treated and four untreated groups; treated rats received EGCG (10 mg/kg, IP) together with AlCl3. One group each of the treated and untreated rats served as their controls, and the others were subjected either to stress (mild, using isolation; or high, using electric shock) or to PM (10% casein diet). Specimens of liver and kidney were used for assessment of the levels of inflammatory mediators (TNF-α, IL-6β, nuclear factor kappa B (NF-κB)), oxidative stress markers (MDA, SOD, TAC, NO), caspase-3 and DNA fragmentation, as well as for histopathological examinations. Biochemical changes were also measured in the serum: total lipids, cholesterol, triglycerides, glucose, proteins, bilirubin, creatinine and urea, as well as the levels of alanine aminotransferase (ALT), aspartate aminotransferase (AST), alkaline phosphatase (ALP) and lactate dehydrogenase (LDH). Results: The nephrotoxicity and hepatotoxicity induced by Al were enhanced in rats exposed to stress and to PM; the influence of stress was more pronounced than that of PM. Al toxicity was indicated by increases in liver and kidney MDA, NO, TNF-α, IL-6β, NF-κB, caspase-3 and DNA fragmentation, and in ALT, AST, ALP, LDH, total lipids, cholesterol, triglycerides, glucose, bilirubin, creatinine and urea levels, together with decreases in total proteins, SOD and TAC. EGCG provided protection against the hazards of Al, as indicated by decreases in MDA, NO, TNF-α, IL-6β, NF-κB, caspase-3 and DNA fragmentation, as well as in the levels of ALT, AST, ALP, LDH, total lipids, cholesterol, triglycerides, glucose, bilirubin, creatinine and urea in liver and kidney, together with increases in total proteins, SOD and TAC, and was confirmed by histopathological examinations. It provided more pronounced protection in high-stress conditions than in mild ones, and in both than in PM. Conclusion: Stress has a greater adverse impact on Al-induced nephrotoxicity and hepatotoxicity than PM, which clarifies and maximizes the role of EGCG in providing protection. Consequently, administration of EGCG is advised with excessive Al exposure to avoid nephrotoxicity and hepatotoxicity, especially in populations more subject to stress or PM.

Keywords: aluminum, stress, protein malnutrition, nephrotoxicity, hepatotoxicity, epigallocatechin-3-gallate, rats

Procedia PDF Downloads 305
2656 Insulin Resistance in Early Postmenopausal Women Can Be Attenuated by Regular Practice of 12 Weeks of Yoga Therapy

Authors: Praveena Sinha

Abstract:

Context: Diabetes is a global public health burden, particularly affecting postmenopausal women. Insulin resistance (IR) is prevalent in this population and is associated with an increased risk of developing type 2 diabetes. Yoga therapy is gaining attention as a complementary intervention for diabetes due to its potential to address stress psychophysiology. This study focuses on the efficacy of a 12-week yoga practice in attenuating insulin resistance in early postmenopausal women. Research Aim: To investigate the effect of a 3-month yoga practice on insulin resistance in early postmenopausal women. Methodology: The study used a prospective longitudinal design with 67 women within five years of menopause. Participants were divided into two groups based on their willingness to join yoga. The Yoga group (n = 37) received routine gynecological management along with an integrated yoga module, while the Non-Yoga group (n = 30) received only routine management. Insulin resistance was measured using the homeostasis model assessment of insulin resistance (HOMA-IR) method before and after the intervention. Statistical analysis was performed using GraphPad Prism Version 5 software, with statistical significance set at P < 0.05. Findings: The results indicate a decrease in serum fasting insulin levels and HOMA-IR measurements in the Yoga group, although the decrease did not reach statistical significance. In contrast, the Non-Yoga group showed a significant rise in serum fasting insulin levels and HOMA-IR measurements after 3 months, suggesting a detrimental effect on insulin resistance in these postmenopausal women. Theoretical Importance: This study provides evidence that a 12-week yoga practice can attenuate the increase in insulin resistance in early postmenopausal women. It highlights the potential of yoga as a preventive measure against the early onset of insulin resistance and the development of type 2 diabetes mellitus. Regular yoga practice can be a valuable tool in addressing the hormonal imbalances associated with early postmenopause, leading to a decrease in the morbidity and mortality related to insulin resistance and type 2 diabetes mellitus in this population. Data Collection and Analysis Procedures: Data collection involved measuring serum fasting insulin levels and calculating HOMA-IR. Statistical analysis was performed using GraphPad Prism Version 5 software, and mean values with standard error of the mean were reported. The significance level was set at P < 0.05. Question Addressed: Whether a 3-month yoga practice could attenuate insulin resistance in early postmenopausal women. Conclusion: The findings support the efficacy of a 12-week yoga practice in attenuating insulin resistance in early postmenopausal women. Regular yoga practice has the potential to prevent the early onset of insulin resistance and the development of type 2 diabetes mellitus in this population. By addressing the hormonal imbalances associated with early postmenopause, yoga could significantly decrease the morbidity and mortality related to insulin resistance and type 2 diabetes mellitus in these subjects.
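The HOMA-IR index used in the study is computed from two fasting measurements; the sketch below uses the standard formula with illustrative values, not the study's data:

```python
def homa_ir(fasting_glucose_mmol_l, fasting_insulin_uU_ml):
    """Standard HOMA-IR formula:
    (fasting glucose [mmol/L] x fasting insulin [uU/mL]) / 22.5."""
    return fasting_glucose_mmol_l * fasting_insulin_uU_ml / 22.5

# Illustrative values (glucose 5.5 mmol/L, insulin 10 uU/mL)
print(round(homa_ir(5.5, 10.0), 2))  # 2.44
```

Comparing each participant's index before and after the 12-week intervention is what the pre/post analysis in the study amounts to.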

Keywords: post menopause, insulin resistance, HOMA-IR, yoga, type 2 diabetes mellitus

Procedia PDF Downloads 64
2655 Virtual Simulation as a Teaching Method for Community Health Nursing: An Investigation of Student Performance

Authors: Omar Mayyas

Abstract:

Clinical decision-making (CDM) is essential to community health nursing (CHN) education. Nursing educators are responsible for developing these skills among nursing students, because nursing students are exposed to highly critical conditions after graduation. However, due to limited exposure to real-world situations, many nursing students need help developing clinical decision-making skills in this area. Therefore, the impact of virtual simulation (VS) on community health nursing students' clinical decision-making has to be investigated. This study aims to examine the difference in CDM ability between CHN students who received traditional education and those who received VS classes, to identify the factors that may influence this difference, and to provide recommendations for educational programs that can enhance the CDM ability of CHN students and improve the quality of care provided in community settings. A mixed-method study will be conducted. A randomized controlled trial will compare the CDM ability of CHN students who receive a 1-hour traditional class with another group who receive a 1-hour VS scenario about nursing care for a diabetic patient. Sixty-four students per group will be randomly selected from undergraduate nursing students who completed the CHN course at York University. The participants will receive the same Clinical Decision Making in Nursing Scale (CDMNS) questionnaire. The study intervention will follow the Medical Research Council (MRC) approach. SPSS and content analysis will be used for data analysis.

Keywords: clinical decision-making, virtual simulation, community health nursing students, community health nursing education

Procedia PDF Downloads 65
2654 Clinical Application of Measurement of Eyeball Movement for Diagnose of Autism

Authors: Ippei Torii, Kaoruko Ohtani, Takahito Niwa, Naohiro Ishii

Abstract:

This paper presents the development of an objective index for diagnosing autism based on measurements of subtle eyeball movement. Assessments of developmental disabilities vary, and the diagnosis depends on the subjective judgment of professionals; a supplementary inspection method that enables anyone to obtain the same quantitative judgment is therefore needed. Conventional autism studies base the diagnosis on a comparison of the time spent gazing at an object, but the results do not match. First, we divided the pupil into four parts from the center using measurements of subtle eyeball movement and compared the number of pixels in the parts overlapping an afterimage. We then developed an objective evaluation indicator that distinguishes non-autistic and autistic people more clearly than conventional methods by analyzing the differences in subtle eyeball movements between the right and left eyes. Even when a person gazes at one point and the eyeballs appear fixed on that point, the eyes perform subtle fixational movements (i.e., tremor, drift, and microsaccades) to keep the retinal image clear. Microsaccades in particular are linked to nerves and reflect the mechanism by which the brain processes sight. We converted the differences between these movements into numbers. The conversion proceeds as follows: 1) Select the pixels representing the subject's pupil from the captured frame images. 2) Set up a reference image, the afterimage, from the pixels representing the subject's pupil. 3) Divide the pupil of the subject into four parts from the center in the acquired frame image. 4) Select the pixels in each divided part and count the number of pixels that overlap the afterimage. 5) Process the images at 24-30 fps from a camera and convert the amount of change in the pixels of the subtle movements of the right and left eyeballs into numbers.
The difference in the area of change is obtained by measuring the difference between the afterimage in consecutive frames and the present frame; we take this amount of change as the quantity of subtle eyeball movement. This method made it possible to express changes in eyeball vibration as numerical values. By comparing the values for the right and left eyes, we found a difference in how much each moves. We compared this difference between non-autistic and autistic people and analyzed the result. Our research subjects consisted of 8 children and 10 adults with autism, and 6 children and 18 adults with no disability. We measured the values during pursuit movements and fixations, converted the difference in subtle movements between the right and left eyes into a graph, and defined it as a multidimensional measure. We then set the identification border using the density function of the distribution, the cumulative frequency function, and the ROC curve. With this, we established an objective index to determine autism, normal, false positive, and false negative.
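Steps 3-4 of the procedure above (quadrant division and overlap counting) can be sketched with boolean pixel masks. This is a minimal illustration under the assumption that the pupil region has already been segmented into a cropped binary mask; the function name and the assumption that the pupil is centred in the crop are mine, not the authors':

```python
import numpy as np

def quadrant_overlap(afterimage: np.ndarray, frame: np.ndarray) -> list:
    """Count pupil pixels that overlap the afterimage in each of the four
    quadrants around the pupil centre (steps 3-4 of the procedure).

    Both inputs are boolean masks of identical shape in which True marks
    a pixel classified as pupil; the pupil is assumed centred in the crop.
    """
    h, w = afterimage.shape
    cy, cx = h // 2, w // 2
    overlap = afterimage & frame          # pixels present in both images
    return [
        int(overlap[:cy, :cx].sum()),     # top-left quadrant
        int(overlap[:cy, cx:].sum()),     # top-right quadrant
        int(overlap[cy:, :cx].sum()),     # bottom-left quadrant
        int(overlap[cy:, cx:].sum()),     # bottom-right quadrant
    ]

# Step 5: the frame-to-frame change in these four counts, computed for the
# left and right eyes separately, quantifies the subtle eyeball movement.
```

Tracking how these four counts change between consecutive frames, separately for each eye, yields the left-right difference signal that the index is built from.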

Keywords: subtle eyeball movement, autism, microsaccade, pursuit eye movements, ROC curve

Procedia PDF Downloads 276
2653 Investigating Kinetics and Mathematical Modeling of Batch Clarification Process for Non-Centrifugal Sugar Production

Authors: Divya Vats, Sanjay Mahajani

Abstract:

The clarification of sugarcane juice plays a pivotal role in the production of non-centrifugal sugar (NCS), profoundly influencing the quality of the final product. In this study, we investigated the kinetics and mathematical modeling of the batch clarification process. The turbidity of the clarified cane juice (NTU) determines the color of the end product; this parameter also underscores the importance of considering other variables as performance indicators for assessing the efficacy of the clarification process. Temperature-controlled experiments were conducted in a laboratory-scale batch mode, with the primary objective of identifying and optimizing the parameters crucial for augmenting the clarity of cane juice. We also explored the impact of pH and flocculant loading on the kinetics. Particle Image Velocimetry (PIV) was employed to understand particle-particle and fluid-particle interactions. This technique provided the insight needed for the subsequent multiphase computational fluid dynamics (CFD) simulations using the Eulerian-Lagrangian approach in Ansys Fluent, which reproduced comparable velocity profiles. The mechanism identified in this study underpins a mathematical model and presents a valuable framework for transitioning from the traditional batch process to a continuous process, with the ultimate aim of attaining heightened productivity and unwavering consistency in product quality.
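One common way to extract a rate constant from batch turbidity data of this kind is to fit a first-order decay of NTU toward a residual value. The abstract does not state which rate law the authors obtained, so the functional form, parameter values, and synthetic data below are illustrative assumptions only:

```python
import numpy as np
from scipy.optimize import curve_fit

def first_order_clarification(t, ntu_inf, ntu0, k):
    """First-order decay of turbidity toward a residual value ntu_inf.

    NTU(t) = ntu_inf + (ntu0 - ntu_inf) * exp(-k * t)
    This rate law is an assumption for illustration, not the study's model.
    """
    return ntu_inf + (ntu0 - ntu_inf) * np.exp(-k * t)

# Synthetic turbidity readings (NTU) over time (min), for demonstration only.
t = np.linspace(0.0, 60.0, 13)
ntu = first_order_clarification(t, 40.0, 900.0, 0.12)

# Fit the model to the (here noiseless) data to recover the rate constant.
popt, _ = curve_fit(first_order_clarification, t, ntu, p0=(30.0, 850.0, 0.1))
ntu_inf_fit, ntu0_fit, k_fit = popt
print(f"fitted rate constant k = {k_fit:.3f} per min")
```

Repeating such a fit at several temperatures, pH values, and flocculant loadings would give the parameter dependence needed for the batch-to-continuous scale-up model the authors describe.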

Keywords: non-centrifugal sugar, particle image velocimetry, computational fluid dynamics, mathematical modeling, turbidity

Procedia PDF Downloads 69