Search results for: GHRM performance appraisal

2933 Crashworthiness Optimization of an Automotive Front Bumper in Composite Material

Authors: S. Boria

Abstract:

In recent years, the crashworthiness of an automotive body structure can be improved from the very beginning of the design stage thanks to the development of specific optimization tools. It is well known that finite element codes can help the designer investigate the crashing performance of structures under dynamic impact. Therefore, by coupling nonlinear mathematical programming procedures and statistical techniques with FE simulations, it is possible to optimize the design with a reduced number of analytical evaluations. In engineering applications, many optimization methods that are based on statistical techniques and utilize estimated models, called meta-models, are quickly spreading. A meta-model is an approximation of a detailed simulation model based on a dataset of inputs identified by the design of experiments (DOE); the number of simulations needed to build it depends on the number of variables. Among the various types of meta-modeling techniques, the Kriging method appears excellent in accuracy, robustness, and efficiency compared to the others when applied to crashworthiness optimization. Such a meta-model was therefore applied in this work in order to improve the structural optimization of a bumper for a racing car in composite material subjected to frontal impact. The specific energy absorption represents the objective function to maximize, and the geometrical parameters, subject to some design constraints, are the design variables. The LS-DYNA code was interfaced with the LS-OPT tool in order to find the optimized solution through the use of a domain reduction strategy. With the use of the Kriging meta-model, the crashworthiness characteristics of the composite bumper were improved.
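
As an illustration of the surrogate-based workflow described above, the following minimal Python sketch fits a Kriging (Gaussian-process) meta-model to a hypothetical DOE over two geometric variables and searches the constrained domain for the predicted maximum of the specific energy absorption. The variable names, sample values, and bounds are illustrative assumptions, not data from the study (which used LS-DYNA and LS-OPT).

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

# Hypothetical DOE: [wall thickness (mm), taper angle (deg)] -> SEA (kJ/kg).
# Values are illustrative placeholders, not results from the paper.
X = np.array([[1.0, 5.0], [1.5, 7.5], [2.0, 10.0], [2.5, 12.5], [3.0, 15.0]])
y = np.array([18.0, 22.5, 27.0, 29.5, 28.0])

# Kriging meta-model (Gaussian process with a Matern kernel).
gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True).fit(X, y)

# Exhaustive search of the box-constrained design domain for the predicted SEA maximum.
t, a = np.meshgrid(np.linspace(1.0, 3.0, 41), np.linspace(5.0, 15.0, 41))
candidates = np.column_stack([t.ravel(), a.ravel()])
best = candidates[np.argmax(gp.predict(candidates))]
print(f"predicted optimum: thickness = {best[0]:.2f} mm, taper angle = {best[1]:.2f} deg")
```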

Keywords: composite material, crashworthiness, finite element analysis, optimization

Procedia PDF Downloads 257
2932 The Expression of Lipoprotein Lipase Gene with Fat Accumulations and Serum Biochemical Levels in Betong (KU Line) and Broiler Chickens

Authors: W. Loongyai, N. Saengsawang, W. Danvilai, C. Kridtayopas, P. Sopannarath, C. Bunchasak

Abstract:

Betong chicken is a slow-growing and lean strain of chicken, while the rapid growth of the broiler is accompanied by increased fat. We investigated the growth performance, fat accumulation, serum lipid biochemical levels, and lipoprotein lipase (LPL) gene expression of female Betong (KU line) chickens at the ages of 4 and 6 weeks. A total of 80 female Betong chickens (KU line) and 80 female broiler chickens were reared under an open system (each group had 4 replicates of 20 chicks per pen). The results showed that feed intake and average daily gain (ADG) of broiler chickens were significantly higher than those of Betong (KU line) (P < 0.01), while the feed conversion ratio (FCR) of Betong (KU line) was significantly lower than that of broiler chickens (P < 0.01) at 6 weeks. At 4 and 6 weeks, two birds per replicate were randomly selected and slaughtered. Carcass weight did not significantly differ between treatments; the percentages of abdominal fat and subcutaneous fat yield were higher in the broiler (P < 0.01) at 4 and 6 weeks. Total cholesterol and LDL levels of the broiler were higher than those of Betong (KU line) at 4 and 6 weeks (P < 0.05). Abdominal fat samples were collected for total RNA extraction. The cDNA was amplified using primers specific for the LPL gene and analysed using real-time PCR. The results showed that LPL gene expression did not differ between Betong (KU line) and broiler chickens at the ages of 4 and 6 weeks (P > 0.05). Our results indicated that broiler chickens had a high growth rate and fat accumulation when compared with Betong (KU line) chickens, whereas LPL gene expression did not differ between breeds.

Keywords: lipoprotein lipase gene, Betong (KU line), broiler, abdominal fat, gene expression

Procedia PDF Downloads 186
2931 Eco-Friendly Approach in the Management of Stored Sorghum Insect Pests in Small-Scale Farmers’ Storage Structures of Northern Nigeria

Authors: Mohammed Suleiman, Ibrahim Sani, Samaila Abubakar, Kabir Abdullahi Bindawa

Abstract:

Farmers’ storage structures in Pauwa village of Katsina State, Northern Nigeria, were simulated and incorporated with the application of leaf powders of Euphorbia balsamifera Aiton, Lawsonia inermis L., Mitracarpus hirtus (L.) DC. and Senna obtusifolia L. to search for more eco-friendly methods of managing insect pests of stored sorghum. The four most commonly grown sorghum varieties in the study area, namely “Farar Kaura” (FK), “Jar Kaura” (JK), “Yar Gidan Daudu” (YGD), and ICSV400 in threshed forms were used for the study. The four varieties (2.50 kg each) were packed in small polypropylene bags, mixed with the leaf powders at the concentration of 5% (w/w) of the plants, and kept in small stores of the aforementioned village for 12 weeks. Insect pests recovered after 12 weeks were Sitophilus zeamais, Rhyzopertha dominica, Tribolium castaneum, Cryptolestes ferrugineus, and Oryzaephilus surinamensis. There were significantly fewer insect pests in treated sorghum than in untreated types (p < 0.05). More weight losses were recorded in untreated grains than in those treated with the botanical powders. In terms of varieties, grain weight losses were in the order FK > JK > YGD > ICSV400. The botanicals also showed significant (p < 0.05) protectant ability against the weevils with their performance as E. balsamifera > L. inermis > M. hirtus > S. obtusifolia.

Keywords: botanical powders, infestations, insect pests, management, sorghum varieties, storage structures, weight losses

Procedia PDF Downloads 103
2930 Binderless Naturally-Extracted Metal-Free Electrocatalyst for Efficient NOₓ Reduction

Authors: Hafiz Muhammad Adeel Sharif, Tian Li, Changping Li

Abstract:

Recently, the emission of nitrogen and sulphur oxides (NOₓ, SO₂) has become a global issue, posing serious threats to health and the environment. Catalytic reduction of NOₓ and SOₓ gases into environmentally benign gases is considered one of the best approaches. However, regeneration of the catalyst, the high bond-dissociation energy of NOₓ (150.7 kcal/mol), the escape of intermediate gas (N₂O, a greenhouse gas) with the treated flue gas, and the limited activity of the catalyst remain great challenges. Here, a cheap, binderless, naturally extracted bass-wood thin carbon electrode (TCE) is presented, which shows excellent catalytic activity towards NOₓ reduction. The bass wood was carbonized at 900 °C and then thermally activated at 750 °C in the presence of CO₂ gas. The thermal activation increased the epoxy groups on the surface of the TCE and enhanced the surface area as well as the degree of graphitization. The TCE possesses a unique, strongly interconnected 3D network of hierarchical micro/meso/macropores that provides a large electrode/electrolyte interface. Owing to these characteristics, the TCE exhibited excellent catalytic efficiency towards NOₓ (~83.3%) under ambient conditions, an enhanced catalytic response under pH variation and sulphite exposure, and excellent stability up to 168 hours. Moreover, a temperature-dependent activity trend was found in which the highest catalytic activity was achieved at 80 °C; beyond this temperature, the electrolyte became evaporative and performance decreased. The designed electrocatalyst shows great potential for effective NOₓ reduction and is highly cost-effective, green, and sustainable.

Keywords: electrocatalyst, NOx-reduction, bass-wood electrode, integrated wet-scrubbing, sustainable

Procedia PDF Downloads 78
2929 Enhancement of Natural Convection Heat Transfer within Closed Enclosure Using Parallel Fins

Authors: F. A. Gdhaidh, K. Hussain, H. S. Qi

Abstract:

A numerical study of natural convection heat transfer in a water-filled cavity is examined in 3D for a single-phase liquid cooling system using an array of parallel plate fins mounted on one wall of the cavity. The heat generated by a heat source represents a computer CPU with dimensions of 37.5×37.5 mm mounted on a substrate. A cold plate is used as a heat sink installed on the opposite vertical end of the enclosure. The air flow inside the computer case is created by an exhaust fan. A turbulent air flow is assumed, and the k-ε model is applied. The fins are installed on the substrate to enhance the heat transfer. The applied power ranges between 15 and 40 W. In order to determine the thermal behaviour of the cooling system, the effects of the heat input and the number of parallel plate fins are investigated. The results illustrate that as the fin number increases, the maximum heat source temperature decreases. However, when the fin number increases beyond a critical value, the temperature starts to increase because the fins are too closely spaced, which obstructs the water flow. The introduction of parallel plate fins reduces the maximum heat source temperature by 10% compared to the case without fins. The cooling system maintains the maximum chip temperature at 64.68 °C at a heat input of 40 W, which is much lower than the recommended chip limit temperature of no more than 85 °C, and hence the performance of the CPU is enhanced.

Keywords: chips limit temperature, closed enclosure, natural convection, parallel plate, single phase liquid

Procedia PDF Downloads 267
2928 Linux Security Management: Research and Discussion on Problems Caused by Different Aspects

Authors: Ma Yuzhe, Burra Venkata Durga Kumar

Abstract:

The computer is a great invention. As people use computers more and more frequently, the demand for PCs is growing, and the performance of computer hardware keeps rising to handle more complex processing and operation. However, the operating system, which provides the soul of the computer, stagnated for a time. Faced with the high price of UNIX (Uniplexed Information and Computing System), many personal computer owners could only give up. The Disk Operating System was too simple and left little room for innovation, so it was not a good choice either. MacOS is a dedicated operating system for Apple computers and cannot be widely used on other personal computers. In this environment, Linux, based on the UNIX system, was born. Linux combines the advantages of earlier operating systems and has a modular kernel architecture that is relatively powerful at its core. The Linux system supports all Internet protocols, so it has very good networking functionality. Linux supports multiple users, and each user's files are isolated from those of other users. Linux can also multitask, running different programs independently at the same time. Linux is a completely open-source operating system, and users can obtain and modify the source code for free. Because of these advantages, Linux has attracted a large number of users and programmers, and the system is constantly upgraded and improved, with many different versions issued for community and commercial use. The Linux system has good security because it relies on its file permission and partition mechanisms. However, as vulnerabilities and hazards are constantly discovered, the security of the operating system in use also needs continued attention. This article focuses on the analysis and discussion of Linux security issues.

Keywords: Linux, operating system, system management, security

Procedia PDF Downloads 111
2927 Project Time and Quality Management during Construction

Authors: Nahed Al-Hajeri

Abstract:

Time and cost are integral parts of every construction plan and can affect each party's contractual obligations. The performance on both time and cost is usually important to the client and the contractor during the project. Almost all construction projects experience time overrun, and these overruns are costly to both client and contractor. Construction of any project inside the gathering centers involves complex management skills related to workforce, materials, plant, machinery, new technologies, etc. It also involves many interdependent agencies, such as vendors and structural and functional designers, including various types of specialized engineers, and it requires the support of contractors and specialized contractors. This paper mainly highlights the types of construction delays due to which projects suffer time and cost overrun. It also discusses the delay causes and factors that contribute to construction sequence delay in oil and gas projects. Construction delay is one of the most recurrent problems in construction projects and has an adverse effect on project success in terms of time, cost, and quality. Some effective methods are identified to minimize delays in construction projects, such as: 1. site management and supervision, 2. effective strategic planning, 3. a clear information and communication channel. Our research paper studies the types of delay with some real examples and statistical results and suggests solutions to overcome this problem.

Keywords: non-compensable delay, delays caused by force majeure, compensable delay, delays caused by the owner or the owner’s representative, non-excusable delay, delay caused by the contractor or the contractor’s representative, concurrent delay, delays resulting from two separate causes at the same time

Procedia PDF Downloads 243
2926 Statistical Modeling of Local Area Fading Channels Based on Triply Stochastic Filtered Marked Poisson Point Processes

Authors: Jihad Daba, Jean-Pierre Dubois

Abstract:

Multipath fading noise degrades the performance of cellular communication, most notably in femto- and pico-cells in 3G and 4G systems. When the wireless channel consists of a small number of scattering paths, the statistics of the fading noise are not analytically tractable and pose a serious challenge to developing closed canonical forms that can be analysed and used in the design of efficient and optimal receivers. In this context, the noise is multiplicative and is referred to as stochastically local fading. In many analytical investigations of multiplicative noise, exponential or Gamma statistics are invoked. More recent advances by the author of this paper have utilized Poisson-modulated and weighted generalized Laguerre polynomials with controlling parameters and uncorrelated noise assumptions. In this paper, we investigate the statistics of a multi-diversity, stochastically local area fading channel when the channel consists of randomly distributed Rayleigh and Rician scattering centers with a coherent specular Nakagami-distributed line-of-sight component and an underlying doubly stochastic Poisson process driven by a lognormal intensity. These combined statistics form a unifying triply stochastic filtered marked Poisson point process model.
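
A minimal Python sketch of the channel model described above, drawing fading-envelope samples from a Poisson number of diffuse (Rayleigh) scatterers whose intensity is lognormally modulated, plus a Nakagami-distributed coherent specular component. All numeric parameters are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(42)

def envelope_sample(mean_paths=8.0, sigma_ln=0.5, m=2.0, omega=1.0):
    """One fading-envelope sample from the doubly stochastic scattering model."""
    lam = rng.lognormal(mean=np.log(mean_paths), sigma=sigma_ln)  # lognormal intensity
    n = rng.poisson(lam)                                          # Poisson number of scatterers
    # Diffuse marks: zero-mean complex Gaussian terms -> Rayleigh amplitudes, uniform phases.
    diffuse = rng.normal(0, 1 / np.sqrt(2), n) + 1j * rng.normal(0, 1 / np.sqrt(2), n)
    # Coherent specular (line-of-sight) term with Nakagami-m amplitude.
    los = np.sqrt(rng.gamma(shape=m, scale=omega / m))
    return abs(los + diffuse.sum())

samples = np.array([envelope_sample() for _ in range(10000)])
print("mean envelope:", samples.mean(), " envelope variance:", samples.var())
```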

Keywords: cellular communication, femto and pico-cells, stochastically local area fading channel, triply stochastic filtered marked Poisson point process

Procedia PDF Downloads 449
2925 Simulation and Controller Tuning in a Photo-Bioreactor Applying the Taguchi Method

Authors: Hosein Ghahremani, MohammadReza Khoshchehre, Pejman Hakemi

Abstract:

This study involves numerical simulation of a vertical plate-type photo-bioreactor to investigate the performance of the microalga Spirulina, together with control and optimization of the digital controller parameters by the Taguchi method using MATLAB and Qualitek-4 software. Because parameters such as temperature, dissolved carbon dioxide, biomass, and so on, as well as physical parameters such as light intensity and physiological conditions such as photosynthetic efficiency and light inhibition, are involved in the biological process, control faces many challenges. Photo-bioreactors are efficient systems that not only facilitate the commercial production of microalgae as feed for aquaculture and as food supplements but also serve as a possible platform for the production of active molecules such as antibiotics or innovative anti-tumor agents, for carbon dioxide removal, and for the removal of heavy metals from wastewater. A digital controller is designed to control the light of the photo-bioreactor, and the microalgae growth rate and the carbon dioxide concentration inside the bioreactor are investigated. The optimal values of the controller parameters, obtained from the S/N and ANOVA analyses in Qualitek-4, were compared with those obtained by the reaction-curve, Cohen-Coon, and Ziegler-Nichols methods. Based on the sum of squared errors obtained for each of these control methods, the Taguchi method was selected as the best method for controlling the light intensity of the photo-bioreactor. Compared with the other listed control methods, it showed higher stability and a shorter response time.
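
As a hedged sketch of the comparison step, the snippet below computes the smaller-the-better Taguchi signal-to-noise ratio and the sum of squared errors for hypothetical closed-loop error records of each tuning method; the error traces are placeholders, not the study's simulation output.

```python
import numpy as np

# Hypothetical light-intensity tracking-error records (setpoint minus measurement)
# for each tuning method; replace with the simulated responses of the bioreactor model.
errors = {
    "Taguchi":         np.array([0.10, 0.05, 0.02, 0.01, 0.01]),
    "Reaction curve":  np.array([0.30, 0.20, 0.12, 0.08, 0.05]),
    "Cohen-Coon":      np.array([0.25, 0.15, 0.10, 0.06, 0.04]),
    "Ziegler-Nichols": np.array([0.40, 0.22, 0.15, 0.09, 0.06]),
}

for method, e in errors.items():
    sse = float(np.sum(e**2))              # sum of squared errors
    sn = -10.0 * np.log10(np.mean(e**2))   # smaller-the-better Taguchi S/N ratio (dB)
    print(f"{method:16s}  SSE = {sse:.4f}   S/N = {sn:.2f} dB")
```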

Keywords: photo-bioreactor, control and optimization, Light intensity, Taguchi method

Procedia PDF Downloads 395
2924 Modal Analysis of Functionally Graded Materials Plates Using Finite Element Method

Authors: S. J. Shahidzadeh Tabatabaei, A. M. Fattahi

Abstract:

Modal analysis of an FGM plate composed of an Al2O3 ceramic phase and a 304 stainless steel metal phase was performed in this paper using ABAQUS software, with the assumption that the material behaviour is elastic and that the mechanical properties (Young's modulus and density) vary in the thickness direction of the plate. Therefore, a sub-program was written in the FORTRAN programming language and linked with ABAQUS. For the modal analysis, a finite element analysis was carried out similar to the models of other researchers, and the accuracy of the results was evaluated by comparison. The comparison of natural frequencies and mode shapes confirmed the consistency of the results, the proper performance of the FORTRAN sub-program, and the high accuracy of the finite element model used in this research. After validation of the results, the effect of the material gradation (the n parameter) on the natural frequency was evaluated. In this regard, the finite element analysis was carried out for different values of n under simply supported conditions. Regarding the effect of the n parameter, which indicates the influence of the material composition on the natural frequency, it was observed that the natural frequency decreased as n increased; by increasing n, the share of the ceramic phase in the FGM plate decreases and the share of the steel phase increases, which reduces the stiffness of the FGM plate and thereby the natural frequency. This is because the Young's modulus of the Al2O3 ceramic is 380 GPa, whereas the Young's modulus of SUS304 steel is 207 GPa.
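
A small Python sketch of the through-thickness property gradation discussed above, assuming the common power-law rule of mixtures E(z) = (E_c − E_m)(z/h + 1/2)^n + E_m with the ceramic and metal moduli quoted in the abstract; the power-law form and the plate thickness are illustrative assumptions about the grading scheme.

```python
import numpy as np

E_CERAMIC = 380.0e9   # Al2O3 Young's modulus (Pa), from the abstract
E_METAL = 207.0e9     # SUS304 Young's modulus (Pa), from the abstract

def fgm_modulus(z, h, n):
    """Power-law effective Young's modulus at height z (-h/2 <= z <= h/2) through a plate of thickness h."""
    volume_fraction_ceramic = (z / h + 0.5) ** n
    return (E_CERAMIC - E_METAL) * volume_fraction_ceramic + E_METAL

h = 0.02  # plate thickness (m), illustrative
for n in (0.5, 1.0, 2.0, 5.0):
    z = np.linspace(-h / 2, h / 2, 101)
    mean_E = fgm_modulus(z, h, n).mean()  # thickness-averaged stiffness
    print(f"n = {n:3.1f}: average E = {mean_E / 1e9:6.1f} GPa")  # drops toward 207 GPa as n grows
```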

Keywords: FGM plates, modal analysis, natural frequency, finite element method

Procedia PDF Downloads 392
2923 A Cost Effective Approach to Develop Mid-Size Enterprise Software Adopted the Waterfall Model

Authors: Mohammad Nehal Hasnine, Md Kamrul Hasan Chayon, Md Mobasswer Rahman

Abstract:

Organizational tendencies towards computer-based information processing have been observed noticeably in third-world countries. Many enterprises are taking major initiatives towards a computerized working environment because of the massive benefits of computer-based information processing. However, designing and developing information resource management software for small and mid-size enterprises under tight budgets and strict deadlines is always challenging for software engineers. Therefore, we introduce an approach to designing mid-size enterprise software in a cost-effective way using the Waterfall model, one of the software development life cycle (SDLC) models. To fulfil the research objectives, in this study we developed mid-sized enterprise software named "BSK Management System" that assists enterprise software clients with information resource management and with performing complex organizational tasks. The Waterfall model phases have been applied to ensure that all functions, user requirements, strategic goals, and objectives are met. In addition, a Rich Picture, Structured English, and a Data Dictionary have been implemented and investigated properly in an engineering manner. Furthermore, an assessment survey with 20 participants was conducted to investigate the usability and performance of the proposed software. The survey results indicated that our system features simple interfaces, easy operation and maintenance, quick processing, and reliable and accurate transactions.

Keywords: end-user application development, enterprise software design, information resource management, usability

Procedia PDF Downloads 440
2922 Bilingualism Contributes to Cognitive Reserve in Parkinson's Disease

Authors: Arrate Barrenechea Garro

Abstract:

Background: Bilingualism has been shown to enhance cognitive reserve and potentially delay the onset of dementia symptoms. This study investigates the impact of bilingualism on cognitive reserve and the age of diagnosis in Parkinson's Disease (PD). Methodology: The study involves 16 non-demented monolingual PD patients and 12 non-demented bilingual PD patients, matched for age, sex, and years of education. All participants are native Spanish speakers, with Spanish as their first language (L1). Cognitive performance is assessed through a neuropsychological examination covering all cognitive domains. Cognitive reserve is measured using the Cognitive Reserve Index Questionnaire (CRIq), while language proficiency is evaluated using the Bilingual Language Profile (BLP). The age at diagnosis is recorded for both monolingual and bilingual patients. Results: Bilingual PD patients demonstrate higher scores on the CRIq compared to monolingual PD patients, with significant differences between the groups. Furthermore, there is a positive correlation between cognitive reserve (CRIq) and the utilization of the second language (L2) as indicated by the BLP. Bilingual PD patients are diagnosed, on average, three years later than monolingual PD patients. Conclusion: Bilingual PD patients exhibit higher levels of cognitive reserve compared to monolingual PD patients, as indicated by the CRIq scores. The utilization of the second language (L2) is positively correlated with cognitive reserve. Bilingual PD patients are diagnosed with PD, on average, three years later than monolingual PD patients. These findings suggest that bilingualism may contribute to cognitive reserve and potentially delay the onset of clinical symptoms associated with PD. This study adds to the existing literature supporting the relationship between bilingualism and cognitive reserve. Further research in this area could provide valuable insights into the potential protective effects of bilingualism in neurodegenerative disorders.

Keywords: bilingualism, cognitive reserve, diagnosis, Parkinson's disease

Procedia PDF Downloads 103
2921 Classifier for Liver Ultrasound Images

Authors: Soumya Sajjan

Abstract:

Liver cancer is among the most common cancers worldwide in men and women and is one of the few cancers still on the rise, and liver disease is the fourth leading cause of death. According to new NHS (National Health Service) figures, deaths from liver diseases have reached record levels, rising by 25% in less than a decade; heavy drinking, obesity, and hepatitis are believed to be behind the rise. In this study, we focus on the development of a diagnostic classifier for ultrasound liver lesions. Ultrasound (US) sonography is an easy-to-use and widely popular imaging modality because of its ability to visualize many human soft tissues and organs without any harmful effect. This paper provides an overview of the underlying concepts, along with algorithms for processing liver ultrasound images. Naturally, ultrasound liver lesion images contain considerable speckle noise, so developing a classifier for them is a challenging task. We approach it with a fully automatic machine learning system. First, we segment the liver image and calculate textural features from the co-occurrence matrix and the run-length method. For classification, a Support Vector Machine (SVM) is used, based on the risk bounds of statistical learning theory. The textural features from the different feature methods are given as input to the SVM individually. Performance analysis on training and test datasets is carried out separately using the SVM model. Whenever an ultrasonic liver lesion image is given to the SVM classifier system, the features are calculated and the lesion is classified as normal or diseased. We hope the results will help physicians identify liver cancer by a non-invasive method.
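
A minimal sketch of the texture-feature/SVM pipeline described above, using GLCM (co-occurrence) features from scikit-image and an SVM from scikit-learn; the random patches stand in for segmented liver regions and the labels are placeholders, not clinical data.

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops
from sklearn.svm import SVC

rng = np.random.default_rng(0)

def glcm_features(patch):
    """Contrast, homogeneity, energy and correlation from the grey-level co-occurrence matrix."""
    glcm = graycomatrix(patch, distances=[1], angles=[0], levels=256, symmetric=True, normed=True)
    return [graycoprops(glcm, p)[0, 0] for p in ("contrast", "homogeneity", "energy", "correlation")]

# Placeholder data: random 8-bit patches standing in for segmented liver regions,
# with hypothetical labels (0 = normal, 1 = diseased).
patches = [rng.integers(0, 256, size=(64, 64), dtype=np.uint8) for _ in range(40)]
labels = rng.integers(0, 2, size=40)

X = np.array([glcm_features(p) for p in patches])
clf = SVC(kernel="rbf").fit(X[:30], labels[:30])           # train on the first 30 patches
print("test accuracy:", clf.score(X[30:], labels[30:]))    # evaluate on the held-out patches
```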

Keywords: segmentation, support vector machine, ultrasound liver lesion, co-occurrence matrix

Procedia PDF Downloads 413
2920 Design and Testing of Electrical Capacitance Tomography Sensors for Oil Pipeline Monitoring

Authors: Sidi M. A. Ghaly, Mohammad O. Khan, Mohammed Shalaby, Khaled A. Al-Snaie

Abstract:

Electrical capacitance tomography (ECT) is a valuable, non-invasive technique used to monitor multiphase flow processes, especially within industrial pipelines. This study focuses on the design, testing, and performance comparison of ECT sensors configured with 8, 12, and 16 electrodes, aiming to evaluate their effectiveness in imaging accuracy, resolution, and sensitivity. Each sensor configuration was designed to capture the spatial permittivity distribution within a pipeline cross-section, enabling visualization of phase distribution and flow characteristics such as oil and water interactions. The sensor designs were implemented and tested in closed pipes to assess their response to varying flow regimes. Capacitance data collected from each electrode configuration were reconstructed into cross-sectional images, enabling a comparison of image resolution, noise levels, and computational demands. Results indicate that the 16-electrode configuration yields higher image resolution and sensitivity to phase boundaries compared to the 8- and 12-electrode setups, making it more suitable for complex flow visualization. However, the 8 and 12-electrode sensors demonstrated advantages in processing speed and lower computational requirements. This comparative analysis provides critical insights into optimizing ECT sensor design based on specific industrial requirements, from high-resolution imaging to real-time monitoring needs.
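
A hedged sketch of the image-reconstruction step common to ECT systems: linear back-projection of normalized inter-electrode capacitances through a sensitivity matrix. The sensitivity matrix and measurement vectors here are random and purely illustrative; in practice they come from an electromagnetic model of the sensor geometry and from calibration measurements.

```python
import numpy as np

rng = np.random.default_rng(1)

n_electrodes = 12
n_pairs = n_electrodes * (n_electrodes - 1) // 2   # independent electrode pairs
n_pixels = 32 * 32                                 # cross-section image grid

# Illustrative sensitivity matrix (pair x pixel); a real one is computed from the sensor model.
S = rng.random((n_pairs, n_pixels))

# Calibration (low/high permittivity) and measurement vectors: placeholders for capacitance data.
c_low, c_high = rng.random(n_pairs), rng.random(n_pairs) + 1.0
c_meas = c_low + rng.random(n_pairs) * (c_high - c_low)

lam = (c_meas - c_low) / (c_high - c_low)              # normalized capacitances in [0, 1]
g = (S.T @ lam) / (S.T @ np.ones(n_pairs))             # linear back-projection, normalized
image = g.reshape(32, 32)                              # estimated permittivity distribution
print("reconstructed image range:", image.min(), image.max())
```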

Keywords: capacitance tomography, modeling, simulation, electrode, permittivity, fluid dynamics, imaging sensitivity measurement

Procedia PDF Downloads 14
2919 Evaluation of Critical Rate in Mature Oil Field with Dynamic Oil Rim Fluid Contacts in the Niger Delta

Authors: Stanley Ibuchukwu Onwukwe

Abstract:

Most reservoirs in mature oil fields are vulnerable to water and/or gas coning as the size of their oil column reduces over a long period of oil production. This often results in low oil production and excessive water and/or gas production. After more than 50 years of oil production in the Niger Delta, it is apparent that most of the oil fields in the region have reached their mature stages and are therefore susceptible to coning tendencies. As a result, a good number of wells have been shut in and abandoned, with a significant amount of oil left unproduced. Analysis of the movement of fluid contacts in the reservoir is a significant aspect of reservoir studies and can assist in the management of coning tendencies and the production performance of reservoirs in a mature field. This study therefore seeks to evaluate the occurrence of coning through the movement of the fluid contacts (GOC and OWC) and to determine the critical rate for controlling coning tendencies in a mature oil field. The study applies the principle of nodal analysis to calibrate the thin oil column of a mature-field reservoir, which was then graphically evaluated using Joshi's critical-rate equations for the gas-oil and oil-water systems, respectively. A representative proxy equation was developed, and a sensitivity analysis was carried out to determine the trend of the critical rate as the oil column is depleted. The results show the trend in the movement of the GOC and OWC and the critical rate beyond which excessive water and gas production occurs, reducing oil production from the reservoir. The results of this study can be used as a first-pass assessment in the development of mature oil field reservoirs anticipated to experience water and/or gas coning during production.
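
As a hedged illustration of the sensitivity step, the sketch below sweeps a hypothetical proxy critical-rate function over a shrinking oil-column thickness; the functional form and coefficients are placeholders and do not reproduce the study's Joshi-based proxy equation.

```python
import numpy as np

def proxy_critical_rate(oil_column_ft, a=0.05, b=1.8):
    """Hypothetical proxy: critical rate (STB/d) as a power function of oil-column thickness (ft).

    The coefficients a and b are illustrative placeholders; in the study the proxy was
    regressed from Joshi's critical-rate equations for the gas-oil and oil-water systems.
    """
    return a * oil_column_ft ** b

# Sensitivity of the critical rate as the oil column is depleted from 60 ft to 15 ft.
for h in np.linspace(60.0, 15.0, 4):
    print(f"oil column = {h:5.1f} ft  ->  critical rate ~ {proxy_critical_rate(h):7.1f} STB/d")
```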

Keywords: coning, fluid contact movement, mature oil field, oil production

Procedia PDF Downloads 245
2918 Modified Side Plate Design to Suppress Lateral Torsional Buckling of H-Beam for Seismic Application

Authors: Erwin, Cheng-Cheng Chen, Charles J. Salim

Abstract:

One method of solving the lateral torsional buckling (LTB) problem is to use side plates to increase the buckling resistance of the beam. Some modifications to the side plate design are made in this study to simplify construction in the field and reduce cost. In certain regions, side plates are not added: (1) at the beam ends, to preserve space for bolt installation, where the beam is instead strengthened by adding cover plates at both flanges, and (2) at the middle span of the beam, where the moment is smaller. Three small-scale full-span beam specimens were tested under cyclic loading to investigate the LTB resistance and the ductility of the proposed design method. Test results show that the LTB deformation can be effectively suppressed and a very high ductility level can be achieved. Following the tests, a finite element analysis (FEA) model is established and verified using the test results. An intensive parametric study is conducted using the established FEA model. The analysis reveals that the length of the side plates is the most important parameter determining the performance of the beam, and the required side plate length is governed by (1) the beam depth-to-flange-width ratio, (2) the beam slenderness ratio, (3) the strength and thickness of the side plates, (4) the compactness of the beam web and flange, and (5) the beam yield strength. At the end of the paper, a design formula to calculate the required side plate length is suggested.

Keywords: cover plate, earthquake resistant design, lateral torsional buckling, side plate, steel structure

Procedia PDF Downloads 176
2917 A Comparative Study on Deep Learning Models for Pneumonia Detection

Authors: Hichem Sassi

Abstract:

Pneumonia, being a respiratory infection, has garnered global attention due to its rapid transmission and relatively high mortality rates. Timely detection and treatment play a crucial role in significantly reducing mortality associated with pneumonia. Presently, X-ray diagnosis stands out as a reasonably effective method. However, the manual scrutiny of a patient's X-ray chest radiograph by a proficient practitioner usually requires 5 to 15 minutes. In situations where cases are concentrated, this places immense pressure on clinicians for timely diagnosis. Relying solely on the visual acumen of imaging doctors proves to be inefficient, particularly given the low speed of manual analysis. Therefore, the integration of artificial intelligence into the clinical image diagnosis of pneumonia becomes imperative. Additionally, AI recognition is notably rapid, with convolutional neural networks (CNNs) demonstrating superior performance compared to human counterparts in image identification tasks. To conduct our study, we utilized a dataset comprising chest X-ray images obtained from Kaggle, encompassing a total of 5216 training images and 624 test images, categorized into two classes: normal and pneumonia. Employing five mainstream network algorithms, we undertook a comprehensive analysis to classify these diseases within the dataset, subsequently comparing the results. The integration of artificial intelligence, particularly through improved network architectures, stands as a transformative step towards more efficient and accurate clinical diagnoses across various medical domains.
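
A minimal transfer-learning sketch in the spirit of the comparison above, training one of many possible backbone CNNs on a two-class chest X-ray folder layout; the directory names, backbone choice, and hyper-parameters are assumptions, not the study's exact configuration or one of its five networks.

```python
import tensorflow as tf

# Assumed folder layout: chest_xray/{train,test}/{NORMAL,PNEUMONIA}/*.jpeg
train = tf.keras.utils.image_dataset_from_directory("chest_xray/train", image_size=(224, 224), batch_size=32)
test = tf.keras.utils.image_dataset_from_directory("chest_xray/test", image_size=(224, 224), batch_size=32)

base = tf.keras.applications.MobileNetV2(include_top=False, pooling="avg", input_shape=(224, 224, 3))
base.trainable = False  # reuse ImageNet features; only the classification head is trained

model = tf.keras.Sequential([
    tf.keras.layers.Rescaling(1.0 / 127.5, offset=-1.0),  # MobileNetV2 expects inputs in [-1, 1]
    base,
    tf.keras.layers.Dense(1, activation="sigmoid"),        # normal vs. pneumonia
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(train, validation_data=test, epochs=3)
print(model.evaluate(test))
```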

Keywords: deep learning, computer vision, pneumonia, models, comparative study

Procedia PDF Downloads 65
2916 Simulation of Utility Accrual Scheduling and Recovery Algorithm in Multiprocessor Environment

Authors: A. Idawaty, O. Mohamed, A. Z. Zuriati

Abstract:

This paper presents the development of an event-based Discrete Event Simulation (DES) for a recovery algorithm known as Backward Recovery Global Preemptive Utility Accrual Scheduling (BR_GPUAS). This algorithm implements the Backward Recovery (BR) mechanism as a fault recovery solution within the existing Time/Utility Function/Utility Accrual (TUF/UA) scheduling domain for a multiprocessor environment. The BR mechanism attempts to take faulty tasks back to their initial safe state and then re-executes the affected section of the faulty tasks to enable recovery. Considering that faults may occur in the components of any system, a fault tolerance mechanism that can nullify the erroneous effect needs to be developed. Current TUF/UA scheduling algorithms use an abortion recovery mechanism and simply abort the erroneous task as their fault recovery solution. None of the existing algorithms in the TUF/UA multiprocessor scheduling domain has considered transient faults and implemented the BR mechanism as a fault recovery mechanism to nullify the erroneous effect and solve the recovery problem in this domain. The developed BR_GPUAS simulator derives its set of parameters, events, and performance metrics from a detailed analysis of the base model. Simulation results revealed that the BR_GPUAS algorithm can save almost 20-30% of the accumulated utility, making it reliable and efficient for real-time applications in a multiprocessor scheduling environment.
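
A toy Python sketch of the backward-recovery idea described above: when a transient fault hits a task, execution rolls back to the last checkpoint and only the affected section is re-executed, so utility may still be accrued, whereas abortion recovery drops the task entirely. Task parameters and the fault model are illustrative assumptions, not the BR_GPUAS simulator itself.

```python
import random

random.seed(7)

def run_task(exec_time, deadline, utility, fault_prob=0.3, checkpoint=0.5):
    """Utility accrued by one task under backward recovery vs. abortion recovery."""
    faulted = random.random() < fault_prob
    # Backward recovery: roll back to the checkpoint and re-execute the affected section.
    br_time = exec_time + (exec_time * (1.0 - checkpoint) if faulted else 0.0)
    br_utility = utility if br_time <= deadline else 0.0
    # Abortion recovery: a faulty task is simply dropped and accrues no utility.
    abort_utility = utility if (not faulted and exec_time <= deadline) else 0.0
    return br_utility, abort_utility

tasks = [(random.uniform(1, 4), 6.0, random.uniform(5, 10)) for _ in range(1000)]
results = [run_task(*task) for task in tasks]
br_total = sum(r[0] for r in results)
abort_total = sum(r[1] for r in results)
print(f"accrued utility: backward recovery = {br_total:.0f}, abortion recovery = {abort_total:.0f}")
```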

Keywords: real-time system (RTS), time utility function/ utility accrual (TUF/UA) scheduling, backward recovery mechanism, multiprocessor, discrete event simulation (DES)

Procedia PDF Downloads 307
2915 Sustainable Design for Building Envelope in Hot Climates: A Case Study for the Role of the Dome as a Component of an Envelope in Heat Exchange

Authors: Akeel Noori Almulla Hwaish

Abstract:

Architectural design is influenced by the actual thermal behaviour of building components, which in turn depends not only on their steady and periodic thermal characteristics but also on exposure effects, orientation, surface colour, and climatic fluctuations at the given location. Design data and environmental parameters should be produced accurately for specified locations so that architects and engineers can confidently apply them in design calculations that enable precise evaluation of the influence of the various parameters relating to each component of the envelope, which indicates the overall thermal performance of the building. The present paper assesses the thermal behaviour and characteristics of the opaque and transparent parts of the dome, one of the most distinctive and symbolic elements of the building envelope, under the impact of solar temperatures, and its role in heat exchange for specific U-values of alternative construction materials. The research method considers the specified hot-dry climate and a new mosque in Baghdad, Iraq, as a case study. Data are presented in light of the criteria of indoor thermal comfort in terms of design parameters and a thermal assessment for a "model dome". Design alternatives and energy conservation considerations are discussed as well, using comparative computer simulations. The findings are incorporated to outline conclusions clarifying the important role of the dome in the heat exchange of the whole building envelope for approaching an indoor thermal comfort level, and to suggest further research in the future.
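
For the heat-exchange role discussed above, a minimal steady-state sketch sums Q = U · A · ΔT over the opaque and transparent (glazed) parts of a model dome; the areas, U-values, and temperatures are illustrative assumptions for alternative material choices, not the case-study data.

```python
# Steady-state envelope heat gain through a model dome: Q = U * A * dT per part.
SOL_AIR_TEMP_C = 55.0   # assumed sol-air temperature on the dome surface (hot-dry summer)
INDOOR_TEMP_C = 24.0    # assumed indoor comfort set point

# Each alternative lists (part name, area m^2, U-value W/m^2.K); values are illustrative.
alternatives = {
    "heavy masonry dome, small glazing": [("opaque shell", 95.0, 0.8), ("glazed openings", 5.0, 5.6)],
    "thin concrete dome, large glazing": [("opaque shell", 80.0, 2.1), ("glazed openings", 20.0, 5.6)],
}

dT = SOL_AIR_TEMP_C - INDOOR_TEMP_C
for name, parts in alternatives.items():
    q_total = sum(u * a * dT for _, a, u in parts)  # total heat gain in W
    print(f"{name}: heat gain ~ {q_total / 1000:.1f} kW")
```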

Keywords: building envelope, sustainable design, dome impact, hot-climates, heat exchange

Procedia PDF Downloads 475
2914 The Effect of Self and Peer Assessment Activities in Second Language Writing: A Washback Effect Study on the Writing Growth during the Revision Phase in the Writing Process: Learners’ Perspective

Authors: Musbah Abdussayed

Abstract:

The washback effect refers to the influence of assessment on teaching and learning, and this washback effect can either be positive or negative. This study implemented, sequentially, self-assessment (SA) and peer assessment (PA) and examined the washback effect of self and peer assessment (SPA) activities on the writing growth during the revision phase in the writing process. Twenty advanced Arabic as a second language learners from a private school in the USA participated in the study. The participants composed and then revised a short Arabic story as a part of a midterm grade. Qualitative data was collected, analyzed, and synthesized from ten interviews with the learners and from the twenty learners’ post-reflective journals. The findings indicate positive washback effects on the learners’ writing growth. The PA activity enhanced descriptions and meaning, promoted creativity, and improved textual coherence, whereas the SA activity led to detecting editing issues. Furthermore, both SPA activities had washback effects in common, including helping the learners meet the writing genre conventions and developing metacognitive awareness. However, the findings also demonstrate negative washback effects on the learners’ attitudes during the revision phase in the writing process, including bias toward self-evaluation during the SA activity and reluctance to rate peers’ writing performance during the PA activity. The findings suggest that self-and peer assessment activities are essential teaching and learning tools that can be utilized sequentially to help learners tackle multiple writing areas during the revision phase in the writing process.

Keywords: self assessment, peer assessment, washback effect, second language writing, writing process

Procedia PDF Downloads 72
2913 Association of Sociodemographic Factors and Loneliness of Adolescents in China

Authors: Zihan Geng, Yifan Hou

Abstract:

Background: Loneliness is the feeling of being isolated, which is becoming increasingly common among adolescents. A cross-sectional study was performed to determine the association between loneliness and different demographics. Methods: To identify the presence of loneliness, the UCLA Loneliness Scale (Version 3) was employed. The Chinese version of "Questionnaire Star", an online survey on the official website, was used to distribute the self-rating questionnaires to students in Beijing from Grade 7 to Grade 12. The questionnaire includes sociodemographic items and the UCLA Loneliness Scale. Results: Almost all of the participants exhibited "caseness" for loneliness, as defined by the UCLA scale. Out of 266 questionnaires, 2.6% (7 of 266) of students fulfilled the criteria for a low degree of loneliness, 29.7% (79 of 266) met the criteria for a moderate degree of loneliness, and 62.8% (167 of 266) and 4.9% (13 of 266) fulfilled the criteria for a moderately high and a high degree of loneliness, respectively. In the Pearson χ2 test, there were significant associations between loneliness and some demographic factors, including grade (P<0.001), the number of adults in the family (P=0.001), the evaluation of appearance (P=0.034), the evaluation of self-satisfaction (P<0.001), love within the family (P<0.001), academic performance (P=0.001), and emotional support from friends (P<0.001). In the multivariate logistic analysis, the number of adults (2 vs. ≤1, OR=0.319, P=0.015), time spent on social media (≥4h vs. ≤1h, OR=4.862, P=0.029), and emotional support from friends (more satisfied vs. dissatisfied, OR=0.363, P=0.027) were associated with loneliness. Conclusions: Our results suggest a relationship between loneliness and some sociodemographic factors, which raises the possibility of reducing loneliness among adolescents. The companionship of family, encouragement from friends, and regulating the time spent on social media may decrease loneliness in adolescents.
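
A minimal sketch of the multivariate logistic analysis reported above, using statsmodels to estimate odds ratios for a binary loneliness outcome; the synthetic data frame is a placeholder for the questionnaire variables, so the fitted values will not match the reported ORs.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 266  # same sample size as the survey

# Placeholder binary predictors standing in for the questionnaire items.
df = pd.DataFrame({
    "two_adults_home": rng.integers(0, 2, n),   # 2 adults vs. <=1
    "social_media_4h": rng.integers(0, 2, n),   # >=4 h/day vs. <=1 h/day
    "friend_support": rng.integers(0, 2, n),    # more satisfied vs. dissatisfied
})
df["lonely"] = rng.integers(0, 2, n)            # high vs. lower loneliness (synthetic outcome)

model = sm.Logit(df["lonely"], sm.add_constant(df[["two_adults_home", "social_media_4h", "friend_support"]]))
result = model.fit(disp=False)
odds_ratios = np.exp(result.params)             # exponentiated coefficients = odds ratios
print(pd.concat([odds_ratios.rename("OR"), result.pvalues.rename("p")], axis=1))
```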

Keywords: loneliness, adolescents, demographic factors, UCLA loneliness scale

Procedia PDF Downloads 79
2912 Times2D: A Time-Frequency Method for Time Series Forecasting

Authors: Reza Nematirad, Anil Pahwa, Balasubramaniam Natarajan

Abstract:

Time series data consist of successive data points collected over a period of time. Accurate prediction of future values is essential for informed decision-making in several real-world applications, including electricity load demand forecasting, lifetime estimation of industrial machinery, traffic planning, weather prediction, and the stock market. Due to their critical relevance and wide application, there has been considerable interest in time series forecasting in recent years. However, the proliferation of sensors and IoT devices, real-time monitoring systems, and high-frequency trading data introduce significant intricate temporal variations, rapid changes, noise, and non-linearities, making time series forecasting more challenging. Classical methods such as Autoregressive integrated moving average (ARIMA) and Exponential Smoothing aim to extract pre-defined temporal variations, such as trends and seasonality. While these methods are effective for capturing well-defined seasonal patterns and trends, they often struggle with more complex, non-linear patterns present in real-world time series data. In recent years, deep learning has made significant contributions to time series forecasting. Recurrent Neural Networks (RNNs) and their variants, such as Long short-term memory (LSTMs) and Gated Recurrent Units (GRUs), have been widely adopted for modeling sequential data. However, they often suffer from the locality, making it difficult to capture local trends and rapid fluctuations. Convolutional Neural Networks (CNNs), particularly Temporal Convolutional Networks (TCNs), leverage convolutional layers to capture temporal dependencies by applying convolutional filters along the temporal dimension. Despite their advantages, TCNs struggle with capturing relationships between distant time points due to the locality of one-dimensional convolution kernels. Transformers have revolutionized time series forecasting with their powerful attention mechanisms, effectively capturing long-term dependencies and relationships between distant time points. However, the attention mechanism may struggle to discern dependencies directly from scattered time points due to intricate temporal patterns. Lastly, Multi-Layer Perceptrons (MLPs) have also been employed, with models like N-BEATS and LightTS demonstrating success. Despite this, MLPs often face high volatility and computational complexity challenges in long-horizon forecasting. To address intricate temporal variations in time series data, this study introduces Times2D, a novel framework that parallelly integrates 2D spectrogram and derivative heatmap techniques. The spectrogram focuses on the frequency domain, capturing periodicity, while the derivative patterns emphasize the time domain, highlighting sharp fluctuations and turning points. This 2D transformation enables the utilization of powerful computer vision techniques to capture various intricate temporal variations. To evaluate the performance of Times2D, extensive experiments were conducted on standard time series datasets and compared with various state-of-the-art algorithms, including DLinear (2023), TimesNet (2023), Non-stationary Transformer (2022), PatchTST (2023), N-HiTS (2023), Crossformer (2023), MICN (2023), LightTS (2022), FEDformer (2022), FiLM (2022), SCINet (2022a), Autoformer (2021), and Informer (2021) under the same modeling conditions. The initial results demonstrated that Times2D achieves consistent state-of-the-art performance in both short-term and long-term forecasting tasks. 
Furthermore, the generality of the Times2D framework allows it to be applied to various tasks such as time series imputation, clustering, classification, and anomaly detection, offering potential benefits in any domain that involves sequential data analysis.
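
A minimal sketch of the two 2D views that Times2D builds in parallel: a spectrogram of the series (frequency domain, periodicity) and a heatmap of its first and second derivatives (time domain, sharp fluctuations and turning points). The synthetic signal, window length, and stacking are illustrative assumptions, not the published implementation.

```python
import numpy as np
from scipy.signal import spectrogram

# Synthetic series with daily and weekly periodicities plus noise (placeholder for a real load series).
t = np.arange(2048)
x = (np.sin(2 * np.pi * t / 24) + 0.5 * np.sin(2 * np.pi * t / 168)
     + 0.1 * np.random.default_rng(0).normal(size=t.size))

# View 1: spectrogram (time-frequency), highlighting periodic structure.
freqs, times, sxx = spectrogram(x, nperseg=128, noverlap=64)

# View 2: derivative heatmap (time domain), highlighting sharp changes and turning points.
d1 = np.gradient(x)
d2 = np.gradient(d1)
derivative_map = np.stack([d1, d2])           # shape (2, len(x))

print("spectrogram shape:", sxx.shape)         # (n_freq_bins, n_windows)
print("derivative heatmap shape:", derivative_map.shape)
```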

Keywords: derivative patterns, spectrogram, time series forecasting, times2D, 2D representation

Procedia PDF Downloads 44
2911 Modeling Pronunciations of Arab Broca’s Aphasics Using Mosstalk Words Technique

Authors: Sadeq Al Yaari, Fayza Alhammadi, Ayman Al Yaari, Montaha Al Yaari, Aayah Al Yaari, Adham Al Yaari, Sajedah Al Yaari, Saleh Al Yami

Abstract:

Background: There has been a debate in the literature over the years as to whether or not the MossTalk Words program fits Arab Broca's aphasics (BAs), owing to language differences and to the fact that the technique has not yet been used for aphasics with semantic dementia (SD aphasics). Aims: Simplifying the above-mentioned debate slightly for purposes of exposition, the purpose of the present study is to investigate the "usability" of this program, as well as of pictures and community involvement, as therapeutic techniques for both Arab BAs and SD aphasics. Method: The subjects of this study are two Saudi aphasics (53 and 57 years old, respectively). The former suffers from Broca's aphasia due to a stroke, while the latter suffers from semantic dementia. Both aphasics can speak English and have used the MossTalk Words program, in addition to intensive picture-naming therapeutic sessions, for two years. They were tested by one of the researchers four times (once every six months). The families of the two subjects, in addition to their relatives and friends, played a major part in all therapeutic sessions. Conclusion: The results show that, on average across the entire set of therapeutic sessions, the MossTalk Words program was clearly more effective in modeling the BA's pronunciation than the SD aphasic's. Furthermore, intensive picture-naming exercises, together with the positive role of the community members, contributed substantially to the progress of the two subjects' performance.

Keywords: moss talk words, program, technique, Broca’s aphasia, semantic dementia, subjects, picture, community

Procedia PDF Downloads 47
2910 An Information-Based Approach for Preference Method in Multi-Attribute Decision Making

Authors: Serhat Tuzun, Tufan Demirel

Abstract:

Multi-Criteria Decision Making (MCDM) is the modelling of real-life situations to solve the problems we encounter. It is a discipline that aids decision makers who are faced with conflicting alternatives in making an optimal decision. MCDM problems can be classified into two main categories, Multi-Attribute Decision Making (MADM) and Multi-Objective Decision Making (MODM), based on their different purposes and data types. Although various MADM techniques have been developed for the problems encountered, their methodology is limited in modelling real life. Moreover, objective results are hard to obtain, and the findings are generally derived from subjective data. Although new and modified techniques have been developed using approaches such as fuzzy logic, these comprehensive techniques, even though they are better at modelling real life, have not found a place in real-world applications because they are hard to apply due to their complex structure. These constraints restrict the development of MADM. This study aims to conduct a comprehensive analysis of preference methods in MADM and to propose an approach based on information. For this purpose, a detailed literature review has been conducted, and current approaches with their advantages and disadvantages have been analyzed. The approach is then introduced. In this approach, the performance values of the criteria are calculated in two steps: first by determining the distribution of each attribute and standardizing it, then by calculating the information of each attribute as informational energy.
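
A small sketch of the two-step computation described above: each attribute column of a decision matrix is normalized to a distribution over the alternatives, and its informational energy (the sum of squared probabilities, in the sense of Onicescu) is computed as the attribute's information content. The decision matrix is a hypothetical example.

```python
import numpy as np

# Hypothetical decision matrix: rows = alternatives, columns = attributes (benefit-type values).
decision_matrix = np.array([
    [7.0, 0.60, 120.0],
    [5.0, 0.80,  90.0],
    [9.0, 0.55, 150.0],
    [6.0, 0.70, 110.0],
])

# Step 1: standardize each attribute into a probability distribution over the alternatives.
p = decision_matrix / decision_matrix.sum(axis=0)

# Step 2: informational energy of each attribute, E_j = sum_i p_ij^2
# (uniform distribution -> 1/m, fully concentrated -> 1).
informational_energy = (p ** 2).sum(axis=0)
print("attribute informational energy:", np.round(informational_energy, 4))
```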

Keywords: literature review, multi-attribute decision making, operations research, preference method, informational energy

Procedia PDF Downloads 225
2909 Creative Element Analysis of Machinery Creativity Contest Works

Authors: Chin-Pin Chen, Shi-Chi Shiao, Ting-Hao Lin

Abstract:

Industry is currently facing the rapid development of new technology worldwide and fierce changes in the economic environment, so the industrial development trend gradually shifts its focus away from labor and instead leads industry and academia through innovation and creativity. The development trend in the machinery industry presents the same situation. Based on the aims of the Creativity White Paper, the Ministry of Education in Taiwan promotes and develops various creativity contests to cope with this industry trend. Domestic students and enterprises have performed well in domestic and international creativity contests in recent years. Important creative elements must be present in such works for them to win awards among so many entries. A literature review and in-depth interviews with five instructors of award-winning creativity contest entries were first conducted to derive 15 machinery creative elements, which were then compared with the creative elements of award-winning machinery works from the past five years to understand the relationship between awarded works and creative elements. The statistical analysis shows that the IDEA (Industrial Design Excellence Award) covers the most creative elements among the four major international creativity contests; that is, its creativity review focuses on creative elements that are comparatively stricter. Concerning the groups participating in creativity contests, enterprises consider more creative elements in their works than the other two participating groups. Among the contest works, the creative elements of "replacement or improvement", "convenience", and "modeling" show higher significance. It is expected that the above findings can provide domestic colleges and universities with a reference for participating in creativity-related contests in the future.

Keywords: machinery, creative elements, creativity contest, creativity works

Procedia PDF Downloads 443
2908 Architectural Design Strategies: Enhance Train Station Performance as the Catalyst of Transit Oriented Development in Jakarta, Case Study of Beos Commuter Line Station

Authors: Shinta Ardiana Sari, Dini Puti Angelia

Abstract:

The large urban population of Jakarta has been a substantial issue for mobility strategy. Transit Oriented Development (TOD) has become one of the strategies to improve community livability based on the design of the transit place and the system of its context. The TOD principle tries to win people over from the habit of motorization, making them prefer to transit and travel on foot rather than use private vehicles. The train station takes the main role as the catalyst for TOD to emerge; in Jakarta, this role will be taken by the Commuter Line and the future MRT. To advance this development, an architectural design perspective is needed to perform evaluation while seeking strategies that combine accessibility across transportation modes with convenience and safety, in order to increase behavioral intention to use transit. This paper develops a design strategy for the transit place appropriate to Jakarta's conditions, using the basic theories of liminal space and the goal of transit-oriented development. The paper uses an evidence-based approach with a typology method to analyze the present condition of Commuter Line stations in Jakarta and precedents from Asian cities, Tokyo and Seoul, as secondary sources, together with a number of valid questionnaires. Furthermore, the results of this paper aim to support the emergence of a transit-oriented community by giving design requirements and suggesting transportation policies in preparation for the operation of the MRT in the future in Jakarta and other similar cities.

Keywords: station design, transit place, transit-oriented development, urban

Procedia PDF Downloads 220
2907 Sorption Properties of Biological Waste for Lead Ions from Aqueous Solutions

Authors: Lucia Rozumová, Ivo Šafařík, Jana Seidlerová, Pavel Kůs

Abstract:

Biosorption by biological waste materials from the agriculture industry could be a cost-effective technique for removing metal ions from wastewater. The performance of new biosorbent systems, consisting of waste matrices magnetically modified with iron oxide nanoparticles, was tested for the removal of lead ions from aqueous solution. The use of low-cost and eco-friendly adsorbents has been investigated as an ideal alternative to current expensive methods. This article deals with the removal of metal ions from aqueous solutions by modified waste products - orange peels, sawdust, peanut husks, used tea leaves, and ground coffee sediment. The waste materials were suspended in methanol, and ferrofluid (magnetic iron oxide nanoparticles) was then added. This modification process provides the conditions for the formation of smart materials with new properties. The prepared materials were characterized using scanning electron microscopy and a specific surface area and pore size analyzer. The studies focused on the sorption and desorption properties, and the changes in iron content of the magnetically modified materials after treatment were observed as well. The adsorption process was modelled using adsorption isotherms. The results show that the magnetically modified materials remain stable during dynamic sorption and desorption, even at high adsorbed amounts of lead ions. The results of this study indicate that biological waste materials, as sorbents with new properties, are highly effective for the treatment of wastewater.
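
A minimal sketch of the isotherm-modelling step mentioned above, fitting a Langmuir isotherm q = q_max·K·C/(1 + K·C) to hypothetical equilibrium data for Pb(II) sorption; the concentrations and uptakes are placeholders, not the measured values.

```python
import numpy as np
from scipy.optimize import curve_fit

def langmuir(c_eq, q_max, k):
    """Langmuir isotherm: equilibrium uptake q (mg/g) vs. equilibrium concentration c_eq (mg/L)."""
    return q_max * k * c_eq / (1.0 + k * c_eq)

# Hypothetical equilibrium data for Pb(II) on a magnetically modified biosorbent.
c_eq = np.array([5.0, 10.0, 25.0, 50.0, 100.0, 200.0])   # mg/L
q_eq = np.array([8.0, 14.0, 25.0, 34.0, 41.0, 45.0])      # mg/g

(q_max, k), _ = curve_fit(langmuir, c_eq, q_eq, p0=(50.0, 0.01))
print(f"fitted Langmuir parameters: q_max = {q_max:.1f} mg/g, K = {k:.3f} L/mg")
```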

Keywords: biological waste, sorption, metal ions, ferrofluid

Procedia PDF Downloads 142
2906 Study of Aging Behavior of Parallel-Series Connection Batteries

Authors: David Chao, John Lai, Alvin Wu, Carl Wang

Abstract:

For lithium-ion batteries with multiple cell configurations, some use scenarios can cause uneven aging of the cells within the battery because of uneven current distribution. Hence, the focus of this study is to explore the aging effects on batteries with different construction designs. In order to systematically study the influence of various factors in some key battery configurations, a detailed analysis of three key battery construction factors is conducted: (1) terminal position, (2) cell alignment matrix, and (3) interconnect resistance between cells. In this study, a 2S2P circuit has been set as the model multi-cell battery for building different battery samples, and the aging behavior is studied by cycling tests to analyze the current distribution and recoverable capacity. According to the outcomes of the aging tests, the key findings are: (I) different cell alignment matrices can have an impact on the cycle life of the battery; (II) a symmetrical structure has been identified as a critical factor that can influence the battery cycle life, and unbalanced resistance can lead to inconsistent cell aging status; (III) the terminal position has been found to contribute to uneven current distribution, which can cause accelerated battery aging; and (IV) an increase in the internal connection resistance can actually increase cycle life; however, it is noteworthy that this increase in cycle life is accompanied by a decline in battery performance. In summary, the key findings from the study can help identify the key aging factors of multi-cell batteries and can be useful for effectively improving the accuracy of battery capacity predictions.
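
A small sketch of why interconnect resistance skews cell aging in a parallel group: for one 2P stage carrying the pack current, the branch currents divide in inverse proportion to the total branch resistances (cell plus interconnect). The resistance and current values are illustrative assumptions.

```python
# Current split in one parallel (2P) stage of a 2S2P pack.
I_PACK = 10.0                              # A, total current through the stage (assumed)
r_cell_1, r_cell_2 = 0.030, 0.030          # ohm, cell internal resistances (assumed equal)
r_link_1, r_link_2 = 0.002, 0.008          # ohm, unequal interconnect (tab/terminal) resistances

r1 = r_cell_1 + r_link_1
r2 = r_cell_2 + r_link_2
i1 = I_PACK * r2 / (r1 + r2)               # branch currents divide inversely to branch resistance
i2 = I_PACK * r1 / (r1 + r2)

print(f"cell 1: {i1:.2f} A, cell 2: {i2:.2f} A, "
      f"imbalance = {100 * (i1 - i2) / I_PACK:.1f}% of pack current")
```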

Keywords: multiple cells battery, current distribution, battery aging, cell connection

Procedia PDF Downloads 83
2905 Chemical Fingerprinting of Complex Samples With the Aid of Parallel Outlet Flow Chromatography

Authors: Xavier A. Conlan

Abstract:

Speed of analysis is a significant limitation of current high-performance liquid chromatography/mass spectrometry (HPLC/MS) and ultra-high-pressure liquid chromatography (UHPLC)/MS systems, both of which are used in many forensic investigations. The flow rate limitations of MS detection require a compromise in the chromatographic flow rate, which in turn reduces throughput and, when using modern columns, reduces separation efficiency. Commonly, this restriction is combated by post-column splitting of the flow prior to entry into the mass spectrometer; however, this results in a loss of sensitivity and a loss of efficiency due to the extra post-column dead volume. A new chromatographic column format known as 'parallel segmented flow' involves splitting the eluent flow within the column outlet end fitting, and in this study we present its application to interrogating the provenance of methamphetamine samples with mass spectrometry detection. Using parallel segmented flow, column flow rates as high as 3 mL/min were employed in the analysis of amino acids without post-column splitting to the mass spectrometer. Furthermore, when parallel segmented flow chromatography columns were employed, the sensitivity was more than twice that of conventional systems with post-column splitting when the same volume of mobile phase was passed through the detector. These findings suggest that this type of column technology will particularly enhance the capabilities of modern LC/MS, enabling both high throughput and sensitive mass spectral detection.

Keywords: chromatography, mass spectrometry methamphetamine, parallel segmented outlet flow column, forensic sciences

Procedia PDF Downloads 493
2904 Design of Replication System for Computer-Generated Hologram in Optical Component Application

Authors: Chih-Hung Chen, Yih-Shyang Cheng, Yu-Hsin Tu

Abstract:

Holographic optical elements (HOEs) have recently become some of the most suitable components in optoelectronic technology owing to the requirement for product systems of compact size. Computer-generated holography (CGH) is a well-known technology for HOE production. In some cases, a well-designed diffractive optical element with multifunctional components is also an important requirement for an advanced optoelectronic system. A spatial light modulator (SLM) is one of the key components capable of displaying CGH patterns and is widely used in various applications, such as image projection systems. As for multifunctional components, such as phase and amplitude modulation of light, a high-resolution hologram recorded with a multiple-exposure procedure is also a suitable candidate. However, in holographic recording under multiple exposures, the diffraction efficiency of the final hologram is inevitably lower than that obtained with a single-exposure process. In this study, a two-step holographic recording method, comprising master hologram fabrication and replicated hologram production, is designed. Since there exists a reduction factor of M² in the diffraction efficiency of multiple-exposure holograms (for M exposures), single exposure is more efficient for hologram replication. In the second step of holographic replication, a stable optical system with one-shot copying is introduced. For commercial applications, one may utilize this concept of holographic copying to obtain duplicates of HOEs with higher optical performance.
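
The quoted M² reduction can be illustrated with a short calculation comparing the per-exposure diffraction efficiency of an M-exposure master with that of a single-exposure replica; the single-exposure efficiency value is an assumed placeholder.

```python
# Diffraction efficiency per recorded function, given the M^2 reduction for M multiplexed exposures.
eta_single = 0.60   # assumed diffraction efficiency of a single-exposure hologram

for m in (1, 2, 4, 8):
    eta_multi = eta_single / m**2   # reduction factor M^2 quoted in the abstract
    print(f"M = {m}: efficiency per exposure ~ {eta_multi:.3f}")
```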

Keywords: holographic replication, holography, one-shot copying, optical element

Procedia PDF Downloads 157