Search results for: efficiency test
11207 The Potential in the Use of Building Information Modelling and Life-Cycle Assessment for Retrofitting Buildings: A Study Based on Interviews with Experts in Both Fields
Authors: Alex Gonzalez Caceres, Jan Karlshøj, Tor Arvid Vik
Abstract:
The life cycle of residential buildings is expected to span several decades, and 40% of European residential buildings have inefficient energy conservation measures. Existing buildings account for 20-40% of energy use and CO₂ emissions. Since net zero energy buildings are a short-term goal (to be achieved by EU countries after 2020), it is necessary to plan the next logical step, which is to prepare the existing, outdated building stock to be retrofitted into energy-efficient buildings. Two specialized and widespread tools can be used to accomplish this: Building Information Modelling (BIM) and life-cycle assessment (LCA). BIM and LCA are tools used by a variety of disciplines; both are able to represent and analyze constructions at different stages. The combination of these technologies could greatly improve retrofitting techniques through the incorporation of the carbon footprint and the introduction of a single database source for the analysis of different materials. To this is added the possibility of considering different analysis approaches, such as costs and energy savings. These measures are expected to enrich decision-making. The methodology is based on two main activities. The first task involved the collection of data, accomplished through a literature review and interviews with experts in the retrofitting field and in BIM technologies; its results are presented as an evaluation checklist of BIM's ability to manage data and improve decision-making in retrofitting projects. The last activity uses the results of the previous task to evaluate how far the IFC format can support the requirements of each specialist and its use by third-party software. The results indicate that BIM/LCA has great potential to improve the retrofitting process in existing buildings, but some modifications must be made in order to meet the requirements of the specialists for both retrofitting and LCA evaluation.
Keywords: retrofitting, BIM, LCA, energy efficiency
Procedia PDF Downloads 220
11206 Effectiveness of Self-Learning Module on the Academic Performance of Students in Statistics and Probability
Authors: Aneia Rajiel Busmente, Renato Gunio Jr., Jazin Mautante, Denise Joy Mendoza, Raymond Benedict Tagorio, Gabriel Uy, Natalie Quinn Valenzuela, Ma. Elayza Villa, Francine Yezha Vizcarra, Sofia Madelle Yapan, Eugene Kurt Yboa
Abstract:
COVID-19's rapid spread caused dramatic changes across the nation, especially in the educational system. The Department of Education was forced to adopt a practical learning platform that did not neglect health: printed modular distance learning. The Philippines' K–12 curriculum includes Statistics and Probability as one of its key courses, as it gives students the knowledge to evaluate and comprehend data. However, students struggle with the concepts of the normal distribution, and the Self-Learning Module on the Normal Distribution created by the Department of Education has several problems, including too many activities, unclear illustrations, and insufficient examples of concepts, which make the module difficult for learners to accomplish. The purpose of this study is to determine the effectiveness of a self-learning module on students' academic performance in Statistics and Probability and to explore students' perceptions of the quality of the created Self-Learning Module. Despite the availability of Self-Learning Modules in Statistics and Probability in the Philippines, little literature discusses their effectiveness in improving the performance of Senior High School students. In this study, a Self-Learning Module on the Normal Distribution is evaluated using a quasi-experimental design. Grade 11 STEM students from National University's Nazareth School will be the study's participants, chosen by purposive sampling; Google Forms will be used to recruit at least 100 Grade 11 STEM students. The research instrument consists of a 20-item pre- and post-test to assess participants' knowledge and performance regarding the normal distribution, and a Likert scale survey to evaluate how the students perceived the self-learning module. Pre-test, post-test, and Likert scale surveys will be used to gather data, with Jeffreys' Amazing Statistics Program (JASP) software being used for analysis.
Keywords: self-learning module, academic performance, statistics and probability, normal distribution
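As an illustration of the planned pre-test/post-test comparison (the study itself will use JASP; the scores below are invented), a minimal sketch in Python:

```python
import numpy as np
from scipy import stats

# Hypothetical 20-item pre- and post-test scores for the same ten learners
pre = np.array([8, 10, 9, 12, 7, 11, 10, 9, 13, 8])
post = np.array([12, 14, 11, 15, 10, 13, 14, 12, 16, 11])

# Paired (dependent-samples) t-test on the gain produced by the module
t_stat, p_value = stats.ttest_rel(post, pre, alternative="greater")

gain = post - pre
cohens_d = gain.mean() / gain.std(ddof=1)   # effect size of the gain

print(f"mean gain = {gain.mean():.2f}, t = {t_stat:.2f}, p = {p_value:.4f}, d = {cohens_d:.2f}")
```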
Procedia PDF Downloads 114
11205 Innovative Tool for Improving Teaching and Learning
Authors: Izharul Haq
Abstract:
Every one of us aspires to a quality education. The biggest stakeholders are students, who labor for years acquiring the knowledge and skills that prepare them for their careers. Parents spend a fortune on their children's education, and companies spend billions of dollars developing new education products and services to raise standards. Quality education is the golden key to long-lasting prosperity for the individual and the nation. Unfortunately, education standards are continuously deteriorating, and this has become a global phenomenon. Teaching is often described as a 'popularity contest', and the teachers who are popular with students are often those who compromise teaching to appease them. Such teachers also 'teach to the test' to ensure high test scores, and hence receive good student ratings, while teachers who are conscientious, rigorous, and thorough are often penalized in such appraisals. Government and private organizations spend billions of dollars trying to capture the characteristics of a good teacher, but the results are still vague and inconclusive, and at present there is no objective way to measure teaching effectiveness. In this paper, we present an innovative method to objectively measure teaching effectiveness using a new teaching tool (TSquare). The TSquare tool used in the study is practical, easy to use, cost-effective, and requires no special equipment to implement. Hence it has global appeal for poor and rich countries alike.
Keywords: measuring teaching effectiveness, quality in education, student learning, teaching styles
Procedia PDF Downloads 296
11204 Design and Thermal Analysis of Power Harvesting System of a Hexagonal Shaped Small Spacecraft
Authors: Mansa Radhakrishnan, Anwar Ali, Muhammad Rizwan Mughal
Abstract:
Many universities around the world are working on modular, low-budget architectures for small spacecraft to reduce the development cost of the overall system. This paper focuses on the design of a modular solar power harvesting system for a hexagonal-shaped small satellite. The designed solar power harvesting system is composed of solar panels and power converter subsystems. The solar panel consists of solar cells mounted on the external face of a printed circuit board (PCB), while the electronic components for power conversion are mounted on the interior side of the same PCB. The solar panel, with dimensions of 16.5 cm × 99 cm, is composed of 36 solar cells (each 4 cm × 7 cm) divided into four parallel banks, where each bank consists of 9 solar cells. The output voltage of a single solar cell is 2.14 V, and the combined output voltage of 9 series-connected solar cells is around 19.3 V. The output voltage of the solar panel is boosted to the satellite power distribution bus voltage level (28 V) by a boost converter operating on a constant-voltage maximum power point tracking (MPPT) technique. The solar panel module is an eight-layer PCB with a coil embedded in 4 internal layers; this coil consumes power to generate a magnetic field and rotate the spacecraft, thereby controlling its attitude. Since the power converter and distribution subsystem components are mounted on the internal layers of the PCB, a thermal analysis is mandatory to ensure that the overall module temperature remains within thermal safety limits. The main focus of the overall design is on compactness, miniaturization, and efficiency enhancement.
Keywords: small satellites, power subsystem, efficiency, MPPT
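As a quick check of the panel electrical figures quoted above, a minimal sketch (the ideal boost-converter duty-cycle relation is added here purely for illustration):

```python
# String voltage of one bank and the ideal boost-converter duty cycle
# needed to reach the 28 V distribution bus (converter losses ignored).
CELL_VOLTAGE = 2.14      # V, single solar cell
CELLS_PER_BANK = 9       # series-connected cells per bank
BUS_VOLTAGE = 28.0       # V, satellite power distribution bus

v_bank = CELL_VOLTAGE * CELLS_PER_BANK       # ~19.3 V per bank
duty_cycle = 1.0 - v_bank / BUS_VOLTAGE      # ideal boost: Vout = Vin / (1 - D)

print(f"bank voltage ≈ {v_bank:.2f} V, ideal duty cycle ≈ {duty_cycle:.2f}")
```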
Procedia PDF Downloads 74
11203 Enhance Construction Visual As-Built Schedule Management Using BIM Technology
Authors: Shu-Hui Jan, Hui-Ping Tserng, Shih-Ping Ho
Abstract:
Construction project control attempts to obtain real-time as-built schedule information and to eliminate project delays by effectively enhancing dynamic schedule control and management. Suitable platforms for enhancing an as-built schedule visually during the construction phase are necessary and important for general contractors. As the application of building information modeling (BIM) becomes more common, schedule management integrated with the BIM approach becomes essential to enhance visual construction management implementation for the general contractor during the construction phase. To enhance visualization of the updated as-built schedule for the general contractor, this study presents a novel system called the Construction BIM-assisted Schedule Management (ConBIM-SM) system for general contractors in
Keywords: building information modeling (BIM), construction schedule management, as-built schedule management, BIM schedule updating mechanism
Procedia PDF Downloads 375
11202 Behaviour of Hollow Tubes Filled with Sand Slag Concrete
Authors: Meriem Senani, Noureedine Ferhoune
Abstract:
This paper presents the axial bearing capacity of thin welded rectangular steel stubs filled with sand concrete. A series of tests was conducted to study the behavior of short composite columns under axial compressive load; the cross-section dimensions were 100x70x2 mm. A total of 16 stubs were tested: 4 filled with ordinary concrete, designated BO columns; 6 filled with concrete in which the natural sand was completely substituted by crystallized sand slag, designated in this paper BSI; and 6 filled with concrete in which the natural sand was partially replaced by crystallized sand slag, designated BSII. The main objective of these tests was to clarify the performance of steel specimens filled with sand concrete compared to those filled with ordinary concrete. The main parameters studied are the height of the specimen (300 mm-500 mm), the eccentricity of the load, and the type of filling concrete. Based on the test results obtained, it is confirmed that the length of the tubes has a considerable effect on the bearing capacity and the failure mode. In all test tubes, fracture occurred by convex warping of the largest face, followed by the smallest, due to the outward thrust of the concrete. It was observed that the sand concrete improves the bearing capacity of the composite tubes compared to those filled with ordinary concrete.
Keywords: concrete sand, crystallized slag, failure mode, buckling
Procedia PDF Downloads 414
11201 Safety and Efficacy of Recombinant Clostridium botulinum Types B Vaccine Candidate
Authors: Mi-Hye Hwang, Young Min Son, Kichan Lee, Bang-Hun Hyun, Byeong Yeal Jung
Abstract:
Botulism is a paralytic disease of humans and animals caused by neurotoxins produced by Clostridium botulinum. The neurotoxins are genetically distinguished into 8 types, A to H. Ingestion of preformed toxin, usually types B, C, and D, has been shown to produce disease in most cases of cattle botulism. Vaccination is the best measure to prevent cattle botulism; however, the commercially available toxoid-based vaccines are difficult and hazardous to produce. We produced a recombinant protein from the gene for the heavy-chain domain of botulinum toxin type B, which binds to the cellular receptor of neurons, and used it as the immunogen. In this study, we evaluated the safety and efficacy of a botulism vaccine composed of this recombinant type B protein. The safety test was done according to the National Regulation for Veterinary Biologicals. For the efficacy test, female ICR mice (5 weeks old) were subcutaneously injected, intraperitoneally challenged, and their survival rates were compared between the vaccination and non-vaccination groups. The mouse survival rate for the recombinant type B vaccine was above 80%, while that of the non-vaccination group was 0%. The vaccine composed of recombinant type B protein was safe and efficacious in mice. Our results suggest that the recombinant heavy-chain receptor-binding domain can be used as an effective vaccine candidate for type B botulism.
Keywords: botulism, livestock, vaccine, recombinant protein, toxin
Procedia PDF Downloads 239
11200 Integrated Design of Froth Flotation Process in Sludge Oil Recovery Using Cavitation Nanobubbles for Increase the Efficiency and High Viscose Compatibility
Authors: Yolla Miranda, Marini Altyra, Karina Kalmapuspita Imas
Abstract:
Oily sludge wastes accumulate throughout upstream and downstream petroleum industry processes. The sludge still contains oil that can be used as an energy resource. Recycling the sludge reduces its toxicity and is very likely to recover the remaining oil, around 20% of its volume. Froth flotation is a common chemical-unit method for separating fine solid particles from an aqueous suspension. Its basis is the capture of oil droplets or small solids by air bubbles in an aqueous slurry, followed by their levitation and collection in a froth layer. The method is known for its low energy requirement and ease of application, but low efficiency and the inability to treat high-viscosity feeds are the biggest problems of the froth flotation unit. This study presents a design that first manages the high viscosity of the sludge and then feeds it to froth flotation, with a cavitation tube included to turn the bubbles into nano-sized particles. Recovery in flotation starts with the collision and adhesion of hydrophobic particles to the air bubbles, followed by transportation of the hydrophobic particle-bubble aggregate from the collection zone to the froth zone, drainage and enrichment of the froth, and finally its overflow removal from the cell top. Effective particle separation by froth flotation relies on the efficient capture of hydrophobic particles by air bubbles in these three steps, of which collision is the most important: decreasing the bubble size increases the collision effect and makes the process more efficient. The pre-treatment, froth flotation, and cavitation tube are integrated with each other, and the design shows the integrated unit and its process.
Keywords: sludge oil recovery, froth flotation, cavitation tube, nanobubbles, high viscosity
Procedia PDF Downloads 378
11199 Is Electricity Consumption Stationary in Turkey?
Authors: Eyup Dogan
Abstract:
The number of research articles analyzing the integration properties of energy variables has increased rapidly in the energy literature for about a decade. The stochastic behaviors of energy variables are worth knowing for several reasons. For instance, national policies to conserve or promote energy consumption, which should be taken as shocks to energy consumption, will have only transitory effects on energy consumption if energy consumption is found to be stationary in a country. Furthermore, it is also important to know the order of integration in order to employ an appropriate econometric model. Despite being an important subject for applied energy (economics) and having a huge volume of studies, several known limitations still exist in the literature. For example, many of the studies use aggregate energy consumption and national-level data. In addition, a large part of the literature consists of either multi-country studies or studies focusing solely on the U.S. This is the first study in the literature that considers a form of energy consumption by sector at the sub-national level. This research aims at investigating the unit root properties of electricity consumption for 12 regions of Turkey by four sectors, in addition to total electricity consumption, in order to address the mentioned limitations in the literature. In this regard, we analyze the stationarity properties of 60 cases. Because the use of multiple unit root tests makes the results robust and consistent, we apply the Dickey-Fuller unit root test based on Generalized Least Squares regression (DFGLS), the Phillips-Perron unit root test (PP), and the Zivot-Andrews unit root test with one endogenous structural break (ZA). The main finding of this study is that electricity consumption is trend stationary in 7 cases according to DFGLS and PP, whereas it is a stationary process in 12 cases when we take the structural change into account by applying ZA. Thus, shocks to electricity consumption have transitory effects in those cases, namely agriculture in region 1, region 4 and region 7; industrial in region 5, region 8, region 9, region 10 and region 11; business in region 4, region 7 and region 9; and total electricity consumption in region 11. Regarding policy implications, policies to decrease or stimulate the use of electricity have a long-run impact on electricity consumption in 80% of cases in Turkey, given that 48 cases are non-stationary processes. On the other hand, the past behavior of electricity consumption can be used to predict its future behavior in 12 cases only.
Keywords: unit root, electricity consumption, sectoral data, subnational data
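A minimal sketch of this kind of unit root screening on a hypothetical consumption series, using standard Python time-series tooling (the study's own battery is DFGLS, PP, and ZA; the ADF test stands in for the first two here):

```python
import numpy as np
from statsmodels.tsa.stattools import adfuller, zivot_andrews

rng = np.random.default_rng(0)
# Hypothetical regional electricity-consumption series (random walk => unit root)
y = np.cumsum(rng.normal(loc=0.5, scale=1.0, size=60))

# Augmented Dickey-Fuller test with a trend term
adf_stat, adf_p, *_ = adfuller(y, regression="ct", autolag="AIC")
print(f"ADF: stat = {adf_stat:.2f}, p = {adf_p:.3f}")

# Zivot-Andrews test allowing one endogenous structural break
za_stat, za_p, *_ = zivot_andrews(y, regression="ct")
print(f"Zivot-Andrews: stat = {za_stat:.2f}, p = {za_p:.3f}")
# p > 0.05 in both => the unit root is not rejected, so shocks are permanent
```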
Procedia PDF Downloads 410
11198 Adaptive Energy-Aware Routing (AEAR) for Optimized Performance in Resource-Constrained Wireless Sensor Networks
Authors: Innocent Uzougbo Onwuegbuzie
Abstract:
Wireless Sensor Networks (WSNs) are crucial for numerous applications, yet they face significant challenges due to resource constraints such as limited power and memory. Traditional routing algorithms like Dijkstra, Ad hoc On-Demand Distance Vector (AODV), and Bellman-Ford, while effective in path establishment and discovery, are not optimized for the unique demands of WSNs due to their large memory footprint and power consumption. This paper introduces the Adaptive Energy-Aware Routing (AEAR) model, a solution designed to address these limitations. AEAR integrates reactive route discovery, localized decision-making using geographic information, energy-aware metrics, and dynamic adaptation to provide a robust and efficient routing strategy. We present a detailed comparative analysis using a dataset of 50 sensor nodes, evaluating power consumption, memory footprint, and path cost across AEAR, Dijkstra, AODV, and Bellman-Ford algorithms. Our results demonstrate that AEAR significantly reduces power consumption and memory usage while optimizing path weight. This improvement is achieved through adaptive mechanisms that balance energy efficiency and link quality, ensuring prolonged network lifespan and reliable communication. The AEAR model's superior performance underlines its potential as a viable routing solution for energy-constrained WSN environments, paving the way for more sustainable and resilient sensor network deployments.
Keywords: wireless sensor networks (WSNs), adaptive energy-aware routing (AEAR), routing algorithms, energy, efficiency, network lifespan
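The AEAR mechanisms themselves are only summarized above; purely as an illustration of the energy-aware-metric idea (edge costs that penalize relaying through low-energy nodes), a minimal sketch on an invented topology:

```python
import heapq

def energy_aware_path(graph, energy, src, dst):
    """Dijkstra-style search where an edge into node v costs
    link_cost / residual_energy(v), so depleted nodes are avoided."""
    dist, prev, heap = {src: 0.0}, {}, [(0.0, src)]
    while heap:
        d, u = heapq.heappop(heap)
        if u == dst:
            break
        if d > dist.get(u, float("inf")):
            continue
        for v, link_cost in graph[u]:
            cost = d + link_cost / max(energy[v], 1e-6)
            if cost < dist.get(v, float("inf")):
                dist[v], prev[v] = cost, u
                heapq.heappush(heap, (cost, v))
    path, node = [dst], dst
    while node != src:          # walk the predecessor chain back to the source
        node = prev[node]
        path.append(node)
    return list(reversed(path)), dist[dst]

# Hypothetical 5-node WSN: graph[u] = [(neighbour, link cost), ...]
graph = {0: [(1, 1.0), (2, 1.2)], 1: [(3, 1.0)], 2: [(3, 0.8)], 3: [(4, 1.0)], 4: []}
energy = {0: 1.0, 1: 0.2, 2: 0.9, 3: 0.8, 4: 1.0}   # normalized residual energy
print(energy_aware_path(graph, energy, 0, 4))        # routes around low-energy node 1
```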
Procedia PDF Downloads 36
11197 Methodology for the Multi-Objective Analysis of Data Sets in Freight Delivery
Authors: Dale Dzemydiene, Aurelija Burinskiene, Arunas Miliauskas, Kristina Ciziuniene
Abstract:
Data flows and the purposes of reporting data differ and depend on business needs. Different parameters are reported and transferred regularly during freight delivery. These business practices form the dataset constructed for each time point and contain all the information required for freight-moving decisions. As a significant amount of these data is used for various purposes, an integrating methodological approach must be developed to respond to the indicated problem. The proposed methodology contains several steps: (1) collecting context data sets and validating the data; (2) multi-objective analysis for optimizing freight transfer services. For data validation, the study involves Grubbs outlier analysis, particularly for data cleaning and for identifying the statistical significance of data-reporting event cases. The Grubbs test is often used as it measures one extreme value at a time that exceeds the boundaries of the standard normal distribution. In the study area, the test has not been widely applied by authors, except where the Grubbs test for outlier detection was used to identify outliers in fuel consumption data. In this study, the authors applied the method with a confidence level of 99%. For the multi-objective analysis, the authors select forms of genetic algorithm construction that have greater potential to extract the best solution. For freight delivery management, genetic algorithm structures are used as a more effective technique; accordingly, an adaptable genetic algorithm is applied to describe the process of choosing an effective transportation corridor. In this study, multi-objective genetic algorithm methods are used to optimize the data evaluation and select the appropriate transport corridor. The authors suggest a methodology for the multi-objective analysis that evaluates the collected context data sets and uses this evaluation to determine a delivery corridor for freight transfer services in the multi-modal transportation network. In the multi-objective analysis, the authors include safety components, the number of accidents per year, and freight delivery time in the multi-modal transportation network. The proposed methodology has practical value for the management of multi-modal transportation processes.
Keywords: multi-objective, analysis, data flow, freight delivery, methodology
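As an illustration of the Grubbs screening step described above (at the 99% confidence level used by the authors; the readings are invented), a minimal sketch:

```python
import numpy as np
from scipy import stats

def grubbs_outlier(x, alpha=0.01):
    """Two-sided Grubbs test for a single extreme value in a sample."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    g = np.max(np.abs(x - x.mean())) / x.std(ddof=1)        # test statistic
    t = stats.t.ppf(1 - alpha / (2 * n), n - 2)              # critical t value
    g_crit = (n - 1) / np.sqrt(n) * np.sqrt(t**2 / (n - 2 + t**2))
    return g > g_crit, g, g_crit

# Hypothetical fuel-consumption reports with one suspicious value (61.0)
fuel = [32.1, 30.8, 31.5, 33.0, 29.9, 61.0, 31.2, 30.5, 32.7, 31.9]
flag, g, g_crit = grubbs_outlier(fuel)
print(f"G = {g:.2f}, critical = {g_crit:.2f}, outlier detected: {flag}")
```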
Procedia PDF Downloads 180
11196 High Purity Germanium Detector Characterization by Means of Monte Carlo Simulation through Application of Geant4 Toolkit
Authors: Milos Travar, Jovana Nikolov, Andrej Vranicar, Natasa Todorovic
Abstract:
Over the years, High Purity Germanium (HPGe) detectors have proved to be an excellent practical tool and, as such, have established their wide use today in low-background γ-spectrometry. One of the advantages of gamma-ray spectrometry is its easy sample preparation, as chemical processing and separation of the studied subject are not required. Thus, with a single measurement, one can simultaneously perform both qualitative and quantitative analysis. One of the most prominent features of HPGe detectors, besides their excellent efficiency, is their superior resolution. This feature virtually allows a researcher to perform a thorough analysis by discriminating photons of similar energies in the studied spectra, where they would otherwise superimpose within a single-energy peak and, as such, could potentially scathe the analysis and produce wrongly assessed results. Naturally, this feature is of great importance when the identification of radionuclides, as well as their activity concentrations, is being practiced, where high precision comes as a necessity. In measurements of this nature, in order to be able to reproduce good and trustworthy results, one has to have initially performed an adequate full-energy peak (FEP) efficiency calibration of the equipment used. However, experimental determination of the response, i.e., efficiency curves for a given detector-sample configuration and its geometry, is not always easy and requires a certain set of reference calibration sources in order to account for and cover broader energy ranges of interest. With the goal of overcoming these difficulties, many researchers have turned towards the application of different software toolkits that implement the Monte Carlo method (e.g., MCNP, FLUKA, PENELOPE, Geant4, etc.), as it has proven time and time again to be a very powerful tool. In the process of creating a reliable model, one has to have well-established and well-described specifications of the detector. Unfortunately, the documentation that manufacturers provide alongside the equipment is rarely sufficient for this purpose. Furthermore, certain parameters tend to evolve and change over time, especially with older equipment. Deterioration of these parameters consequently decreases the active volume of the crystal and can thus affect the efficiencies by a large margin if it is not properly taken into account. In this study, the optimisation method of two HPGe detectors through the implementation of the Geant4 toolkit developed by CERN is described, with the goal of further improving simulation accuracy in calculations of FEP efficiencies by investigating the influence of certain detector variables (e.g., crystal-to-window distance, dead layer thicknesses, inner crystal void dimensions, etc.). The detectors on which the optimisation procedures were carried out were a standard traditional co-axial extended range detector (XtRa HPGe, CANBERRA) and a broad energy range planar detector (BEGe, CANBERRA). The optimised models were verified through comparison with experimentally obtained data from measurements of a set of point-like radioactive sources. The acquired results of both detectors displayed good agreement with the experimental data, falling under an average statistical uncertainty of ∼ 4.6% for the XtRa and ∼ 1.8% for the BEGe detector within the energy ranges of 59.4−1836.1 [keV] and 59.4−1212.9 [keV], respectively.
Keywords: HPGe detector, γ spectrometry, efficiency, Geant4 simulation, Monte Carlo method
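Purely as an illustration of how a single full-energy-peak efficiency point is compared between measurement and simulation (the formula is standard; all numbers below are invented, not the study's data):

```python
# FEP efficiency from a point-source measurement vs. a Monte Carlo estimate.
net_counts = 125_400        # net counts in the peak (background subtracted)
live_time = 3600.0          # s
activity = 35_000.0         # Bq, certified source activity
emission_prob = 0.851       # gamma emission probability of the line

eff_measured = net_counts / (activity * emission_prob * live_time)
eff_simulated = 1.19e-3     # FEP efficiency returned by the detector model

rel_dev = (eff_simulated - eff_measured) / eff_measured * 100.0
print(f"measured = {eff_measured:.3e}, simulated = {eff_simulated:.3e}, deviation = {rel_dev:+.1f}%")
```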
Procedia PDF Downloads 120
11195 Comparing the Effectiveness of Social Skills Training and Stress Management on Self Esteem and Aggression in First Grade Students of Iranian West High School
Authors: Hossein Nikandam Kermanshah, Babak Samavatian, Akbar Hemmati Sabet, Mohammad Ahmadpanah
Abstract:
This is a quasi-experimental study conducted in order to compare the effectiveness of social skills training and stress management training on self-esteem and aggression in first grade high school students. Forty-five people were selected from the research community and randomly placed in three groups: social skills training, stress management training, and control. The data collection tools in this study were the self-esteem and AGQ aggression questionnaires, which were administered as the pre-test and post-test. The social skills training and stress management groups participated in eight weekly 1.5-hour sessions, while the control group did not receive any therapy. For the descriptive analysis of the data, statistical indicators such as the mean and standard deviation were used, and at the inferential level, multivariable covariance analysis was used. The findings show that group training in social skills and in stress management is significantly effective on self-esteem and aggression. There is a meaningful difference between social skills training and stress management training on self-esteem, with the preference going to group social skills training; for the difference between group social skills training and stress management on aggression, the preference goes to group stress management.
Keywords: social skill training, stress management training, self-esteem aggression, psychological sciences
Procedia PDF Downloads 469
11194 Formulation and Evaluation of Curcumin-Zn (II) Microparticulate Drug Delivery System for Antimalarial Activity
Authors: M. R. Aher, R. B. Laware, G. S. Asane, B. S. Kuchekar
Abstract:
Objective: Studies have shown that a new combination therapy of artemisinin derivatives and curcumin is unique, with potential advantages over known ACTs. In the present study, an attempt was made to prepare a microparticulate drug delivery system of the curcumin-Zn complex and evaluate it in combination with artemether for antimalarial activity. Material and method: The curcumin-Zn complex was prepared and encapsulated using sodium alginate. The microparticles thus obtained were further coated with various enteric polymers at different coating thicknesses to control the release. Microparticles were evaluated for encapsulation efficiency, drug loading, and in vitro drug release. Roentgenographic studies were conducted in rabbits with a BaSO₄-tagged formulation. The optimized formulation was screened for antimalarial activity using the P. berghei-infected mouse survival test and % parasitemia inhibition, alone (three oral doses of 5 mg/day) and in combination with artemether (i.p. 500, 1000 and 1500 µg). Curcumin-Zn(II) was estimated in serum after oral administration to rats by spectrofluorometry. Result: Microparticles coated with cellulose acetate phthalate showed the most satisfactory and controlled release, with a time of 479 min for 60% drug release. X-ray images taken at different time intervals confirmed the retention of the formulation in the GI tract. Estimation of curcumin in serum by spectrofluorometry showed that the drug concentration is maintained in the blood for a longer time, with a tmax of 6 hours. The survival time (40 days post treatment) of mice infected with P. berghei was compared to survival after treatment with either the curcumin-Zn(II) microparticles-artemether combination, the curcumin-Zn complex, or artemether. Oral administration of curcumin-Zn(II)-artemether prolonged the survival of P. berghei-infected mice. All the mice treated with curcumin-Zn(II) microparticles (5 mg/day) and artemether (1000 µg) survived for more than 40 days and recovered with no detectable parasitemia. Administration of the curcumin-Zn(II)-artemether combination reduced the parasitemia in mice by more than 90% compared to that in control mice for the first 3 days after treatment. Conclusion: The antimalarial activity of the curcumin-Zn-artemether combination was more pronounced than monotherapy. A single dose of 1000 µg of artemether in the curcumin-Zn combination gives complete protection in P. berghei-infected mice. This may reduce the chances of drug resistance in malaria management.
Keywords: formulation, microparticulate drug delivery, antimalarial, pharmaceutics
Procedia PDF Downloads 394
11193 Comparison of Rainfall Trends in the Western Ghats and Coastal Region of Karnataka, India
Authors: Vinay C. Doranalu, Amba Shetty
Abstract:
In recent years, due to climate change, there is large variation in the spatial distribution of daily rainfall within a small region. Rainfall is one of the main climatic variables affecting spatio-temporal patterns of water availability. The real task posed by the change in climate is the identification, estimation, and understanding of the uncertainty of rainfall. This study analyzes the spatial variations and temporal trends of daily precipitation using high-resolution (0.25º x 0.25º) gridded data of the Indian Meteorological Department (IMD). For the study, 38 grid points were selected in the study area and analyzed for daily precipitation time series (113 years) over the period 1901-2013. Grid points were divided into two zones based on elevation and location: Low Land (exposed to the sea and low-elevation areas/coastal region) and High Land (interior from the sea and high-elevation areas/Western Ghats). The time series at each grid point were examined for temporal trends using the non-parametric Mann-Kendall test and the Theil-Sen estimator to determine the nature of the trend and the magnitude of its slope. The Pettitt-Mann-Whitney test was applied to detect the most probable change point in the trends over the time period. The results reveal a remarkable monotonic trend in daily precipitation at each grid point. In general, the regional cluster analysis found an increasing precipitation trend in the shoreline region and a decreasing trend in the Western Ghats in recent years. The spatial distribution of rainfall can be partly explained by heterogeneity in the temporal trends of rainfall identified by the change point analysis. The Mann-Kendall test shows significant variation, with weaker rainfall in the rainfall distribution over the eastern parts of the Western Ghats region of Karnataka.
Keywords: change point analysis, coastal region India, gridded rainfall data, non-parametric
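A minimal sketch of the Mann-Kendall / Theil-Sen screening described above, applied to a hypothetical 113-year rainfall series:

```python
import numpy as np
from scipy import stats

def mann_kendall(y):
    """Mann-Kendall S statistic with the normal approximation (ties ignored)."""
    y = np.asarray(y, dtype=float)
    n = len(y)
    s = sum(np.sign(y[j] - y[i]) for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    z = (s - np.sign(s)) / np.sqrt(var_s) if s != 0 else 0.0
    p = 2 * (1 - stats.norm.cdf(abs(z)))      # two-sided p-value
    return s, z, p

rng = np.random.default_rng(1)
years = np.arange(1901, 2014)
rain = 3000 + 2.0 * (years - 1901) + rng.normal(0, 150, len(years))  # mm, weak upward trend

s, z, p = mann_kendall(rain)
slope, intercept, lo, hi = stats.theilslopes(rain, years)            # Sen's slope, mm/year
print(f"MK: S = {s:.0f}, Z = {z:.2f}, p = {p:.4f}; Sen slope = {slope:.2f} mm/yr [{lo:.2f}, {hi:.2f}]")
```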
Procedia PDF Downloads 294
11192 Fabrication and Characterization of Folic Acid-Grafted-Thiomer Enveloped Liposomes for Enhanced Oral Bioavailability of Docetaxel
Authors: Farhan Sohail, Gul Shahnaz Irshad Hussain, Shoaib Sarwar, Ibrahim Javed, Zajif Hussain, Akhtar Nadhman
Abstract:
The present study aimed to develop a hybrid nanocarrier (NC) system with enhanced membrane permeability, bioavailability, and targeted delivery of docetaxel (DTX) in breast cancer. Hybrid NCs based on folic acid (FA)-grafted thiolated chitosan (TCS) enveloped liposomes were prepared with DTX and evaluated in vitro and in vivo for their enhanced permeability and bioavailability. Physicochemical characterization of the NCs, including particle size, morphology, zeta potential, FTIR, DSC, PXRD, encapsulation efficiency, and drug release, was determined in vitro. Permeation enhancement and P-gp inhibition were evaluated through the everted sac method on freshly excised rat intestine, which indicated that permeation was enhanced 5 times compared to pure DTX and that the hybrid NCs were also able to strongly inhibit P-gp activity. In vitro cytotoxicity and tumor targeting were assessed using the MDA-MB-231 cell line. The stability study of the formulations, performed for 3 months, showed improved stability of the FA-TCS enveloped liposomes in terms of particle size, zeta potential, and encapsulation efficiency compared to TCS NPs and liposomes. The pharmacokinetic study was performed in vivo using rabbits. The oral bioavailability and AUC0-96 were increased 10.07-fold with the hybrid NCs compared to the positive control. The half-life (t1/2) was increased about 4 times (58.76 hrs) compared to the positive control (17.72 hrs). Conclusively, it is suggested that FA-TCS enveloped liposomes have strong potential to enhance the permeability and bioavailability of hydrophobic drugs after oral administration, as well as tumor targeting.
Keywords: docetaxel, coated liposome, permeation enhancement, oral bioavailability
Procedia PDF Downloads 408
11191 E-Vet Smart Rapid System: Detection of Farm Disease Based on Expert System as Supporting to Epidemic Disease Control
Authors: Malik Abdul Jabbar Zen, Wiwik Misaco Yuniarti, Azisya Amalia Karimasari, Novita Priandini
Abstract:
Zoonosis is an infection transmitted from animals to humans and vice versa, and it has increased over the last 20 years. Experts predict that zoonosis will be a threat to the community in the future, since it accounts for 70% of emerging infectious diseases (EID) and carries a high mortality of 50%-90%. The spread of zoonosis from animal to human is caused by contaminated food, known as foodborne disease. One World One Health, as the conceptual approach to zoonosis prevention, requires cross-disciplinary cooperation to accelerate and streamline the handling of animal-based disease. The E-Vet Smart Rapid System is an integrated innovation in veterinary expert-system applications that is able to facilitate prevention, treatment, and education against pandemic diseases and zoonosis. The system is constructed with a Decision Support System (DSS) method and provides a knowledge database that is expected to facilitate the identification of disease rapidly, precisely, and accurately, as well as the identification of the resulting deduction. Testing was conducted through black-box test cases and a questionnaire (N=30) using a validity and reliability approach. The black-box test cases reveal that the E-Vet Rapid System is able to deliver results in accordance with the system design, and the questionnaire shows that the system is valid (r > 0.361) and reliable (α > 0.3610).
Keywords: diagnosis, disease, expert systems, livestock, zoonosis
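The knowledge base itself is not listed above; purely as an illustration of the rule-based matching that underlies such a DSS, a minimal sketch with invented signs and diseases:

```python
# Minimal rule matcher: each rule maps a set of clinical signs to a candidate
# diagnosis, scored by the fraction of its signs that were observed.
RULES = {
    "foot-and-mouth disease": {"fever", "mouth blisters", "lameness"},
    "anthrax": {"fever", "sudden death", "bleeding"},
    "brucellosis": {"abortion", "fever", "reduced milk"},
}

def diagnose(observed_signs):
    observed = set(observed_signs)
    scores = {disease: len(signs & observed) / len(signs)
              for disease, signs in RULES.items()}
    # Candidates ordered by how completely their rule is satisfied
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

print(diagnose({"fever", "mouth blisters"}))   # the FMD rule scores highest here
```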
Procedia PDF Downloads 455
11190 A Mechanical Diagnosis Method Based on Vibration Fault Signal Down-Sampling and the Improved One-Dimensional Convolutional Neural Network
Authors: Bowei Yuan, Shi Li, Liuyang Song, Huaqing Wang, Lingli Cui
Abstract:
Convolutional neural networks (CNN) have received extensive attention in the field of fault diagnosis. Many fault diagnosis methods use CNN for fault type identification. However, when the amount of raw data collected by sensors is massive, the neural network needs to perform a time-consuming classification task. In this paper, a mechanical fault diagnosis method based on vibration signal down-sampling and the improved one-dimensional convolutional neural network is proposed. Through the robust principal component analysis, the low-rank feature matrix of a large amount of raw data can be separated, and then down-sampling is realized to reduce the subsequent calculation amount. In the improved one-dimensional CNN, a smaller convolution kernel is used to reduce the number of parameters and computational complexity, and regularization is introduced before the fully connected layer to prevent overfitting. In addition, the multi-connected layers can better generalize classification results without cumbersome parameter adjustments. The effectiveness of the method is verified by monitoring the signal of the centrifugal pump test bench, and the average test accuracy is above 98%. When compared with the traditional deep belief network (DBN) and support vector machine (SVM) methods, this method has better performance.
Keywords: fault diagnosis, vibration signal down-sampling, 1D-CNN
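The paper's exact network is not specified above; as a generic sketch of a compact one-dimensional CNN classifier for vibration segments (all layer sizes are assumptions), in PyTorch:

```python
import torch
import torch.nn as nn

class Small1DCNN(nn.Module):
    """Compact 1D CNN: small kernels, dropout regularization before the head."""
    def __init__(self, n_classes=4, segment_len=1024):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool1d(4),
            nn.Conv1d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool1d(4),
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Dropout(0.5),                          # regularization before the FC layers
            nn.Linear(32 * (segment_len // 16), 64), nn.ReLU(),
            nn.Linear(64, n_classes),
        )

    def forward(self, x):                             # x: (batch, 1, segment_len)
        return self.classifier(self.features(x))

model = Small1DCNN()
dummy = torch.randn(8, 1, 1024)        # eight down-sampled vibration segments
print(model(dummy).shape)              # torch.Size([8, 4])
```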
Procedia PDF Downloads 131
11189 Transforming Emergency Care: Revolutionizing Obstetrics and Gynecology Operations for Enhanced Excellence
Authors: Lolwa Alansari, Hanen Mrabet, Kholoud Khaled, Abdelhamid Azhaghdani, Sufia Athar, Aska Kaima, Zaineb Mhamdia, Zubaria Altaf, Almunzer Zakaria, Tamara Alshadafat
Abstract:
Introduction: The Obstetrics and Gynecology Emergency Department at Alwakra Hospital has faced significant challenges, which have been further worsened by the impact of the COVID-19 pandemic. These challenges involve issues such as overcrowding, extended wait times, and a notable surge in demand for emergency care services. Moreover, prolonged waiting times have emerged as a primary factor contributing to situations where patients leave without receiving attention, known as left without being seen (LWBS), and unexpectedly abscond. Addressing the issue of insufficient patient mobility in the obstetrics and gynecology emergency department has brought about substantial improvements in patient care, healthcare administration, and overall departmental efficiency. These changes have not only alleviated overcrowding but have also elevated the quality of emergency care, resulting in higher patient satisfaction, better outcomes, and operational rewards. Methodology: The COVID-19 pandemic has served as a catalyst for substantial transformations in the obstetrics and gynecology emergency, aligning seamlessly with the strategic direction of Hamad Medical Corporation (HMC). The fundamental aim of this initiative is to revolutionize the operational efficiency of the OB-GYN ED. To accomplish this mission, a range of transformations has been initiated, focusing on essential areas such as digitizing systems, optimizing resource allocation, enhancing budget efficiency, and reducing overall costs. The project utilized the Plan-Do-Study-Act (PDSA) model, involving a diverse team collecting baseline data and introducing throughput improvements. Post-implementation data and feedback were analysed, leading to the integration of effective interventions into standard procedures. These interventions included optimized space utilization, real-time communication, bedside registration, technology integration, pre-triage screening, enhanced communication and patient education, consultant presence, and a culture of continuous improvement. These strategies significantly reduced waiting times, enhancing both patient care and operational efficiency. Results: Results demonstrated a substantial reduction in overall average waiting time, dropping from 35 to approximately 14 minutes by August 2023. The wait times for priority 1 cases have been reduced from 22 to 0 minutes, and for priority 2 cases, the wait times have been reduced from 32 to approximately 13.6 minutes. The proportion of patients spending less than 8 hours in the OB ED observation beds rose from 74% in January 2022 to over 98% in 2023. Notably, there was a remarkable decrease in LWBS and absconded patient rates from 2020 to 2023. Conclusion: The project initiated a profound change in the department's operational environment. Efficiency became deeply embedded in the unit's culture, promoting teamwork among staff that went beyond the project's original focus and had a positive influence on operations in other departments. This effectiveness not only made processes more efficient but also resulted in significant cost reductions for the hospital. These cost savings were achieved by reducing wait times, which in turn led to fewer prolonged patient stays and reduced the need for additional treatments. 
These continuous improvement initiatives have now become an integral part of the Obstetrics and Gynecology Division's standard operating procedures, ensuring that the positive changes brought about by the project persist and evolve over time.
Keywords: overcrowding, waiting time, person centered care, quality initiatives
Procedia PDF Downloads 65
11188 A Rapid Reinforcement Technique for Columns by Carbon Fiber/Epoxy Composite Materials
Authors: Faruk Elaldi
Abstract:
There are many concrete columns and beams in our cities. These columns are mostly exposed to aggressive environmental conditions and earthquakes, and they deteriorate over time under sand, wind, humidity, and other external actions. After a while, these beams and columns need to be repaired. Within the scope of this study, concrete column samples were designed and fabricated to be strengthened with carbon fiber reinforced composite materials and with conventional concrete encapsulation, and they were then subjected to an axial compression test to determine their load-carrying performance before column failure. In the first stage of this study, the concrete column and mold designs were completed for a certain load-carrying capacity. Later, the columns were exposed to environmental deterioration in order to reduce their load-carrying capacity. To reinforce these damaged columns, two methods were applied: "concrete encapsulation" and "wrapping with carbon fiber/epoxy" material. In the second stage of the study, the reinforced columns were subjected to the axial compression test and the results obtained were analyzed. Cost and load-carrying performance comparisons were made, and it was found that even though the carbon fiber/epoxy reinforcement method is more expensive, it provides higher load-carrying capacity and reduces the reinforcement processing period.
Keywords: column reinforcement, composite, earthquake, carbon fiber reinforced
Procedia PDF Downloads 184
11187 Can Exams Be Shortened? Using a New Empirical Approach to Test in Finance Courses
Authors: Eric S. Lee, Connie Bygrave, Jordan Mahar, Naina Garg, Suzanne Cottreau
Abstract:
Marking exams is universally detested by lecturers. Final exams in many higher education courses often last 3.0 hrs. Do exams really need to be so long? Can we justifiably reduce the number of questions on them? Surprisingly few have researched these questions, arguably because of the complexity and difficulty of using traditional methods. To answer these questions empirically, we used a new approach based on three key elements: use of an unusual variation of a true experimental design, equivalence hypothesis testing, and an expanded set of six psychometric criteria to be met by any shortened exam if it is to replace a current 3.0-hr exam (reliability, validity, justifiability, number of exam questions, correspondence, and equivalence). We compared student performance on each official 3.0-hr exam with that on five shortened exams having proportionately fewer questions (2.5, 2.0, 1.5, 1.0, and 0.5 hours) in a series of four experiments conducted in two classes in each of two finance courses (224 students in total). We found strong evidence that, in these courses, shortening of final exams to 2.0 hrs was warranted on all six psychometric criteria. Shortening these exams by one hour should result in a substantial one-third reduction in lecturer time and effort spent marking, lower student stress, and more time for students to prepare for other exams. Our approach provides a relatively simple, easy-to-use methodology that lecturers can use to examine the effect of shortening their own exams.
Keywords: exam length, psychometric criteria, synthetic experimental designs, test length
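The equivalence-testing element can be illustrated with a two one-sided tests (TOST) procedure; a minimal sketch on hypothetical percentage scores from the 3.0-hr and 2.0-hr versions (the ±5-point margin is an assumption, not the study's criterion):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
full_exam = rng.normal(68, 12, 112)    # hypothetical % scores, 3.0-hr exam
short_exam = rng.normal(67, 12, 112)   # hypothetical % scores, 2.0-hr exam
margin = 5.0                            # equivalence margin in percentage points

# TOST: reject both one-sided nulls => the two exam lengths perform equivalently
_, p_low = stats.ttest_ind(short_exam, full_exam - margin, alternative="greater")
_, p_high = stats.ttest_ind(short_exam, full_exam + margin, alternative="less")
p_tost = max(p_low, p_high)
print(f"TOST p = {p_tost:.4f} -> {'equivalent' if p_tost < 0.05 else 'not shown equivalent'}")
```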
Procedia PDF Downloads 272
11186 Effects of in silico (Virtual Lab) and in vitro (inside the Classroom) Labs in the Academic Performance of Senior High School Students in General Biology
Authors: Mark Archei O. Javier
Abstract:
The Fourth Industrial Revolution (FIR) is a major industrial era characterized by the fusion of technologies that is blurring the lines between the physical, digital, and biological spheres. Since this era teaches us how to thrive in a fast-paced, developing world, it is important to be able to adapt. With this, there is a need to make learning and teaching in the bioscience laboratory more challenging and engaging. The goal of the research is to find out whether using in silico and in vitro laboratory activities, compared to the conventional conduct of laboratory activities, would have positive impacts on the academic performance of learners. The potential contribution of the research is that it would improve teachers' methods of delivering content to students for topics that need laboratory activities. This study will develop a method by which teachers can provide learning materials to students. A one-tailed t-test for independent samples was used to determine the significant difference in the pre- and post-test scores of students. The tests of hypotheses were done at a 0.05 level of significance. Based on the results of the study, the gain scores of the experimental group are greater than the gain scores of the control group. This implies that using in silico and in vitro labs for the experimental group is more effective than the conventional method of doing laboratory activities.
Keywords: academic performance, general biology, in silico laboratory, in vivo laboratory, virtual laboratory
Procedia PDF Downloads 189
11185 Extracting Polyhydroxyalkanoates from Waste Sludge of Husbandry Industry Wastewater Treatment Plants
Authors: M. S. Lu, Y. P. Tsai, H. Shu, K. F. Chen, L. L. Lai
Abstract:
This study used the sodium hypochlorite/sodium dodecyl sulfate method to successfully extract polyhydroxyalkanoates (PHAs) from the waste sludge of a husbandry industry wastewater treatment plant. We investigated the optimum operational conditions of three key factors for effectively extracting PHAs from husbandry industry wastewater sludge: the sodium hypochlorite concentration, the liquid-solid ratio, and the reaction time. The experimental results showed the optimum operational conditions for polyhydroxyalkanoate recovery to be as follows: (1) digestion by the sodium hypochlorite/sodium dodecyl sulfate solution at 15% (v/v) hypochlorite concentration, (2) operation at a liquid-solid ratio of 1.25 mL mg-1, and (3) reaction for more than 60 min. Under these conditions, the content of the recovered PHAs was about 53.2±0.66 mg PHAs/g VSS, and the purity of the recovered PHAs was about 78.5±6.91 wt%. The recovered PHAs were further used to produce biodegradable plastics for a decomposition test buried in soils. The decomposition test showed that 66.5% of the biodegradable plastics produced in the study remained after being buried in soils for 49 days. The cost of extracting PHAs is about 10.3 US$/kg PHAs, which is lower than that of PHAs produced by pure culture methods (12-15 US$/kg PHAs).
Keywords: biodegradable plastic, biopolymers, polyhydroxyalkanoates (PHAs), waste sludge
Procedia PDF Downloads 344
11184 Optimizing Parallel Computing Systems: A Java-Based Approach to Modeling and Performance Analysis
Authors: Maher Ali Rusho, Sudipta Halder
Abstract:
The purpose of the study is to develop optimal solutions for models of parallel computing systems using the Java language. During the study, programmes were written for the examined models of parallel computing systems. The result of the parallel sorting code is the output of a sorted array of random numbers. When processing data in parallel, the time spent on processing and the first elements of the list of squared numbers are displayed. When processing requests asynchronously, processing completion messages are displayed for each task with a slight delay. The main results include the development of optimisation methods for algorithms and processes, such as the division of tasks into subtasks, the use of non-blocking algorithms, effective memory management, and load balancing, as well as the construction of diagrams and comparison of these methods by characteristics, including descriptions, implementation examples, and advantages. In addition, various specialised libraries were analysed to improve the performance and scalability of the models. The results of the work performed showed a substantial improvement in response time, bandwidth, and resource efficiency in parallel computing systems. Scalability and load analysis assessments were conducted, demonstrating how the system responds to an increase in data volume or the number of threads. Profiling tools were used to analyse performance in detail and identify bottlenecks in models, which improved the architecture and implementation of parallel computing systems. The obtained results emphasise the importance of choosing the right methods and tools for optimising parallel computing systems, which can substantially improve their performance and efficiency.
Keywords: algorithm optimisation, memory management, load balancing, performance profiling, asynchronous programming.
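The study's programmes are written in Java; purely as a language-neutral sketch of the divide-into-subtasks pattern it describes (chunks sorted in parallel, then merged), shown here in Python:

```python
from concurrent.futures import ProcessPoolExecutor
import heapq
import random

def parallel_sort(data, workers=4):
    """Split the array into chunks, sort the chunks in parallel, merge the results."""
    chunk = max(1, len(data) // workers)
    parts = [data[i:i + chunk] for i in range(0, len(data), chunk)]
    with ProcessPoolExecutor(max_workers=workers) as pool:
        sorted_parts = list(pool.map(sorted, parts))
    return list(heapq.merge(*sorted_parts))

if __name__ == "__main__":
    numbers = [random.randint(0, 10_000) for _ in range(100_000)]
    result = parallel_sort(numbers)
    print(result[:10], result[-1])   # smallest ten values and the maximum
```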
Procedia PDF Downloads 12
11183 Impact of Transitioning to Renewable Energy Sources on Key Performance Indicators and Artificial Intelligence Modules of Data Center
Authors: Ahmed Hossam ElMolla, Mohamed Hatem Saleh, Hamza Mostafa, Lara Mamdouh, Yassin Wael
Abstract:
Artificial intelligence (AI) is reshaping industries, and its potential to revolutionize renewable energy and data center operations is immense. By harnessing AI's capabilities, we can optimize energy consumption, predict fluctuations in renewable energy generation, and improve the efficiency of data center infrastructure. This convergence of technologies promises a future where energy is managed more intelligently, sustainably, and cost-effectively. The integration of AI into renewable energy systems unlocks a wealth of opportunities. Machine learning algorithms can analyze vast amounts of data to forecast weather patterns, solar irradiance, and wind speeds, enabling more accurate energy production planning. AI-powered systems can optimize energy storage and grid management, ensuring a stable power supply even during intermittent renewable generation. Moreover, AI can identify maintenance needs for renewable energy infrastructure, preventing costly breakdowns and maximizing system lifespan. Data centers, which consume substantial amounts of energy, are prime candidates for AI-driven optimization. AI can analyze energy consumption patterns, identify inefficiencies, and recommend adjustments to cooling systems, server utilization, and power distribution. Predictive maintenance using AI can prevent equipment failures, reducing energy waste and downtime. Additionally, AI can optimize data placement and retrieval, minimizing energy consumption associated with data transfer. As AI transforms renewable energy and data center operations, modified Key Performance Indicators (KPIs) will emerge. Traditional metrics like energy efficiency and cost-per-megawatt-hour will continue to be relevant, but additional KPIs focused on AI's impact will be essential. These might include AI-driven cost savings, predictive accuracy of energy generation and consumption, and the reduction of carbon emissions attributed to AI-optimized operations. By tracking these KPIs, organizations can measure the success of their AI initiatives and identify areas for improvement. Ultimately, the synergy between AI, renewable energy, and data centers holds the potential to create a more sustainable and resilient future. By embracing these technologies, we can build smarter, greener, and more efficient systems that benefit both the environment and the economy.
Keywords: data center, artificial intelligence, renewable energy, energy efficiency, sustainability, optimization, predictive analytics, energy consumption, energy storage, grid management, data center optimization, key performance indicators, carbon emissions, resiliency
Procedia PDF Downloads 33
11182 Formulation and Evaluation of Silibinin Loaded PLGA Nanoparticles for Cancer Therapy
Authors: Priya Patel, Paresh Patel, Mihir Raval
Abstract:
Silibinin, a flavanone used as an antimicrotubular agent in the treatment of cancer, was encapsulated in nanoparticles (NPs) of poly(lactide-co-glycolide) (PLGA) polymer using the spray-drying technique. The effects of various experimental parameters were optimized by a Box-Behnken experimental design. Production yield, encapsulation efficiency, and dissolution were studied, along with characterization by scanning electron microscopy, DSC, and FTIR, followed by a bioavailability study. Particle size and zeta potential were evaluated using a Zetatrac particle size analyzer. From the experimental design, it was found that the inlet temperature and polymer concentration influence drug release, while the feed flow rate impacts particle size. Results showed that the spray-drying technique yielded particles of 149 nm, indicating the nanosize range. The small size of the nanoparticles resulted in enhanced cellular entry and greater bioavailability. Entrapment efficiency was found to be between 89.35% and 98.36%. The zeta potential indicates a good stability index for the nanoparticle formulation. The in vitro release studies indicated that the silibinin-loaded PLGA nanoparticles provide controlled drug release over a period of 32 h. Pharmacokinetic studies demonstrated that after oral administration of silibinin-loaded PLGA nanoparticles to rats at a dose of 10 mg/kg, relative bioavailability was enhanced about 8.85-fold compared to a silibinin suspension as control. Hence, this investigation demonstrated the potential of the experimental design in understanding the effect of the formulation variables on the quality of silibinin-loaded PLGA nanoparticles. These results describe an effective strategy for silibinin-loaded PLGA nanoparticles and might provide a promising approach against cancer.
Keywords: silibinin, cancer, nanoparticles, PLGA, bioavailability
Procedia PDF Downloads 429
11181 Nonlinear Multivariable Analysis of CO2 Emissions in China
Authors: Hsiao-Tien Pao, Yi-Ying Li, Hsin-Chia Fu
Abstract:
This paper addresses the impacts of energy consumption, economic growth, financial development, and population size on environmental degradation using grey relational analysis (GRA) for China, where foreign direct investment (FDI) inflows are the proxy variable for financial development. The more recent historical data for the period 2004–2011 are used, because the use of very old data may not be suitable for rapidly developing countries. The results of the GRA indicate that the linkage effects of energy consumption–emissions and GDP–emissions are ranked first and second, respectively. These reveal that energy consumption and economic growth are strongly correlated with emissions. Higher economic growth requires more energy consumption and increases environmental pollution. Likewise, more efficient energy use needs a higher level of economic development. Therefore, policies to improve energy efficiency and create a low-carbon economy can reduce emissions without hurting economic growth. The FDI–emissions linkage is ranked third. This indicates that China does not apply weak environmental regulations to attract inward FDI. Furthermore, China's government should strengthen environmental policy when attracting inward FDI. The population–emissions linkage effect is ranked fourth, implying that population size does not directly affect CO₂ emissions, even though China has the world's largest population and Chinese people are very economical in their use of energy-related products. Overall, energy conservation, efficiency improvement, demand management, and financial development, which aim at curtailing the waste of energy and reducing both energy consumption and emissions without loss of the country's competitiveness, can be adopted by developing economies. GRA is one of the best ways to build a dynamic analysis model with limited data.
Keywords: China, CO₂ emissions, foreign direct investment, grey relational analysis
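A minimal sketch of the grey relational step described above (Deng's grey relational grade with the usual distinguishing coefficient ξ = 0.5; all series below are invented):

```python
import numpy as np

def grey_relational_grades(reference, factors, xi=0.5):
    """Deng's grey relational grade of each factor series w.r.t. the reference."""
    norm = lambda x: (np.asarray(x, float) - np.min(x)) / (np.max(x) - np.min(x))
    ref = norm(reference)
    diffs = {name: np.abs(ref - norm(series)) for name, series in factors.items()}
    d_min = min(d.min() for d in diffs.values())   # global minimum deviation
    d_max = max(d.max() for d in diffs.values())   # global maximum deviation
    return {name: float(np.mean((d_min + xi * d_max) / (d + xi * d_max)))
            for name, d in diffs.items()}

# Hypothetical annual series, 2004-2011
emissions = [5.3, 5.9, 6.5, 7.0, 7.4, 7.9, 8.5, 9.0]
factors = {
    "energy consumption": [2.0, 2.2, 2.5, 2.7, 2.9, 3.1, 3.4, 3.6],
    "GDP":                [1.9, 2.3, 2.8, 3.5, 4.6, 5.1, 6.1, 7.5],
    "FDI inflows":        [0.61, 0.72, 0.73, 0.84, 0.92, 0.90, 1.06, 1.16],
    "population":         [1.30, 1.31, 1.31, 1.32, 1.33, 1.33, 1.34, 1.34],
}
grades = grey_relational_grades(emissions, factors)
print(sorted(grades.items(), key=lambda kv: -kv[1]))   # higher grade = stronger linkage
```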
Procedia PDF Downloads 403
11180 Design and Optimization of a Small Hydraulic Propeller Turbine
Authors: Dario Barsi, Marina Ubaldi, Pietro Zunino, Robert Fink
Abstract:
A design and optimization procedure is proposed and developed to provide the geometry of a high-efficiency compact hydraulic propeller turbine for low head. For the preliminary design of the machine, classic design criteria are used, based on statistical correlations for the definition of the fundamental geometric parameters and the blade shapes. These relationships are based on the fundamental design parameters (i.e., specific speed, flow coefficient, work coefficient) in order to provide a simple yet reliable procedure. Particular attention is paid, from the initial steps, to the correct conformation of the meridional channel and the correct arrangement of the blade rows. The preliminary geometry thus obtained is used as a starting point for the hydrodynamic optimization procedure, carried out using CFD calculation software coupled with a genetic algorithm that generates and updates a large database of turbine geometries. The optimization process is performed using a commercial approach that solves the turbulent Navier-Stokes equations (RANS) by exploiting the axisymmetric geometry of the machine. The geometries generated within the database are therefore calculated in order to determine the corresponding overall performance. In order to speed up the optimization calculation, an artificial neural network (ANN) based on the use of an objective function is employed. The procedure was applied to the specific case of a propeller turbine with an innovative modular design, specific for applications characterized by very low heads. The procedure is tested in order to verify its validity and the ability to automatically obtain the targeted net head and the maximum total-to-total internal efficiency.
Keywords: renewable energy conversion, hydraulic turbines, low head hydraulic energy, optimization design
Procedia PDF Downloads 150
11179 Dual Metal Organic Framework Derived N-Doped Fe3C Nanocages Decorated with Ultrathin ZnIn2S4 Nanosheets for Efficient Photocatalytic Hydrogen Generation
Authors: D. Amaranatha Reddy
Abstract:
Highly efficient and stable co-catalyst materials are of great importance for boosting the separation and transport efficiency of photogenerated charge carriers and for increasing the catalytic reactive sites of semiconductor photocatalysts. As a result, it is of decisive importance to fabricate low-price, noble-metal-free co-catalysts with high catalytic reactivity, but this remains very challenging. Considering this challenge, dual metal-organic framework derived N-doped Fe3C nanocages have been rationally designed and decorated with ultrathin ZnIn2S4 nanosheets for efficient photocatalytic hydrogen generation. The fabrication strategy precisely integrates co-catalyst nanocages with ultrathin two-dimensional (2D) semiconductor nanosheets, providing tightly interconnected nano-junctions that help to suppress the charge carrier recombination rate. Furthermore, the constructed highly porous hybrid structures expose ample active sites for catalytic reduction reactions and harvest visible light more effectively through light scattering. As a result, the fabricated nanostructures exhibit a superior solar-driven hydrogen evolution rate (9600 µmol/g/h) with an apparent quantum efficiency of 3.6%, which is relatively higher than that of Pt noble-metal co-catalyst systems and earlier reported ZnIn2S4-based nanohybrids. We believe that the present work promotes the application of sulfide-based nanostructures in solar-driven hydrogen production.
Keywords: photocatalysis, water splitting, hydrogen fuel production, solar-driven hydrogen
Procedia PDF Downloads 134
11178 The Quality of Working Life and the Organizational Commitment of Municipal Employee in Samut Sakhon Province
Authors: Mananya Meenakorn
Abstract:
This research aims to investigate: (1) the relationship between the quality of working life and the organizational commitment of municipal employees in Samut Sakhon Province; and (2) a comparison of the quality of working life and the organizational commitment of municipal employees in Samut Sakhon Province by gender, age, education, official experience, position, division, and income. This is a quantitative study; data were collected by questionnaires distributed to municipal employees in Samut Sakhon Province, with a sample of 241 obtained by stratified random sampling. Data were analyzed by descriptive statistics, including percentage, mean, and standard deviation, and by inferential statistics, including the t-test, F-test, and Pearson correlation for hypothesis testing. The findings showed that the quality of working life and the organizational commitment of municipal employees in Samut Sakhon Province, in terms of fair compensation, have a positive correlation (r = 0.673). In the comparison of the quality of working life and organizational commitment of municipal employees in Samut Sakhon Province by gender, we found that the overall difference was statistically significant at the 0.05 level; we also found that stability and progress in the career path and characteristics beneficial to society showed a difference that was statistically significant at the 0.01 level, and that participation and social acceptance showed a difference that was statistically significant at the 0.05 level.
Keywords: quality of working life, organizational commitment, municipal employee, Samut Sakhon province
Procedia PDF Downloads 290