Search results for: simple multiple-attribute rating technique
2337 Effect of Injection Moulding Process Parameters on Tensile Strength Using Taguchi Method
Authors: Gurjeet Singh, M. K. Pradhan, Ajay Verma
Abstract:
The plastic industry plays a very important role in the economy of any country, generally accounting for a leading share of it. Since metals and their alloys are only rarely available in the earth, producing plastic products and components, which find application in many industrial as well as household consumer products, is beneficial. About 50% of plastic products are manufactured by the injection moulding process. To produce better quality products, the quality characteristics and performance of the product have to be controlled. The process parameters play a significant role in the production of plastics, hence the control of process parameters is essential. In this paper, the effect of parameter selection on the injection moulding process is described, in order to define suitable parameters for producing a plastic product. Selecting the process parameters by trial and error is neither desirable nor acceptable, as it often tends to increase cost and time. Hence, optimization of the processing parameters of the injection moulding process is essential. The experiments were designed with Taguchi's orthogonal array to achieve the result with the least number of experiments. Here, the plastic material polypropylene is studied. The tensile strength test of the material, produced on an injection moulding machine, is carried out on a universal testing machine. By using the Taguchi technique with the help of MiniTab-14 software, the best values of injection pressure, melt temperature, packing pressure and packing time are obtained. We found that the process parameter packing pressure contributes most to the production of a good tensile plastic product.
Keywords: injection moulding, tensile strength, polypropylene, Taguchi
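The optimisation step described above relies on Taguchi's signal-to-noise (S/N) ratio; the short sketch below (in Python rather than the MiniTab-14 workflow the authors used) shows the larger-the-better S/N calculation on hypothetical tensile-strength data, with all run values assumed for illustration.

```python
import numpy as np

# Hypothetical orthogonal-array results: each row is one experimental run,
# columns are replicate tensile-strength measurements (MPa). Values are illustrative only.
tensile = np.array([
    [30.1, 29.8, 30.4],
    [31.2, 31.0, 30.9],
    [29.5, 29.9, 29.7],
])

# Larger-the-better signal-to-noise ratio used in Taguchi analysis:
# S/N = -10 * log10( mean(1 / y^2) )
sn_ratio = -10.0 * np.log10(np.mean(1.0 / tensile**2, axis=1))

for run, sn in enumerate(sn_ratio, start=1):
    print(f"Run {run}: S/N = {sn:.2f} dB")

# The factor level with the highest mean S/N ratio across its runs is taken
# as the optimum setting for that process parameter (e.g. packing pressure).
```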
Procedia PDF Downloads 290
2336 Developing Laser Spot Position Determination and PRF Code Detection with Quadrant Detector
Authors: Mohamed Fathy Heweage, Xiao Wen, Ayman Mokhtar, Ahmed Eldamarawy
Abstract:
In this paper, we are interested in the modeling, simulation, and measurement of the laser spot position with a quadrant detector. We enhance the detection and tracking of a semi-active laser weapon decoding system based on a microcontroller. The system receives the reflected pulse through the quadrant detector and processes the laser pulses through a processing circuit, with a microcontroller decoding the laser pulse reflected by the target. The seeker accuracy is enhanced by the decoding system, the laser detection time based on the number of received pulses is reduced, and a gate is used to limit the laser pulse width. The model is implemented based on the Pulse Repetition Frequency (PRF) technique with two microcontroller units (MCU). MCU1 generates laser pulses with different codes, and MCU2 decodes the laser code and locks the system at the specific code. The codes are selected based on the two selector switches. The system is implemented and tested in Proteus ISIS software. The full position determination circuit with the detector is implemented. A general system for spot position determination was built with the laser PRF for the incident radiation and a mechanical system for adjusting the setup at different angles. The system test results show that the system can detect the laser code with only three received pulses based on the narrow gate signal, and good agreement between simulated and measured system performance is obtained.
Keywords: four quadrant detector, pulse code detection, laser guided weapons, pulse repetition frequency (PRF), Atmega 32 microcontrollers
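For readers interested in the spot-position step, the sketch below shows the conventional sum-and-difference estimate of spot displacement from the four quadrant signals; it is a generic illustration under assumed signal values, not the authors' microcontroller implementation.

```python
def quadrant_spot_position(qa, qb, qc, qd, k=1.0):
    """Estimate laser spot displacement from the four quadrant signals.

    qa..qd are the photocurrents of quadrants A (top-right), B (top-left),
    C (bottom-left) and D (bottom-right); k is a calibration factor that maps
    the normalised imbalance to a physical displacement.
    """
    total = qa + qb + qc + qd
    if total == 0:
        raise ValueError("no signal on the detector")
    x = k * ((qa + qd) - (qb + qc)) / total   # left/right imbalance
    y = k * ((qa + qb) - (qc + qd)) / total   # up/down imbalance
    return x, y

# Example: spot slightly up and to the right of centre
print(quadrant_spot_position(1.2, 1.0, 0.8, 1.0))
```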
Procedia PDF Downloads 395
2335 A Comparative Analysis Approach Based on Fuzzy AHP, TOPSIS and PROMETHEE for the Selection Problem of GSCM Solutions
Authors: Omar Boutkhoum, Mohamed Hanine, Abdessadek Bendarag
Abstract:
Sustainable economic growth is nowadays driving firms toward the adoption of many green supply chain management (GSCM) solutions. However, the evaluation and selection of these solutions is a matter of concern that requires very serious decisions, involving complexity owing to the presence of various associated factors. To resolve this problem, a comparative analysis approach based on multi-criteria decision-making methods is proposed for the adequate evaluation of sustainable supply chain management solutions. In the present paper, we propose an integrated decision-making model based on FAHP (Fuzzy Analytic Hierarchy Process), TOPSIS (Technique for Order of Preference by Similarity to Ideal Solution) and PROMETHEE (Preference Ranking Organisation METHod for Enrichment Evaluations) to contribute to a better understanding and development of new sustainable strategies for industrial organizations. Due to the varied importance of the selected criteria, FAHP is used to identify the evaluation criteria and assign the importance weights for each criterion, while the TOPSIS and PROMETHEE methods employ these weighted criteria as inputs to evaluate and rank the alternatives. The main objective is to provide a comparative analysis based on the TOPSIS and PROMETHEE processes to help make sound and reasoned decisions related to the selection problem of GSCM solutions.
Keywords: GSCM solutions, multi-criteria analysis, decision support system, TOPSIS, FAHP, PROMETHEE
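As an illustration of the TOPSIS ranking stage that consumes the FAHP-derived weights, the following sketch applies the standard TOPSIS steps to a small hypothetical set of GSCM alternatives; the scores and weights are assumptions, not data from the paper.

```python
import numpy as np

def topsis(decision_matrix, weights, benefit):
    """Rank alternatives with TOPSIS.

    decision_matrix: alternatives x criteria scores
    weights: criteria weights (e.g. obtained from FAHP), summing to 1
    benefit: True for benefit criteria, False for cost criteria
    """
    m = np.asarray(decision_matrix, dtype=float)
    w = np.asarray(weights, dtype=float)

    norm = m / np.sqrt((m**2).sum(axis=0))          # vector normalisation
    v = norm * w                                    # weighted normalised matrix

    ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
    anti_ideal = np.where(benefit, v.min(axis=0), v.max(axis=0))

    d_plus = np.sqrt(((v - ideal)**2).sum(axis=1))        # distance to ideal
    d_minus = np.sqrt(((v - anti_ideal)**2).sum(axis=1))  # distance to anti-ideal

    closeness = d_minus / (d_plus + d_minus)
    return closeness  # higher = better ranked GSCM solution

# Three hypothetical GSCM solutions evaluated on four criteria
scores = [[7, 5, 3, 8], [6, 7, 4, 6], [8, 4, 5, 7]]
weights = [0.4, 0.3, 0.2, 0.1]
benefit = np.array([True, True, False, True])  # third criterion is a cost criterion
print(topsis(scores, weights, benefit))
```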
Procedia PDF Downloads 166
2334 Developing a Spatial Decision Support System for Rationality Assessment of Land Use Planning Locations in Thai Binh Province, Vietnam
Authors: Xuan Linh Nguyen, Tien Yin Chou, Yao Min Fang, Feng Cheng Lin, Thanh Van Hoang, Yin Min Huang
Abstract:
In Vietnam, land use planning is the most important and powerful tool of the government for sustainable land use and land management. Nevertheless, many land use planning locations are facing protests from surrounding households due to environmental impacts. In addition, locations are planned based entirely on the subjective decisions of planners, unsupported by tools or scientific methods. Hence, this research aims to assist decision-makers in evaluating the rationality of planning locations by developing a Spatial Decision Support System (SDSS) using Geographic Information System (GIS)-based technology, the Analytic Hierarchy Process (AHP) multi-criteria technique and Fuzzy set theory. An ArcGIS Desktop add-in named SDSS-LUPA was developed to support users in analyzing data and presenting results in a friendly format. The Fuzzy-AHP method has been utilized as the analytic model for this SDSS. There are 18 planned locations in Hung Ha district (Thai Binh province, Vietnam) as a case study. The experimental results indicated that the assessment threshold was higher than 0.65, while the 18 planned locations were irrational because they were close to residential areas or to water sources. Some potential sites were also proposed to the authorities for consideration of land use planning changes.
Keywords: analytic hierarchy process, fuzzy set theory, land use planning, spatial decision support system
Procedia PDF Downloads 383
2333 Photoluminescence and Spectroscopic Studies of Tm3+ Ions Doped Lead Tungsten Tellurite Glasses for Visible Red and Near-IR Laser Applications
Authors: M. Venkateswarlu, Srinivasa Rao Allam, S. K. Mahamuda, K. Swapna, G. Vijaya Prakash
Abstract:
Lead Tungsten Tellurite (LTT) glasses doped with different concentrations of Tm3+ ions were prepared by using the melt quenching technique and characterized through optical absorption, photoluminescence and decay spectral studies to assess the feasibility of using these glasses as luminescent devices in the visible red and NIR regions. By using the optical absorption spectral data, the energy band gaps for all the glasses were evaluated and were found to be in the range of 2.34-2.59 eV, which is very useful for the construction of optical devices. Judd-Ofelt (J-O) theory has been applied to the optical absorption spectral profiles to calculate the J-O intensity parameters Ωλ (λ = 2, 4 and 6), which were subsequently used to evaluate various radiative properties such as the radiative transition probability (AR), radiative lifetimes (τR) and branching ratios (βR) for the prominent luminescent levels. The luminescence spectra of all the LTT glass samples show two intense peaks in the bright red and near-infrared regions at 650 nm (1G4→3F4) and 800 nm (3H4→3H6), respectively, for which the effective bandwidths (ΔλP), experimental branching ratios (βexp) and stimulated emission cross-sections (σse) are evaluated. The decay profiles for all the glasses were also recorded to measure the quantum efficiency of the prepared LTT glasses by coupling the radiative and experimental lifetimes. From the measured emission cross-sections, quantum efficiency and CIE chromaticity coordinates, it was found that the LTT glass doped with 0.5 mol% of Tm3+ ions is most suitable for generating bright visible red and NIR lasers operating at 650 and 800 nm, respectively.
Keywords: glasses, JO parameters, optical materials, thulium
Procedia PDF Downloads 255
2332 Design of a Virtual Reality System for Children with Developmental Coordination Disorder
Authors: Ya-Ju Ju, Li-Chen Yang, Yi-Chun Du, Rong-Ju Cherng
Abstract:
Introduction: It is estimated that 5-6% of school-aged children may be diagnosed with developmental coordination disorder (DCD). Children with DCD are characterized by motor skill difficulties which cannot be explained by any medical or intellectual reasons. Such motor difficulties limit children's participation in sports activities, further affect their physical fitness, cardiopulmonary function and balance, and may lead to obesity. The purpose of the project was to develop an exergaming system for children with DCD aiming to improve their physical fitness, cardiopulmonary function and balance ability. Methods: This study took five steps to build up the system: system planning, task selection, task programming, system integration and usability testing. The system adopted virtual reality techniques to integrate self-developed training programs. The training programs were developed through brainstorming among team members and after a literature review. The tasks selected for training in the system were a combination of fundamental movement and motor skills. Results and Discussion: Based on the theory of motor development, we designed the training tasks from easy ones to hard ones, from single tasks to dual tasks. The tasks included walking, sit-to-stand, jumping, kicking, weight shifting, side jumping and their combinations. A preliminary study showed that the tasks presented an order of development. Further study is needed to examine the system's effect on motor skill and cardiovascular fitness in children with DCD.
Keywords: virtual reality, virtual reality system, developmental coordination disorder, children
Procedia PDF Downloads 116
2331 The Phenomenology in the Music of Debussy through Inspiration of Western and Oriental Culture
Authors: Yu-Shun Elisa Pong
Abstract:
Music aesthetics related to phenomenology is rarely discussed and still in the ascendant, while multi-dimensional discourses of philosophy emerged as an important trend in the 20th century. In the present study, a basic theory of phenomenology from Edmund Husserl (1859-1938) is presented and discussed, followed by an introduction to the concepts of intentionality, eidetic reduction, horizon, world, and inter-subjectivity. Further, the phenomenology of music and general art is brought to attention through the introduction of Roman Ingarden's The Work of Music and the Problems of its Identity (1933) and Mikel Dufrenne's The Phenomenology of Aesthetic Experience (1953). Finally, Debussy's music is analyzed and discussed from the perspective of phenomenology. Phenomenology is not so much a methodology or an analytic framework as a common belief: that is, to describe in as much detail as possible the variety of human experience relative to the intended object. Such an idea had been practiced in various guises for centuries, but only in the early 20th century was phenomenology refined through the works of Husserl, Heidegger, Sartre, Merleau-Ponty and others. Debussy was born in an age when Western society began to accept a multi-cultural baptism. With his unusual sensitivity to oriental culture, Debussy presented considerable inspiration, absorption, and echoes of it in his musical works. In fact, his relationship with nature is far from echoing the idea of the ancient Chinese literati and nature. Although he is not the first composer to associate music with human beings and nature, the unique quality and impact of his works enable him to become a significant figure in music aesthetics. Debussy's music tried to develop a quality analogous to nature and, more importantly, was based on vivid life experience and artistic transformation to achieve the realm of pure art. Such an idea, that life experience comes before artwork, either clear or vague, simple or complex, and later presented abstractly in his late works, is still an interesting subject worth further discussion. Debussy's music has existed for about a century or more. It has received musicology researchers' attention as much as other important works in the history of Western music. Among the pluralistic discussions of Debussy's art and ideas, phenomenological aesthetics has opened new ideas and viewpoints from which to re-examine his great works and has even given some previous arguments legitimacy. Overall, this article provides a new insight into Debussy's music from a phenomenological exploration, and it is believed that phenomenology will be an important pathway in the research of music aesthetics.
Keywords: Debussy's music, music aesthetics, oriental culture, phenomenology
Procedia PDF Downloads 279
2330 In-Farm Wood Gasification Energy Micro-Generation System in Brazil: A Monte Carlo Viability Simulation
Authors: Erich Gomes Schaitza, Antônio Francisco Savi, Glaucia Aparecida Prates
Abstract:
The penetration of renewable energy into the electricity supply in Brazil is high, one of the highest in the world. Centralized hydroelectric generation is the main source of energy, followed by biomass and wind. Surprisingly, mini and micro-generation are negligible, with less than 2,000 connections to the national grid. In 2015, a new regulatory framework was put in place to change this situation. In the agricultural sector, the framework was complemented by the offer of low interest rate loans for in-farm renewable generation. Brazil proposed to more than double its area of planted forests as part of its Intended Nationally Determined Contributions (INDC) to the U.N. Framework Convention on Climate Change (UNFCCC). This is an ambitious target which will be achieved only if forests are attractive to farmers. Therefore, this paper analyses whether planting forests for in-farm energy generation with a woodchip gasifier is economically viable for micro-generation under the new framework, and whether they could be an economic driver for forest plantation. At first, a static case was analyzed with data from Eucalyptus plantations on five farms. Then, a broader analysis was developed with the use of the Monte Carlo technique. Planting short rotation forests to generate energy could be a viable alternative, and the low interest loans contribute to that. There are some barriers to such systems, such as the absence of a mature market for small-scale equipment and of a reference network of good practices and examples.
Keywords: biomass, distributed generation, small-scale, Monte Carlo
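A minimal sketch of the kind of Monte Carlo viability simulation mentioned above is given below; the cash-flow structure, distributions and parameter values are all assumptions made for illustration and are not the study's inputs.

```python
import numpy as np

rng = np.random.default_rng(42)
n_runs = 10_000
years = 15                      # assumed project horizon
discount_rate = 0.08            # assumed real discount rate

# Illustrative input distributions (all values are assumptions, not the study's data)
capex = rng.normal(60_000, 8_000, n_runs)              # gasifier + grid connection (USD)
annual_energy = rng.triangular(50, 70, 90, n_runs)     # MWh/year generated
tariff = rng.normal(90, 15, n_runs)                    # USD/MWh avoided or credited
opex = rng.normal(2_500, 500, n_runs)                  # USD/year, incl. forest management

annual_cashflow = annual_energy * tariff - opex
annuity_factor = (1 - (1 + discount_rate) ** -years) / discount_rate
npv = -capex + annual_cashflow * annuity_factor        # net present value per run

print(f"Mean NPV: {npv.mean():,.0f} USD")
print(f"P(NPV > 0): {(npv > 0).mean():.1%}")
```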
Procedia PDF Downloads 289
2329 Perceived Effects of Work-Family Balance on Employee’s Job Satisfaction among Extension Agents in Southwest Nigeria
Authors: B. G. Abiona, A. A. Onaseso, T. D. Odetayo, J. Yila, O. E. Fapojuwo, K. G. Adeosun
Abstract:
This study determines the perceived effects of work-family balance on employees' job satisfaction among Extension Agents in the Agricultural Development Programme (ADP) in southwest Nigeria. A multistage sampling technique was used to select 256 respondents for the study. Data on personal characteristics, the work-family balance domain, and job satisfaction were collected. The collected data were analysed using descriptive statistics, Chi-square, Pearson Product Moment Correlation (PPMC), multiple linear regression, and the Student t-test. Results revealed that the mean age of the respondents was 40 years; the majority (59.3%) of the respondents were male, and slightly above half (51.6%) of the respondents had an MSc as their highest academic qualification. Findings revealed that turnover intention (x̄ = 3.20) and work-role conflict (x̄ = 3.06) were the major perceived work-family balance domains in the studied areas. Further, the results showed that the respondents have a high (79%) level of job satisfaction. Multiple linear regression revealed that job involvement (β = 0.167, p < 0.01) and work-role conflict (β = -0.221, p < 0.05) contributed significantly to employees' level of job satisfaction. The results of the Student t-test revealed a significant difference in the perceived work-family balance domain (t = 0.43, p < 0.05) between the two studied areas. The study concluded that work-role conflict among employees causes work-family imbalance and, therefore, negatively affects employees' job satisfaction. A definition of job design among the respondents that will create a balance between work and family is highly recommended.
Keywords: work-life, conflict, job satisfaction, extension agent
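The sketch below illustrates the multiple linear regression and Student t-test steps described above on simulated data (generated only to mirror the reported signs of the coefficients); variable names and values are assumptions, not the survey data.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from scipy import stats

rng = np.random.default_rng(0)
n = 256  # sample size matching the study; all values below are simulated

df = pd.DataFrame({
    "job_involvement": rng.normal(3.0, 0.5, n),
    "work_role_conflict": rng.normal(3.1, 0.6, n),
    "state": rng.choice(["A", "B"], n),
})
# Simulated satisfaction consistent with the reported signs of the coefficients
df["job_satisfaction"] = (3.5 + 0.17 * df["job_involvement"]
                          - 0.22 * df["work_role_conflict"]
                          + rng.normal(0, 0.4, n))

# Multiple linear regression of job satisfaction on work-family balance domains
model = smf.ols("job_satisfaction ~ job_involvement + work_role_conflict", data=df).fit()
print(model.summary())

# Independent-samples t-test comparing the two studied areas
a = df.loc[df.state == "A", "work_role_conflict"]
b = df.loc[df.state == "B", "work_role_conflict"]
print(stats.ttest_ind(a, b))
```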
Procedia PDF Downloads 97
2328 Assessment of Bisphenol A and 17 α-Ethinyl Estradiol Bioavailability in Soils Treated with Biosolids
Authors: I. Ahumada, L. Ascar, C. Pedraza, J. Montecino
Abstract:
It has been found that the addition of biosolids to soil is beneficial to soil health, enriching the soil with essential nutrient elements. Although this sludge has properties that allow for the improvement of the physical features and productivity of agricultural and forest soils and the recovery of degraded soils, it also contains trace elements, organic trace compounds and pathogens that can cause damage to the environment. The application of these biosolids to land without total reclamation, as well as of treated wastewater, can transfer these compounds into terrestrial and aquatic environments, giving rise to potential accumulation in plants. The general aim of this study was to evaluate the bioavailability of bisphenol A (BPA) and 17 α-ethynyl estradiol (EE2) in a soil-biosolid system using wheat (Triticum aestivum) plant assays and a predictive extraction method using a solution of hydroxypropyl-β-cyclodextrin (HPCD), to determine if the latter is a reliable surrogate for the bioassay. Two soils were obtained from the central region of Chile (Lo Prado and Chicauma). Biosolids were obtained from a regional wastewater treatment plant. The soils were amended with biosolids at 90 Mg ha-1. Soils treated with biosolids and spiked with 10 mg kg-1 of EE2 and with 15 mg kg-1 and 30 mg kg-1 of BPA were also included. The BPA and EE2 concentrations were determined in biosolids, soils and plant samples through ultrasound-assisted extraction, solid phase extraction (SPE) and gas chromatography coupled to mass spectrometry (GC/MS). The bioavailable fraction found in each of the soils cultivated with wheat plants was compared with the results obtained through the cyclodextrin biosimulator method. The total concentrations found in the biosolid from the treatment plant were 0.150 ± 0.064 mg kg-1 and 12.8 ± 2.9 mg kg-1 of EE2 and BPA, respectively. BPA and EE2 bioavailability is affected by the organic matter content and the physical and chemical properties of the soil. The bioavailability response of both compounds in the two soils varied with the EE2 and BPA concentrations. In the case of EE2, the bioavailability in wheat plant crops showed higher concentrations in the roots than in the shoots, and the concentration of EE2 increased with increasing biosolid rate. On the other hand, for BPA, a higher concentration was found in the shoots than in the roots of the plants. The predictive capability of the HPCD extraction was assessed using a simple linear correlation test for both compounds in wheat plants. The correlation coefficient between the EE2 obtained from the HPCD extraction and that obtained from the wheat plants was r = 0.99 with p-value ≤ 0.05. On the other hand, in the case of BPA, no correlation was found. Therefore, the methodology was validated with respect to wheat plant bioassays only in the EE2 case. Acknowledgments: The authors thank FONDECYT 1150502.
Keywords: emerging compounds, bioavailability, biosolids, endocrine disruptors
Procedia PDF Downloads 149
2327 Cancer Survivor’s Adherence to Healthy Lifestyle Behaviours; Meeting the World Cancer Research Fund/American Institute of Cancer Research Recommendations, a Systematic Review and Meta-Analysis
Authors: Daniel Nigusse Tollosa, Erica James, Alexis Hurre, Meredith Tavener
Abstract:
Introduction: Lifestyle behaviours such as a healthy diet, regular physical activity and maintaining a healthy weight are essential for cancer survivors to improve quality of life and longevity. However, there is no study that synthesizes cancer survivors' adherence to healthy lifestyle recommendations. The purpose of this review was to collate existing data on the prevalence of adherence to healthy behaviours and produce pooled estimates among adult cancer survivors. Method: Multiple databases (Embase, Medline, Scopus, Web of Science and Google Scholar) were searched for relevant articles published since 2007 reporting cancer survivors' adherence to more than two lifestyle behaviours based on the WCRF/AICR recommendations. The pooled prevalence of adherence to single and multiple behaviours (operationalized as adherence to more than 75% (3/4) of the health behaviours included in a particular study) was calculated using a random effects model. Subgroup analyses of adherence to multiple behaviours were undertaken corresponding to the mean survival years and year of publication. Results: A total of 3322 articles were retrieved through our search strategies. Of these, 51 studies matched our inclusion criteria, presenting data from 2,620,586 adult cancer survivors. The highest prevalence of adherence was observed for smoking (pooled estimate: 87%, 95% CI: 85%, 88%) and alcohol intake (pooled estimate: 83%, 95% CI: 81%, 86%), and the lowest was for fiber intake (pooled estimate: 31%, 95% CI: 21%, 40%). Thirteen studies reported the proportion of cancer survivors adhering (all using a simple summative index method) to multiple healthy behaviours, whereby the prevalence of adherence ranged from 7% to 40% (pooled estimate: 23%, 95% CI: 17% to 30%). Subgroup analysis suggests that short-term survivors (< 5 years survival time) had relatively better adherence to multiple behaviours (pooled estimate: 31%, 95% CI: 27%, 35%) than long-term (> 5 years survival time) cancer survivors (pooled estimate: 25%, 95% CI: 14%, 36%). Pooling of estimates according to the year of publication (since 2007) also suggests an increasing trend in adherence to multiple behaviours over time. Conclusion: Overall, adherence to multiple lifestyle behaviours was poor (not satisfactory), and it is a relatively greater concern for long-term than for short-term cancer survivors. Cancer survivors need to comply with the healthy lifestyle recommendations related to physical activity, fruit and vegetable, fiber, red/processed meat and sodium intake.
Keywords: adherence, lifestyle behaviours, cancer survivors, WCRF/AICR
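The random-effects pooling mentioned in the Method section can be sketched as follows; this is a generic DerSimonian-Laird illustration on hypothetical study counts (real meta-analyses of proportions often pool transformed proportions), not the review's actual computation.

```python
import numpy as np

def pooled_prevalence_dl(events, totals):
    """DerSimonian-Laird random-effects pooling of prevalences (illustrative sketch)."""
    events = np.asarray(events, float)
    totals = np.asarray(totals, float)
    p = events / totals
    var = p * (1 - p) / totals           # within-study variance of a proportion
    w = 1 / var                          # fixed-effect weights

    p_fixed = np.sum(w * p) / np.sum(w)
    q = np.sum(w * (p - p_fixed) ** 2)   # Cochran's Q
    dof = len(p) - 1
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - dof) / c)       # between-study variance

    w_star = 1 / (var + tau2)            # random-effects weights
    p_re = np.sum(w_star * p) / np.sum(w_star)
    se = np.sqrt(1 / np.sum(w_star))
    return p_re, p_re - 1.96 * se, p_re + 1.96 * se

# Hypothetical counts of survivors adhering to >75% of the recommendations
events = [70, 120, 45, 300, 90]
totals = [1000, 800, 150, 1200, 400]
print(pooled_prevalence_dl(events, totals))
```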
Procedia PDF Downloads 185
2326 Towards Creative Movie Title Generation Using Deep Neural Models
Authors: Simon Espigolé, Igor Shalyminov, Helen Hastie
Abstract:
Deep machine learning techniques including deep neural networks (DNN) have been used to model language and dialogue for conversational agents to perform tasks, such as giving technical support and also for general chit-chat. They have been shown to be capable of generating long, diverse and coherent sentences in end-to-end dialogue systems and natural language generation. However, these systems tend to imitate the training data and will only generate the concepts and language within the scope of what they have been trained on. This work explores how deep neural networks can be used in a task that would normally require human creativity, whereby the human would read the movie description and/or watch the movie and come up with a compelling, interesting movie title. This task differs from simple summarization in that the movie title may not necessarily be derivable from the content or semantics of the movie description. Here, we train a type of DNN called a sequence-to-sequence model (seq2seq) that takes as input a short textual movie description and some information on e.g. genre of the movie. It then learns to output a movie title. The idea is that the DNN will learn certain techniques and approaches that the human movie titler may deploy that may not be immediately obvious to the human eye. To give an example of a generated movie title, for the movie synopsis: ‘A hitman concludes his legacy with one more job, only to discover he may be the one getting hit.’; the original, true title is ‘The Driver’ and the one generated by the model is ‘The Masquerade’. A human evaluation was conducted where the DNN output was compared to the true human-generated title, as well as a number of baselines, on three 5-point Likert scales: ‘creativity’, ‘naturalness’ and ‘suitability’. Subjects were also asked which of the two systems they preferred. The scores of the DNN model were comparable to the scores of the human-generated movie title, with means m=3.11, m=3.12, respectively. There is room for improvement in these models as they were rated significantly less ‘natural’ and ‘suitable’ when compared to the human title. In addition, the human-generated title was preferred overall 58% of the time when pitted against the DNN model. These results, however, are encouraging given the comparison with a highly-considered, well-crafted human-generated movie title. Movie titles go through a rigorous process of assessment by experts and focus groups, who have watched the movie. This process is in place due to the large amount of money at stake and the importance of creating an effective title that captures the audiences’ attention. Our work shows progress towards automating this process, which in turn may lead to a better understanding of creativity itself.
Keywords: creativity, deep machine learning, natural language generation, movies
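A minimal sequence-to-sequence sketch of the kind of model described above is shown below, using a small GRU encoder-decoder in PyTorch on random token ids; the architecture, vocabulary sizes and data are assumptions for illustration, not the authors' trained system.

```python
import torch
import torch.nn as nn

# Minimal encoder-decoder sketch: encode a synopsis (token ids) and decode a title.
# Vocabulary sizes, dimensions and data are placeholders, not the authors' setup.
SRC_VOCAB, TGT_VOCAB, EMB, HID = 5000, 3000, 128, 256

class Seq2Seq(nn.Module):
    def __init__(self):
        super().__init__()
        self.src_emb = nn.Embedding(SRC_VOCAB, EMB)
        self.tgt_emb = nn.Embedding(TGT_VOCAB, EMB)
        self.encoder = nn.GRU(EMB, HID, batch_first=True)
        self.decoder = nn.GRU(EMB, HID, batch_first=True)
        self.out = nn.Linear(HID, TGT_VOCAB)

    def forward(self, synopsis_ids, title_ids):
        _, h = self.encoder(self.src_emb(synopsis_ids))         # summary of the synopsis
        dec_out, _ = self.decoder(self.tgt_emb(title_ids), h)   # teacher forcing
        return self.out(dec_out)                                # logits over title tokens

model = Seq2Seq()
criterion = nn.CrossEntropyLoss()

# One toy training step on random data just to show the shapes involved
synopsis = torch.randint(0, SRC_VOCAB, (4, 60))   # batch of 4 synopses, 60 tokens each
title_in = torch.randint(0, TGT_VOCAB, (4, 8))    # title tokens shifted right
title_gold = torch.randint(0, TGT_VOCAB, (4, 8))  # target title tokens

logits = model(synopsis, title_in)
loss = criterion(logits.reshape(-1, TGT_VOCAB), title_gold.reshape(-1))
loss.backward()
print(float(loss))
```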
Procedia PDF Downloads 327
2325 Comparison of the Factor of Safety and Strength Reduction Factor Values from Slope Stability Analysis of a Large Open Pit
Authors: James Killian, Sarah Cox
Abstract:
The use of stability criteria within geotechnical engineering is the way the results of analyses are conveyed and sensitivities and risk assessments are performed. Historically, the primary stability criterion for slope design has been the Factor of Safety (FOS) coming from a limit equilibrium calculation. Increasingly, the value derived from Strength Reduction Factor (SRF) analysis is being used as the criterion for stability analysis. The purpose of this work was to study in detail the relationship between SRF values produced from a numerical modeling technique and the traditional FOS values produced from Limit Equilibrium Method (LEM) analyses. This study utilized a model of a 3000-foot-high slope with a 45-degree slope angle, assuming a perfectly plastic Mohr-Coulomb constitutive model with high cohesion and friction angle values typical of a large hard rock mine slope. A number of variables affecting the value of the SRF in a numerical analysis were tested, including zone size, in-situ stress, tensile strength, and dilation angle. This paper demonstrates that in most cases, SRF values are lower than the corresponding LEM FOS values. Modeled zone size has the greatest effect on the estimated SRF value, which can vary by as much as 15% to the downside compared to the FOS. For consistency when using the SRF as a stability criterion, the authors suggest that numerical model zone sizes should not be constructed to be smaller than about 1% of the overall problem slope height and shouldn't be greater than 2%. Future work could include investigations of the effect of anisotropic strength assumptions or advanced constitutive models.
Keywords: FOS, SRF, LEM, comparison
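For context on the limit-equilibrium FOS referred to above, the sketch below evaluates the classical planar-failure Factor of Safety for a dry slope; the geometry and strength values are arbitrary assumptions and the formula is a textbook simplification, not the LEM or SRF models used in the paper.

```python
import math

def planar_fos(c, phi_deg, H, psi_deg, beta_deg, gamma):
    """Limit-equilibrium Factor of Safety for a dry planar failure surface.

    FOS = (c*A + W*cos(psi)*tan(phi)) / (W*sin(psi)), per unit slope width,
    for a slope of height H and face angle beta with a failure plane at psi
    daylighting at the toe (horizontal upper surface assumed). Inputs are
    illustrative, not the mine-slope model used in the paper.
    """
    psi, phi, beta = map(math.radians, (psi_deg, phi_deg, beta_deg))
    A = H / math.sin(psi)                                              # failure-plane length
    W = 0.5 * gamma * H**2 * (1 / math.tan(psi) - 1 / math.tan(beta))  # sliding block weight
    return (c * A + W * math.cos(psi) * math.tan(phi)) / (W * math.sin(psi))

# c in kPa, unit weight in kN/m3, lengths in m (all assumed values)
print(planar_fos(c=200.0, phi_deg=38.0, H=914.0, psi_deg=35.0, beta_deg=45.0, gamma=26.0))
```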
Procedia PDF Downloads 313
2324 Antimicrobial Evaluation of Polyphenon 60 and Ciprofloxacin Loaded Nano Emulsion against Uropathogenic Escherichia coli Bacteria and Its in vivo Analysis
Authors: Atinderpal Kaur, Shweta Dang
Abstract:
Our aim is to develop a nanoemulsion-based delivery system containing polyphenon 60 (P60) and ciprofloxacin (Cipro) for intravaginal delivery to treat urinary tract infection. In the present study, polyphenon 60 (P60) and ciprofloxacin (Cipro) were loaded in a single nanoemulsion (NE) system via an ultrasonication technique and characterized for particle size, in vitro release and antibacterial efficacy against uropathogenic Escherichia coli bacteria. To determine the in vivo pharmacokinetic parameters and the intravaginal transportation of the NE, a gamma scintigraphy and biodistribution study was conducted by radiolabelling the NE with technetium pertechnetate (99mTc). The preliminary antibacterial investigation showed synergy between these compounds, with an FIC index of 0.42. The developed formulation showed a zeta potential of +55.3 and a particle size of 151.7 nm, with a PDI of 0.196. The in vitro release percentage of P60 at the end of the 7th hour was 94.8 ± 0.9%, whereas the release of Cipro was 75.1 ± 0.15% in simulated vaginal media. The MBC was identified, and the findings demonstrated that in both ESBL (extended spectrum β-lactamase) and MBL (metallo β-lactamase) cultures, the P60+Cipro NE inhibited the growth of all the isolates at 2 mg/ml dilutions. The percentage per gram of radiolabelled drug was found to be (3.50 ± 0.26) and (3.81 ± 0.30) in the kidney and urinary bladder, respectively, at 3 h. From the findings, it was concluded that the developed P60+Cipro NE was transported efficiently throughout the target organs and had a long duration of action and high biocompatibility via intravaginal administration as compared to oral administration.
Keywords: ciprofloxacin, gamma scintigraphy, intravaginal drug delivery, Polyphenon 60
Procedia PDF Downloads 322
2323 Bartlett Factor Scores in Multiple Linear Regression Equation as a Tool for Estimating Economic Traits in Broilers
Authors: Oluwatosin M. A. Jesuyon
Abstract:
In order to propose a simpler tool that eliminates the age-long problems associated with the traditional index method for the selection of multiple traits in broilers, the Bartlett factor regression equation is proposed as an alternative selection tool. 100 day-old chicks each of the Arbor Acres (AA) and Annak (AN) broiler strains were obtained from two rival hatcheries in Ibadan, Nigeria. These were raised in a deep litter system in a 56-day feeding trial at the University of Ibadan Teaching and Research Farm, located in South-west Tropical Nigeria. The body weight and body dimensions were measured and recorded during the trial period. Eight (8) zoometric measurements, namely live weight (g), abdominal circumference, abdominal length, breast width, leg length, height, wing length and thigh circumference (all in cm), were recorded randomly from 20 birds within each strain, at a fixed time on the first day of each new week, with a 5-kg capacity Camry scale. These records were analyzed and compared using the completely randomized design (CRD) in the SPSS analytical software, with the means procedure and Factor Scores (FS) in a stepwise Multiple Linear Regression (MLR) procedure for the initial live weight equations. Bartlett Factor Score (BFS) analysis extracted 2 factors for each strain, termed the Body-length and Thigh-meatiness Factors for AA, and the Breast Size and Height Factors for AN. These derived orthogonal factors assisted in deducing and comparing the trait combinations that best describe body conformation and meatiness in the experimental broilers. The BFS procedure yielded different body conformational traits for the two strains, thus indicating the different economic traits and advantages of the strains. These factors could be useful as selection criteria for improving desired economic traits. The final Bartlett Factor Regression equations for the prediction of body weight were highly significant with P < 0.0001, R2 of 0.92 and above, VIF of 1.00, and DW of 1.90 and 1.47 for Arbor Acres and Annak, respectively. These FSR equations could be used as a simple and potent tool for selection during poultry flock improvement; they could also be used to estimate the selection index of flocks in order to discriminate between strains and to evaluate consumer preference traits in broilers.
Keywords: alternative selection tool, Bartlett factor regression model, consumer preference trait, linear and body measurements, live body weight
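The Bartlett factor-score computation underlying the proposed selection tool can be sketched as follows; the zoometric data are simulated, the two-factor model and the weighted-least-squares score formula F = (L'Ψ⁻¹L)⁻¹L'Ψ⁻¹z are shown generically, and none of this reproduces the study's SPSS output.

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)

# Simulated zoometric measurements for 20 birds x 7 body dimensions (illustrative only)
X = rng.normal(size=(20, 7)) @ rng.normal(size=(7, 7)) + rng.normal(size=(20, 7))
live_weight = 1500 + X[:, 0] * 40 + X[:, 3] * 25 + rng.normal(0, 30, 20)

# Standardise, fit a 2-factor model, then compute Bartlett (weighted least squares) scores:
# F = (L' Psi^-1 L)^-1 L' Psi^-1 z
Z = (X - X.mean(axis=0)) / X.std(axis=0)
fa = FactorAnalysis(n_components=2).fit(Z)
L = fa.components_.T                      # loadings, features x factors
Psi_inv = np.diag(1.0 / fa.noise_variance_)
bartlett = Z @ Psi_inv @ L @ np.linalg.inv(L.T @ Psi_inv @ L)

# Use the Bartlett factor scores as predictors of live body weight
reg = LinearRegression().fit(bartlett, live_weight)
print("R^2 =", reg.score(bartlett, live_weight))
print("coefficients:", reg.coef_)
```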
Procedia PDF Downloads 204
2322 Calcium ion Cross-linked HEC/SA/HA hydrogel: Fabrication, Characterization and Wound Healing Applications
Authors: Fathima Shahitha, Alqasim Al-Mamari, Mohammed Al-Sibani, Ahmed Al Harrasi
Abstract:
The aim of this study is to prepare a novel antibacterial wound healing hydrogel based on hydroxyethyl cellulose/sodium alginate/hyaluronic acid (HEC/SA/HA) and Ag nanoparticles, cross-linked via Ca2+ ions and obtained using green chemistry, that helps the wound to heal faster and better. The properties and structure of the hydrogel were tested, including swelling ratio, in vitro degradation, antibacterial and antifungal activity and wound healing tests. It was also characterized via UV-Vis, FTIR, TEM and TGA after it was fabricated by a freeze-drying technique. The characteristic peak of the UV-Vis spectra revealed the formation of AgNPs in the compound at 411 nm. The FTIR curves showed new peaks that confirmed the oxidation of HEC and also showed the chemical interaction of the three polymers with the AgNPs and Ca2+. The TEM images presented monodispersed AgNPs with sizes ranging from 8.2 to 32 nm. The results from these studies showed that the hydrogel has an excellent performance in swelling ratio and in vitro degradation. Furthermore, the wound healing activity of the hydrogel was examined by measuring wound closure, and the second group, treated with the hydrogel, revealed significant healing activity compared to the control group. The hydrogel activity against bacteria and fungi was also measured for 72 h, and the results showed excellent performance. These results suggest that the cross-linked hydrogel based on (HEC/HA/SA) with AgNPs might be a promising dressing for wounds.
Keywords: hydrogels, wound healing, hydroxyethyl cellulose, sodium alginate, Ca2+ cross-linking, hyaluronic acid
Procedia PDF Downloads 14
2321 Simulation of the Collimator Plug Design for Prompt-Gamma Activation Analysis in the IEA-R1 Nuclear Reactor
Authors: Carlos G. Santos, Frederico A. Genezini, A. P. Dos Santos, H. Yorivaz, P. T. D. Siqueira
Abstract:
Prompt-Gamma Activation Analysis (PGAA) is a valuable technique for investigating the elemental composition of various samples. However, the installation of a PGAA system entails specific conditions, such as filtering the neutron beam according to the target and providing adequate shielding for both users and detectors. These requirements incur substantial costs, exceeding $100,000 including manpower. Nevertheless, a cost-effective approach involves leveraging an existing neutron beam facility to create a hybrid system integrating PGAA and Neutron Tomography (NT). The IEA-R1 nuclear reactor at IPEN/USP possesses an NT facility with suitable conditions for adapting and implementing a PGAA device. The NT facility offers a slightly colder thermal flux and provides shielding for user protection. The key additional requirement involves designing detector shielding to mitigate the high gamma-ray background and safeguard the HPGe detector from neutron-induced damage. This study employs Monte Carlo simulations with the MCNP6 code to optimize the collimator plug for PGAA within the IEA-R1 NT facility. Three collimator models are proposed and simulated to assess their effectiveness in shielding the gamma and neutron radiation from nuclear fission. The aim is to achieve a focused prompt-gamma signal while shielding against ambient gamma radiation. The simulation results indicate that one of the proposed designs is particularly suitable for the PGAA-NT hybrid system.
Keywords: MCNP6.1, neutron, prompt-gamma ray, prompt-gamma activation analysis
Procedia PDF Downloads 80
2320 Automatic Classification of Lung Diseases from CT Images
Authors: Abobaker Mohammed Qasem Farhan, Shangming Yang, Mohammed Al-Nehari
Abstract:
Pneumonia is a kind of lung disease that creates congestion in the chest, and severe congestion in such pneumonic conditions can lead to loss of life. Pneumonic lung disease may be viral pneumonia, bacterial pneumonia, or Covid-19 induced pneumonia. The early prediction and classification of such lung diseases help to reduce the mortality rate. We propose an automatic Computer-Aided Diagnosis (CAD) system in this paper using a deep learning approach. The proposed CAD system takes as input raw computerized tomography (CT) scans of the patient's chest and automatically predicts the disease classification. We designed a Hybrid Deep Learning Algorithm (HDLA) to improve accuracy and reduce processing requirements. The raw CT scans are pre-processed first to enhance their quality for further analysis. We then applied a hybrid model that consists of automatic feature extraction and classification. We propose a robust 2D Convolutional Neural Network (CNN) model to extract the automatic features from the pre-processed CT image. This CNN model ensures feature learning with extremely effective 1D feature extraction for each input CT image. The outcome of the 2D CNN model is then normalized using the Min-Max technique. The second step of the proposed hybrid model is related to training and classification using different classifiers. The simulation outcomes using the publicly available dataset prove the robustness and efficiency of the proposed model compared to state-of-the-art algorithms.
Keywords: CT scan, Covid-19, deep learning, image processing, lung disease classification
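A compact sketch of the 2D CNN feature-extraction stage described above is given below in PyTorch; the layer sizes, input resolution and three-class setup are assumptions for illustration, not the HDLA configuration of the paper.

```python
import torch
import torch.nn as nn

# Compact 2D CNN sketch for 3-class CT classification (e.g. normal / bacterial-viral
# pneumonia / Covid-19 pneumonia). Architecture and sizes are illustrative only.
class CTClassifier(nn.Module):
    def __init__(self, n_classes=3):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(64, n_classes)

    def forward(self, x):                       # x: (batch, 1, H, W) pre-processed CT slice
        feats = self.features(x).flatten(1)     # 1D feature vector per image
        return self.classifier(feats)

model = CTClassifier()
dummy_batch = torch.randn(8, 1, 224, 224)       # pre-processed CT slices
logits = model(dummy_batch)
print(logits.shape)                             # torch.Size([8, 3])
```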
Procedia PDF Downloads 161
2319 Numerical Investigation of Pressure Drop and Erosion Wear by Computational Fluid Dynamics Simulation
Authors: Praveen Kumar, Nitin Kumar, Hemant Kumar
Abstract:
The modernization of computer technology and commercial computational fluid dynamics (CFD) simulation has given better detailed results as compared to experimental investigation techniques. CFD techniques are widely used in different fields due to their flexibility and performance. The evaluation of pipeline erosion is a complex phenomenon to solve by numerical arithmetic techniques, whereas CFD simulation is an easy tool to resolve that type of problem. The erosion wear behaviour due to a solid-liquid mixture in a slurry pipeline has been investigated using the commercial CFD code FLUENT. A multi-phase Euler-Lagrange model was adopted to predict the solid particle erosion wear in a 22.5° pipe bend for the flow of a bottom ash-water suspension. The present study addresses erosion prediction in a three-dimensional 22.5° pipe bend for two-phase (solid and liquid) flow using the finite volume method with the standard k-ε turbulence and discrete phase models, and evaluates the erosion wear rate with velocities varying from 2-4 m/s. The results show that the velocity of the solid-liquid mixture was found to be the most dominating parameter as compared to solid concentration, density, and particle size. At low velocity, settling takes place in the pipe bend due to the low inertia and the gravitational effect on the solid particulate, which leads to high erosion at the bottom side of the pipeline.
Keywords: computational fluid dynamics (CFD), erosion, slurry transportation, k-ε model
Procedia PDF Downloads 410
2318 Computational Fluid Dynamics (CFD) Simulation Approach for Developing New Powder Dispensing Device
Authors: Revanth Rallapalli
Abstract:
Manually dispensing solids and powders can be difficult, as it requires gradually pouring and checking the amount to be dispensed on the scale. Current systems are manual and non-continuous in nature, are user-dependent, and make powder dispensation difficult to control. Recurrent dosing of powdered medicines in precise amounts quickly and accurately has been an all-time challenge. Various new powder dispensing mechanisms are being designed to overcome these challenges. A battery-operated screw conveyor mechanism is being developed to overcome the problems described above. These inventions are numerically evaluated at the concept development level by employing Computational Fluid Dynamics (CFD) of gas-solids multiphase flow systems. CFD has been very helpful in the development of such devices, saving time and money by reducing the number of prototypes and tests. Furthermore, this paper describes a simulation of powder dispensation from the trocar's end, in which the powder, considered as a secondary flow in air, is simulated by using the technique called the Dense Discrete Phase Model incorporated with the Kinetic Theory of Granular Flow (DDPM-KTGF). By considering a powder volume fraction of 50%, the transportation of powder from the inlet side to the trocar's end side is achieved by rotation of the screw conveyor. The performance is thus calculated for a 1-second time frame in an unsteady computation manner. This methodology will help designers in developing design concepts to improve dispensation and the effective area within a quick turnaround time frame.
Keywords: DDPM-KTGF, gas-solids multiphase flow, screw conveyor, unsteady
Procedia PDF Downloads 183
2317 Investigating Spatial Disparities in Health Status and Access to Health-Related Interventions among Tribals in Jharkhand
Authors: Parul Suraia, Harshit Sosan Lakra
Abstract:
Indigenous communities represent some of the most marginalized populations globally, with those in India, labeled as tribals, experiencing particularly pronounced marginalization and a concerning decline in their numbers. These communities often inhabit geographically challenging regions characterized by low population densities, posing significant challenges to the provision of essential infrastructure services. Jharkhand, a Schedule 5 state, is infamous for its low health status due to disparities in access to health care. The primary objective of this study is to investigate the spatial inequalities in healthcare accessibility among tribal populations within the state and pinpoint critical areas requiring immediate attention. Health indicators were selected based on the tribal perspective and the association of Sustainable Development Goal 3 (Good Health and Wellbeing) with other SDGs. Focus group discussions with tribal people and tribal experts were conducted in order to finalize the indicators. Employing Principal Component Analysis, two essential indices were constructed: the Tribal Health Index (THI) and the Tribal Health Intervention Index (THII). Index values were calculated based on district-wise secondary data for Jharkhand. The bivariate spatial association technique, Moran's I, was used to assess the spatial pattern of the variables to determine if there is any clustering (positive spatial autocorrelation) or dispersion (negative spatial autocorrelation) of values across Jharkhand. The results help in targeting policy interventions in deprived areas of Jharkhand.
Keywords: tribal health, health spatial disparities, health status, Jharkhand
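The global Moran's I statistic referred to above can be sketched as follows for a toy set of districts; the weights matrix and index values are assumptions, and the paper's bivariate variant (associating THI with the spatial lag of THII) is not reproduced here.

```python
import numpy as np

def morans_i(values, weights):
    """Global Moran's I for a set of areal units (e.g. districts of Jharkhand).

    values: attribute per district (e.g. a Tribal Health Index value)
    weights: spatial weights matrix W (entry i,j > 0 if districts i and j are neighbours)
    """
    x = np.asarray(values, float)
    w = np.asarray(weights, float)
    n = len(x)
    z = x - x.mean()
    num = np.sum(w * np.outer(z, z))
    return (n / w.sum()) * (num / np.sum(z ** 2))

# Toy example: 4 districts in a row, rook contiguity, with illustrative index values
W = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], float)
thi = [0.42, 0.45, 0.71, 0.74]
print(morans_i(thi, W))   # a positive value indicates clustering of similar THI values
```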
Procedia PDF Downloads 99
2316 Automated Video Surveillance System for Detection of Suspicious Activities during Academic Offline Examination
Authors: G. Sandhya Devi, G. Suvarna Kumar, S. Chandini
Abstract:
This research work aims to develop a system that will analyze and identify students who indulge in malpractices/suspicious activities during the course of an academic offline examination. Automated video surveillance provides an optimal solution which helps in monitoring the students and identifying a malpractice event immediately. This work is organized into three modules. The first module deals with performing an impersonation check using a PCA-based face recognition method, which is done by cross-checking the student's profile with the database. The presence or absence of the student is also determined in this module by implementing an image registration technique wherein a grid is formed by considering all the images registered using the frontal camera at the determined positions. The second module detects facial malpractices, in which a student gets involved in conversation with another, tries to obtain unauthorized information, etc., based on a threshold range evaluated by considering whether his/her mouth state is open or closed. The third module deals with the identification of unauthorized material or gadgets used in the examination hall by training positive samples of the object through various stages. Here, a top-view camera feed is analyzed to detect the suspicious activities. The system automatically alerts the administration when any suspicious activities are identified, thereby reducing the error rate caused by manual monitoring. This work is an improvement over our previous published work on identifying suspicious activities by examinees in an offline examination.
Keywords: impersonation, image registration, incrimination, object detection, threshold evaluation
Procedia PDF Downloads 231
2315 Considering Uncertainties of Input Parameters on Energy, Environmental Impacts and Life Cycle Costing by Monte Carlo Simulation in the Decision Making Process
Authors: Johannes Gantner, Michael Held, Matthias Fischer
Abstract:
The refurbishment of the building stock in terms of energy supply and efficiency is one of the major challenges of the German turnaround in energy policy. As the building sector accounts for 40% of Germany's total energy demand, additional insulation is key for energy efficient refurbished buildings. Nevertheless, beyond the energetic benefits, the environmental and economic performance of insulation materials is often questioned. The methods of Life Cycle Assessment (LCA) and Life Cycle Costing (LCC) can form the standardized basis for answering these doubts and are becoming more and more important for material producers due to efforts such as the Product Environmental Footprint (PEF) or Environmental Product Declarations (EPD). Due to the increasing use of LCA and LCC information for decision support, the robustness and resilience of the results become crucial, especially for the support of decision and policy makers. LCA and LCC results are based on respective models which depend on technical parameters like efficiencies, material and energy demand, product output, etc. Nevertheless, the influence of parameter uncertainties on life cycle results is usually not considered or only studied superficially. However, the effect of parameter uncertainties cannot be neglected: based on the example of an exterior wall, the overall life cycle results vary by a factor of more than three. As a result, the simple best case / worst case analyses used in practice are not sufficient. These analyses allow a first rough view on the results but do not take effects such as error propagation into account. Thereby, LCA practitioners cannot provide further guidance for decision makers. Probabilistic analyses enable LCA practitioners to gain a deeper understanding of the LCA and LCC results and provide better decision support. Within this study, the environmental and economic impacts of an exterior wall system over its whole life cycle are illustrated, and the effect of different uncertainty analyses on the interpretation in terms of resilience and robustness is shown. Hereby, the approaches of error propagation and Monte Carlo simulation are applied and combined with statistical methods in order to allow for a deeper understanding and interpretation. All in all, this study emphasizes the need for a deeper and more detailed probabilistic evaluation based on statistical methods. Only by this can misleading interpretations be avoided and the results be used for resilient and robust decisions.
Keywords: uncertainty, life cycle assessment, life cycle costing, Monte Carlo simulation
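A minimal sketch of Monte Carlo propagation of parameter uncertainty through a simplified life-cycle model of an insulated wall is given below; all distributions, factors and the impact model itself are assumptions for illustration, not the study's LCA/LCC data.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 20_000

# Illustrative parameters for 1 m2 of an insulated exterior wall over its life cycle
# (all distributions and factors are assumptions, not the study's data)
insulation_thickness = rng.normal(0.16, 0.02, n)            # m
lambda_ = rng.normal(0.035, 0.003, n)                        # W/(m K) thermal conductivity
heating_demand_factor = rng.normal(90, 10, n)                # kWh per (W/K) over service life
emission_factor = rng.normal(0.25, 0.03, n)                  # kg CO2e per kWh heat
production_impact = rng.normal(4.0, 0.8, n) * insulation_thickness / 0.16  # kg CO2e per m2

u_value = lambda_ / insulation_thickness                     # simplified U-value of the layer
use_phase_impact = u_value * heating_demand_factor * emission_factor
total_impact = production_impact + use_phase_impact          # kg CO2e per m2, life cycle

p5, p50, p95 = np.percentile(total_impact, [5, 50, 95])
print(f"GWP per m2: median {p50:.1f}, 90% interval [{p5:.1f}, {p95:.1f}] kg CO2e")
```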
Procedia PDF Downloads 287
2314 Deployment of Beyond 4G Wireless Communication Networks with Carrier Aggregation
Authors: Bahram Khan, Anderson Rocha Ramos, Rui R. Paulo, Fernando J. Velez
Abstract:
With the growing demand for a new blend of applications, users' dependency on the internet is increasing day by day. Mobile internet users are giving more attention to their own experiences, especially in terms of communication reliability, high data rates and service stability on the move. This increase in demand is causing saturation of the existing radio frequency bands. To address these challenges, researchers are investigating the best approaches. Carrier Aggregation (CA) is one of the newest innovations which seems to fulfil the demands of the future spectrum, and CA is also one of the most important features of Long Term Evolution - Advanced (LTE-Advanced). To reach the upcoming International Mobile Telecommunication Advanced (IMT-Advanced) mobile requirements (1 Gb/s peak data rate), the CA scheme was presented by 3GPP, which can sustain a high data rate using widespread frequency bandwidth up to 100 MHz. Technical issues such as the aggregation structure, its implementations, deployment scenarios, control signal techniques, and challenges for the CA technique in LTE-Advanced, with consideration of backward compatibility, are highlighted in this paper. Also, a performance evaluation in macro-cellular scenarios through a simulation approach is presented, which shows the benefits of applying CA and low-complexity multi-band schedulers in terms of service quality and system capacity enhancement, and it is concluded that the enhanced multi-band scheduler is less complex than the general multi-band scheduler and performs better for a cell radius longer than 1800 m (and a PLR threshold of 2%).
Keywords: component carrier, carrier aggregation, LTE-advanced, scheduling
Procedia PDF Downloads 200
2313 Synthetic Classicism: A Machine Learning Approach to the Recognition and Design of Circular Pavilions
Authors: Federico Garrido, Mostafa El Hayani, Ahmed Shams
Abstract:
The exploration of the potential of artificial intelligence (AI) in architecture is still embryonic; however, its latent capacity to change design disciplines is significant. 'Synthetic Classicism' is a research project that questions the underlying aspects of classically organized architecture not just in aesthetic terms but also from a geometrical and morphological point of view, intending to generate new architectural information using historical examples as source material. The main aim of this paper is to explore the uses of artificial intelligence and machine learning algorithms in architectural design while creating a coherent narrative to be contained within a design process. The purpose is twofold: on one hand, to develop and train machine learning algorithms to produce architectural information of small pavilions and, on the other, to synthesize new information from previous architectural drawings. These algorithms intend to 'interpret' graphical information from each pavilion and then generate new information from it. The procedure, once these algorithms are trained, is the following: starting from a line profile, a synthetic 'front view' of a pavilion is generated; then, using it as source material, an isometric view is created from it, and finally a top view is produced. Thanks to GAN algorithms, it is also possible to generate front and isometric views without any graphical input. The final intention of the research is to produce isometric views out of historical information, such as the pavilions of Sebastiano Serlio, James Gibbs, or John Soane. The idea is to create and interpret new information not just in terms of historical reconstruction but also to explore AI as a novel tool in the narrative of a creative design process. This research also challenges the idea of the role of algorithmic design associated with efficiency or fitness, while embracing the possibility of a creative collaboration between artificial intelligence and a human designer. Hence the double feature of this research, both analytical and creative: first by synthesizing images based on a given dataset and then by generating new architectural information from historical references. We find that the possibility of creatively understanding and manipulating historic (and synthetic) information will be a key feature in future innovative design processes. Finally, the main question that we propose is whether an AI could be used not just to create an original and innovative group of simple buildings but also to explore the possibility of fostering a novel architectural sensibility grounded in the specificities of the architectural dataset, either historic, human-made or synthetic.
Keywords: architecture, central pavilions, classicism, machine learning
Procedia PDF Downloads 143
2312 Localization of Frontal and Temporal Speech Areas in Brain Tumor Patients by Their Structural Connections with Probabilistic Tractography
Authors: B. Shukir, H. Woo, P. Barzo, D. Kis
Abstract:
Preoperative brain mapping in tumors involving the speech areas plays an important role in reducing surgical risks. Functional magnetic resonance imaging (fMRI) is the gold standard method to localize cortical speech areas preoperatively, but its availability in clinical routine is limited. Diffusion MRI based probabilistic tractography is available from head MRI and is used to segment cortical subregions by their structural connectivity. In our study, we used probabilistic tractography to localize the frontal and temporal cortical speech areas. 15 patients with left frontal tumors were enrolled in our study. Speech fMRI and diffusion MRI were acquired preoperatively. The standard automated anatomical labelling atlas 3 (AAL3) cortical atlas was used to define 76 left frontal and 118 left temporal potential speech areas. 4 types of tractography were run according to the structural connection of these regions to the left arcuate fascicle (FA) to localize those cortical areas which have speech functions: 1, frontal through FA; 2, frontal with FA; 3, temporal to FA; 4, temporal with FA connections were determined. Thresholds of 1%, 5%, 10% and 15% were applied. At each level, the number of frontal and temporal regions identified by fMRI and tractography was determined, and the sensitivity and specificity were calculated. The 1% threshold level showed the best results: sensitivity was 61.6±31.4% and 67.15±23.12%, and specificity was 87.2±10.4% and 75.6±11.37% for the frontal and temporal regions, respectively. From our study, we conclude that probabilistic tractography is a reliable preoperative technique to localize cortical speech areas. However, its results are not such that the neurosurgeon can rely on them alone during the operation.
Keywords: brain mapping, brain tumor, fMRI, probabilistic tractography
Procedia PDF Downloads 167
2311 Nanoimprinted-Block Copolymer-Based Porous Nanocone Substrate for SERS Enhancement
Authors: Yunha Ryu, Kyoungsik Kim
Abstract:
Raman spectroscopy is one of the most powerful techniques for chemical detection, but the low sensitivity originating from the extremely small cross-section of Raman scattering limits its practical use. To overcome this problem, Surface Enhanced Raman Scattering (SERS) has been intensively studied for several decades. Because the SERS effect is mainly induced by strong electromagnetic near-field enhancement as a result of the localized surface plasmon resonance of metallic nanostructures, it is important to design plasmonic structures with a high density of electromagnetic hot spots for a SERS substrate. One of the useful fabrication methods is using a porous nanomaterial as a template for the metallic structure. Internal pores on a scale of tens of nanometers can be strong EM hotspots by confining the incident light. Also, porous structures can capture more target molecules than non-porous structures in the same detection spot thanks to the large surface area. Herein we report a facile fabrication method for a porous SERS substrate by integrating solvent-assisted nanoimprint lithography and selective etching of a block copolymer. We obtained nanostructures with high porosity via simple selective etching of one microdomain of the diblock copolymer. Furthermore, we imprinted the nanocone patterns into the spin-coated flat block copolymer film to make a three-dimensional SERS substrate with a high density of SERS hot spots as well as a large surface area. We used solvent-assisted nanoimprint lithography (SAIL) to reduce the fabrication time and cost for patterning the BCP film by taking advantage of a solvent which dissolves both the polystyrene and poly(methyl methacrylate) domains of the block copolymer, and thus the block copolymer film was molded under low temperature and atmospheric pressure in a short time. After Ag deposition, we measured the Raman intensity of dye molecules adsorbed on the fabricated structure. Compared to the Raman signals of the Ag-coated solid nanocones, the porous nanocones showed 10 times higher Raman intensity at the 1510 cm-1 band. In conclusion, we fabricated porous metallic nanocone arrays with high-density electromagnetic hotspots by templating a nanoimprinted diblock copolymer with selective etching and demonstrated its capability as an effective SERS substrate.
Keywords: block copolymer, porous nanostructure, solvent-assisted nanoimprint, surface-enhanced Raman spectroscopy
Procedia PDF Downloads 6272310 The Relationship between Operating Condition and Sludge Wasting of an Aerobic Suspension-Sequencing Batch Reactor (ASSBR) Treating Phenolic Wastewater
Authors: Ali Alattabi, Clare Harris, Rafid Alkhaddar, Ali Alzeyadi
Abstract:
Petroleum refinery wastewater (PRW) can be considered one of the most significant sources of aquatic environmental pollution. It consists of oil and grease along with many other toxic organic pollutants. In recent years, new techniques using different types of membranes and sequencing batch reactors (SBRs) have been implemented to treat PRW. An SBR is a fill-and-draw activated sludge system which operates in time instead of space. Many researchers have optimised SBR operating conditions to obtain maximum removal of undesired wastewater pollutants. The SBR has gained importance mainly because of its essential flexibility in cycle time: it can handle shock loads, requires less area for operation and is easy to operate. However, bulking sludge, or the discharge of floating or settled sludge during the draw or decant phase, is still a problem with some SBR configurations. The main aim of this study is to develop an innovative SBR design, optimising the process variables to result in a more robust and efficient process. Several experimental tests will be carried out to determine the removal percentages of chemical oxygen demand (COD), phenol and nitrogen compounds from synthetic PRW. Furthermore, the dissolved oxygen (DO), pH and oxidation-reduction potential (ORP) of the SBR system will be monitored online to ensure a good environment for the microorganisms to biodegrade the organic matter effectively.Keywords: petroleum refinery wastewater, sequencing batch reactor, hydraulic retention time, phenol, COD, mixed liquor suspended solids (MLSS)
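The removal percentages mentioned above follow from a simple mass-balance ratio; the sketch below shows the calculation with hypothetical influent and effluent concentrations, since the abstract does not report measured values.

```python
# Minimal sketch: removal percentage of COD and phenol from influent/effluent
# concentrations (mg/L). The numbers below are hypothetical, not study data.

def removal_percent(influent, effluent):
    """Removal (%) = (C_in - C_out) / C_in * 100."""
    return (influent - effluent) / influent * 100.0

samples = {"COD": (1200.0, 180.0), "phenol": (150.0, 12.0)}  # (influent, effluent)
for name, (c_in, c_out) in samples.items():
    print(f"{name}: {removal_percent(c_in, c_out):.1f}% removal")
```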
Procedia PDF Downloads 2652309 Optimization of Personnel Selection Problems via Unconstrained Geometric Programming
Authors: Vildan Kistik, Tuncay Can
Abstract:
From a business perspective, cost and profit are two key factors. The intent of most businesses is to minimize cost in order to maximize, or at least balance, profit, so as to provide the greatest benefit to themselves. However, the real system is very complicated because of technological constraints, increasingly competitive environments and similar factors. In such a system it is not easy to maximize profits or to minimize costs. Businesses must decide on the competence and qualifications of the personnel to be recruited, taking many criteria into consideration. There are many criteria for determining the competence of a staff member: the level of education, experience, psychological and sociological profile, and human relationships in the field are just some of the important factors in selecting staff for a firm. Personnel selection is a very important and costly process for businesses in today's competitive market. Although many mathematical methods have been developed for personnel selection, their use is unfortunately rarely encountered in real life. In this study, unlike other methods, an exponential programming model was established based on the probability of failure once the selected personnel start work. With the necessary transformations, the problem was converted into an unconstrained geometric programming problem, and the personnel selection problem was approached with the geometric programming technique. Personnel selection scenarios for a classroom were established with the help of the normal distribution, and optimum solutions were obtained. In the most appropriate solutions, the personnel selection process for the classroom was achieved at minimum cost.Keywords: geometric programming, personnel selection, non-linear programming, operations research
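The abstract does not state the cost model itself, so the following is only a generic sketch, under assumed posynomial coefficients, of how an unconstrained geometric program can be minimized after the standard log-variable substitution that makes the objective convex.

```python
# Minimal sketch: minimizing a posynomial cost f(x) = sum_k c_k * prod_i x_i^a_ki
# (an unconstrained geometric program) via the substitution x_i = exp(y_i),
# which turns the objective into a convex function of y. The coefficients below
# are hypothetical, not the cost model used in the study.
import numpy as np
from scipy.optimize import minimize

c = np.array([40.0, 20.0, 10.0])        # positive posynomial coefficients
A = np.array([[ 1.0, -0.5],             # exponents a_ki for two decision variables
              [-1.0,  1.0],
              [ 0.5,  0.5]])

def objective(y):
    # posynomial in log variables: sum_k exp(log c_k + A_k . y)
    return np.sum(np.exp(np.log(c) + A @ y))

res = minimize(objective, x0=np.zeros(2), method="BFGS")
x_opt = np.exp(res.x)                   # back-transform to the original variables
print("optimal x:", x_opt, "minimum cost:", res.fun)
```

BFGS suffices here because the log-transformed objective is smooth and convex, so any local minimum found is the global minimum cost.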
Procedia PDF Downloads 2742308 Financial Capacity, Governance, and Corporate Engagement in Environmental Protection
Authors: Lubica Hikkerova, Jean-Michel Sahut
Abstract:
Environmental protection remains a global challenge, but since 2012 there has been a progressive decline in corporate engagement in environmental protection issues. This study investigates the role of financial capacity and governance in improving the level of environmental engagement of companies. The regression technique is applied to data on 351 large European companies from the ASSET4-ESG database for the 2007-2015 period. Firstly, the results show that the companies in the sample are fairly engaged in environmental protection, with a strong dispersion representing nearly four times the average; the companies therefore do not share the same level of engagement in environmental protection, some being more committed than others. Secondly, the results reveal that the financial capacity of the company, as assessed through its indicators, has a significant effect on its level of environmental protection engagement in the present sample. This effect is more positive the higher the profits the company makes, and more negative the more heavily indebted the company is or the higher the dividends it pays per share. Lastly, the results also show that better-quality governance plays an important role in the decision to undertake actions leading to environmental protection. More specifically, the degree of management involvement in the running of the business, the respect of shareholders' rights, the effectiveness of the control exerted by the board of directors and, to a lesser extent, the independence of the audit committee are variables which have a positive and significant influence on the level of environmental engagement of companies.Keywords: financial capacity, corporate governance, environmental engagement, stakeholder theory, theory of organizational legitimacy, theory of resources and capabilities
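As a sketch of the kind of regression described above, the snippet below fits an OLS model of an environmental engagement score on financial-capacity and governance indicators; the data are synthetic and the variable names are illustrative assumptions, not ASSET4-ESG field names.

```python
# Minimal sketch (synthetic data): regressing an environmental engagement score on
# financial-capacity and governance indicators, in the spirit of the study's model.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 351
df = pd.DataFrame({
    "profit": rng.normal(0.08, 0.05, n),           # profitability proxy
    "leverage": rng.normal(0.5, 0.2, n),           # debt ratio
    "dividend_per_share": rng.normal(1.0, 0.5, n),
    "board_effectiveness": rng.normal(60, 15, n),  # governance scores (0-100)
    "shareholder_rights": rng.normal(55, 15, n),
})
# Synthetic engagement score whose signs mirror the reported findings
df["env_engagement"] = (50 + 120 * df["profit"] - 15 * df["leverage"]
                        - 5 * df["dividend_per_share"]
                        + 0.3 * df["board_effectiveness"]
                        + 0.2 * df["shareholder_rights"]
                        + rng.normal(0, 5, n))

X = sm.add_constant(df.drop(columns="env_engagement"))
model = sm.OLS(df["env_engagement"], X).fit()
print(model.summary())
```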
Procedia PDF Downloads 189