Search results for: applied Buddhism
2202 Technology Futures in Global Militaries: A Forecasting Method Using Abstraction Hierarchies
Authors: Mark Andrew
Abstract:
Geopolitical tensions are at a thirty-year high, and the pace of technological innovation is driving asymmetry in force capabilities between nation states and between non-state actors. Technology futures are a vital component of defence capability growth, and investments in technology futures need to be informed by accurate and reliable forecasts of the options for ‘systems of systems’ innovation, development, and deployment. This paper describes a method for forecasting technology futures developed through an analysis of four key systems’ development stages, namely: technology domain categorisation, scanning results examining novel systems’ signals and signs, potential system-of-systems’ implications in warfare theatres, and political ramifications in terms of funding and development priorities. The method has been applied to several technology domains, including physical systems (e.g., nano weapons, loitering munitions, inflight charging, and hypersonic missiles), biological systems (e.g., molecular virus weaponry, genetic engineering, brain-computer interfaces, and trans-human augmentation), and information systems (e.g., sensor technologies supporting situation awareness, cyber-driven social attacks, and goal-specification challenges to proliferation and alliance testing). Although the current application of the method has been team-centred, using paper-based rapid prototyping and iteration, the application of autonomous language models (such as GPT-3) is anticipated as a next-stage operating platform. Forecasting accuracy and reliability are considered vital in guiding technology development to afford stronger contingencies, as ideological changes are forecast to expand threats to ecology and earth systems, possibly eclipsing the traditional vulnerabilities of nation states. The early results from the method will be subjected to ground truthing using longitudinal investigation.
Keywords: forecasting, technology futures, uncertainty, complexity
Procedia PDF Downloads 115
2201 The Relationship between Spindle Sound and Tool Performance in Turning
Authors: N. Seemuang, T. McLeay, T. Slatter
Abstract:
Worn tools have a direct effect on the surface finish and part accuracy. Tool condition monitoring systems have been developed over a long period and are used to avoid the loss of productivity that results from using a worn tool. However, the majority of tool monitoring research has applied expensive sensing systems not suitable for production. In this work, the cutting sound in a turning machine was studied using a microphone. Machining trials using seven cutting conditions were conducted until the observable flank wear width (FWW) on the main cutting edge exceeded 0.4 mm. The cutting inserts were removed from the tool holder and the flank wear width was measured optically. A microphone with built-in preamplifier was used to record the machining sound of EN24 steel being face turned by a CNC lathe in a wet cutting condition using constant surface speed control. The sound was sampled at 50 kS/s, and all sound signals recorded from the microphone were transformed into the frequency domain by FFT in order to establish the frequency content in the audio signature that could then be used for tool condition monitoring. The extracted feature from the audio signal was compared to the flank wear progression on the cutting inserts. The spectrogram reveals a promising feature, named ‘spindle noise’, which emits from the main spindle motor of the turning machine. The spindle noise frequency was detected at 5.86 kHz regardless of the cutting conditions used on this particular CNC lathe. Varying the cutting speed and feed rate has an influence on the magnitude of the power spectrum of the spindle noise. The magnitude of the spindle noise frequency alters in conjunction with the tool wear progression, increasing significantly in the transition state between steady-state wear and severe wear. This could be used as a warning signal to prepare for tool replacement or to adapt cutting parameters to extend tool life.
Keywords: tool wear, flank wear, condition monitoring, spindle noise
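A minimal sketch of the frequency-domain feature described above: window one audio frame, take an FFT, and read the magnitude around the reported 5.86 kHz spindle-noise band. The frame length, band width, and synthetic test signal are illustrative assumptions, not parameters from the study.

```python
import numpy as np

def spindle_noise_magnitude(signal, fs=50_000, target_hz=5860.0, band_hz=50.0):
    """Mean FFT magnitude of an audio frame in a narrow band around the
    spindle-noise frequency (~5.86 kHz reported in the abstract)."""
    window = np.hanning(len(signal))
    spectrum = np.abs(np.fft.rfft(signal * window))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    mask = (freqs > target_hz - band_hz) & (freqs < target_hz + band_hz)
    return spectrum[mask].mean()

# Example: a synthetic 1-second frame sampled at 50 kS/s
fs = 50_000
t = np.arange(fs) / fs
frame = 0.2 * np.sin(2 * np.pi * 5860 * t) + 0.05 * np.random.randn(fs)
print(f"spindle-noise band magnitude: {spindle_noise_magnitude(frame, fs):.3f}")
```

Tracking this band magnitude over successive frames would give the wear-related trend the abstract describes; the threshold for flagging the transition to severe wear would still have to be calibrated against measured flank wear.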
Procedia PDF Downloads 338
2200 Single Ion Transport with a Single-Layer Graphene Nanopore
Authors: Vishal V. R. Nandigana, Mohammad Heiranian, Narayana R. Aluru
Abstract:
Graphene material has found tremendous applications in water desalination, DNA sequencing, and energy storage. Multiple nanopores are etched to create openings for water desalination and energy storage applications. The nanopores created are of the order of 3-5 nm, allowing multiple ions to transport through the pore. In this paper, we present, for the first time, a molecular dynamics study of single ion transport, where only one ion passes through the graphene nanopore. The diameter of the graphene nanopore is of the same order as the hydration layers formed around each ion. Behaviour analogous to single electron transport, here resulting from ionic transport, is observed for the first time. The current-voltage characteristics of such a device are similar to single electron transport in quantum dots. The current is blocked until a critical voltage, as the ions are trapped inside a hydration shell. The trapped ions face a high energy barrier compared to the applied electrical voltage, preventing the ion from breaking free of the hydration shell. This region is called the “Coulomb blockade region”. In this region, we observe zero transport of ions inside the nanopore. However, when the electrical voltage is beyond the critical voltage, the ion has sufficient energy to break free from the energy barrier created by the hydration shell and enter the pore. Thus, the input voltage can control the transport of the ion inside the nanopore. The device therefore acts as a binary storage unit, storing 0 when no ion passes through the pore and storing 1 when a single ion passes through the pore. We therefore postulate that the device can be used for fluidic computing applications in chemistry and biology, mimicking a computer. Furthermore, the trapped ion stores a finite charge in the Coulomb blockade region; hence the device also acts as a supercapacitor.
Keywords: graphene nanomembrane, single ion transport, Coulomb blockade, nanofluidics
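A toy current-voltage sketch of the blockade behaviour described above: zero ionic current below a critical voltage, roughly ohmic transport beyond it, with the binary read-out interpretation. The critical voltage, conductance, and piecewise-linear form are illustrative assumptions, not values from the molecular dynamics study.

```python
import numpy as np

def ionic_current(voltage, v_critical=0.5, conductance=1e-9):
    """Toy I-V relation for the single-ion nanopore: zero current in the
    Coulomb-blockade region, roughly ohmic transport once the applied voltage
    exceeds the critical value needed to strip the hydration shell."""
    v = np.asarray(voltage, dtype=float)
    return np.where(np.abs(v) < v_critical, 0.0,
                    conductance * (v - np.sign(v) * v_critical))

voltages = np.linspace(-1.0, 1.0, 9)
for v, i in zip(voltages, ionic_current(voltages)):
    bit = 1 if i != 0 else 0   # binary read-out: 1 = ion passes, 0 = blocked
    print(f"V = {v:+.2f} V  ->  I = {i: .2e} A  (bit {bit})")
```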
Procedia PDF Downloads 321
2199 Strengths and Weaknesses of Tally, an LCA Tool for Comparative Analysis
Authors: Jacob Seddlemeyer, Tahar Messadi, Hongmei Gu, Mahboobeh Hemmati
Abstract:
The main purpose of the first tier of this study is to quantify and compare the embodied environmental impacts associated with alternative materials applied to Adohi Hall, a residence building on the University of Arkansas campus, Fayetteville, AR. This 200,000-square-foot, 5-story building is built with mass timber and is compared to another scenario where the same edifice is built with a steel frame. Based on the defined goal and scope of the project, the materials respective to the two building options are compared in terms of Global Warming Potential (GWP), from cradle to the construction site, which includes the material manufacturing stage (raw material extraction, processing, supply, transport, and manufacture) plus transportation to the site (modules A1-A4, based on the standard EN 15804 definition). The fossil fuels consumed and the CO2 emitted in association with buildings are the major sources of their climate change impacts. In this study, GWP is primarily assessed, to the exclusion of other environmental factors. The second tier of this work is to evaluate Tally’s performance in the decision-making process through the design phases, as well as to determine its strengths and weaknesses. Tally is a Life Cycle Assessment (LCA) tool capable of conducting a cradle-to-grave analysis. As opposed to other software applications, Tally is specifically targeted at building LCA. As a peripheral application, this software tool runs directly within the core modeling platform, Revit. This unique functionality causes Tally to stand out from other similar tools for LCA analysis in the building sector. The results of this study also provide insights for making more environmentally efficient decisions in the built environment and help the move toward reducing Green House Gas (GHG) emissions and mitigating GWP.
Keywords: comparison, GWP, LCA, materials, tally
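A small sketch of the kind of cradle-to-site comparison described above: summing GWP over life-cycle modules A1-A4 for each structural scenario. The module values below are hypothetical placeholders for illustration only; they are not results from the Adohi Hall study.

```python
# Hypothetical cradle-to-site GWP figures (kg CO2-eq) per life-cycle module group,
# used only to show how the two structural scenarios would be totalled and compared.
gwp_modules = {
    "mass_timber": {"A1-A3": 1_200_000, "A4": 80_000},
    "steel_frame": {"A1-A3": 2_100_000, "A4": 60_000},
}

def total_gwp(scenario):
    """Sum the GWP over the declared modules for one building scenario."""
    return sum(gwp_modules[scenario].values())

for scenario in gwp_modules:
    print(f"{scenario}: {total_gwp(scenario):,} kg CO2-eq (A1-A4)")

difference = total_gwp("steel_frame") - total_gwp("mass_timber")
print(f"embodied-carbon difference between scenarios: {difference:,} kg CO2-eq")
```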
Procedia PDF Downloads 226
2198 Optimal Tamping for Railway Tracks, Reducing Railway Maintenance Expenditures by the Use of Integer Programming
Authors: Rui Li, Min Wen, Kim Bang Salling
Abstract:
For modern railways, maintenance is critical for ensuring safety, train punctuality, and overall capacity utilization. The cost of railway maintenance in Europe is high, on average between 30,000 and 100,000 Euros per kilometer per year. In order to reduce such maintenance expenditures, this paper presents a mixed 0-1 linear mathematical model designed to optimize predictive railway tamping activities for ballasted track over a planning horizon of three to four years. The objective function minimizes the actual costs of the tamping machine. The research uses a simple dynamic model of the condition-based tamping process and a solution method for finding the optimal condition-based tamping schedule. Seven technical and practical aspects are taken into account to schedule tamping: (1) track degradation of the standard deviation of the longitudinal level over time; (2) track geometrical alignment; (3) track quality thresholds based on the train speed limits; (4) the dependency of the track quality recovery on the track quality after the tamping operation; (5) tamping machine operation practices; (6) tamping budgets; and (7) differentiating the open track from the station sections. A Danish railway track between Odense and Fredericia, 42.6 km in length, is applied to the proposed maintenance model for time periods of three and four years. The generated tamping schedule is reasonable and robust. Based on the results from the Danish railway corridor, the total costs can be reduced significantly (by 50%) compared to a previous model based on optimizing the number of tampings. Different maintenance strategies are discussed in the paper. The analysis of the results obtained from the model also shows that a longer period of predictive tamping planning yields more optimal scheduling of maintenance actions than continuous short-term preventive maintenance, namely yearly condition-based planning.
Keywords: integer programming, railway tamping, predictive maintenance model, preventive condition-based maintenance
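A minimal 0-1 programming sketch of condition-based tamping scheduling, written with the PuLP library on invented section data. It captures only the degradation-threshold idea; the paper's full model also handles quality recovery after tamping, machine operation practices, budgets, and open-track versus station sections.

```python
import pulp

sections = ["S1", "S2", "S3"]
years = [1, 2, 3]
initial_sd = {"S1": 1.0, "S2": 1.4, "S3": 0.8}   # longitudinal-level std. dev. (mm), invented
degradation = {"S1": 0.3, "S2": 0.4, "S3": 0.2}   # mm added per year, invented
threshold = 2.0                                   # speed-dependent quality limit
tamping_cost = 1.0                                # cost per tamping operation

model = pulp.LpProblem("tamping_schedule", pulp.LpMinimize)
x = pulp.LpVariable.dicts("tamp", (sections, years), cat="Binary")

# Objective: minimise total tamping cost.
model += tamping_cost * pulp.lpSum(x[s][t] for s in sections for t in years)

# Constraint: whenever untreated degradation would exceed the quality threshold
# by year t, the section must have been tamped at least once by then.
for s in sections:
    for t in years:
        if initial_sd[s] + degradation[s] * t > threshold:
            model += pulp.lpSum(x[s][u] for u in years if u <= t) >= 1

model.solve(pulp.PULP_CBC_CMD(msg=False))
for s in sections:
    plan = [t for t in years if x[s][t].value() == 1]
    print(f"{s}: tamp in year(s) {plan or 'none'}")
```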
Procedia PDF Downloads 444
2197 Community, Identity, and Resistance in Minority Literature: Arab American Poets - Samuel Hazo, Nathalie Handal, and Naomi Shihab Nye
Authors: Reem Saad Alqahtani
Abstract:
Drawing on minority literature, this research highlights the role of three contemporary Arab American writers, considering the significance of the historical and cultural contexts of the brutal attacks of 9/11. The focus of the research is to draw attention to the poetry of Samuel Hazo, Nathalie Handal, and Naomi Shihab Nye as representatives of the identity crisis, whose experiences left them feeling marginalized and alienated in both societies and reflected as one of the ethnic American minority groups, as demonstrated in their poetry, with a special focus on hybridity, resistance, identity, and empowerment. The study explores the writers’ post-9/11 experience, affected by the United States’ long history of marginalization and discrimination against people of colour, placing Arab American literature with that of other ethnic American groups who share the same experience and contribute to composing literature characterized by the aesthetics of cultural hybridity, cultural complexity, and the politics of minorities to promote solidarity and coalition building. Indeed, the three selected Arab American writers have found a link between their narration and the identity of the exiled by establishing an identity that is a kind of synthesis of the diverse identities of Western reality and Eastern nostalgia. The approaches applied in this study include historical/biographical, postcolonial, and discourse analysis. The first is used to emphasize the influence of biographical aspects related to the community, identity, and resistance of the three poets on their poetry. The second is used to investigate the effects of postcolonialism on the poets and their responses to it, while the third is used to understand the sociocultural, political, and historical dimensions of the texts, establishing these poets as representative of the Arab American experience. This study is significant because it will help shed light on the importance of the Arabic hybrid identity in creating resistance to minority communities within American society.
Keywords: Arab American, identity, hybridity, post-9/11
Procedia PDF Downloads 168
2196 Effect of a Newly Released Bio-Organic Fertilizer in Improving Tomato Growth in Hydroponic System and Under Greenhouse
Authors: Zayneb Kthiri, Walid Hamada
Abstract:
The application of organic fertilizers is generally known to be useful for sustaining soil fertility and plant growth, especially in poor soils with less than 1% organic matter, as is very common in Tunisian fields. Therefore, we focused on evaluating the effect of a newly released liquid organic fertilizer named Solorga (with 5% organic matter), compared to a reference product (Espartan: Kimitec, Spain), on tomato plant growth and physiology. Both fertilizers, derived from plant decomposition, were applied at an early stage in a hydroponic system and under greenhouse conditions. In the hydroponic system, after 14 days of application by root feeding, a significant difference was observed between treatments. Indeed, Solorga improved shoot length, root length, and biomass by 45%, 27%, and 27.8%, respectively, compared to control plants, whereas Espartan improved the measured parameters to a lesser extent compared to the untreated control. Moreover, Solorga significantly increased the chlorophyll content by 42% compared to the control and by 32% compared to Espartan. In the greenhouse, after 20 days of treatment, the results showed a significant effect of both fertilizers on the SPAD index and the number of flower blossoms. Solorga increased the amount of chlorophyll present in the leaf by 7% compared to Espartan, as well as the plant height under greenhouse conditions. Moreover, the number of flower blossoms increased by 15% in plants treated with Solorga compared to Espartan, whereas there was no notable difference between the two organic fertilizers in fruit blossoms and the number of fruits per blossom. In conclusion, even though there is a difference in organic matter content between the two fertilizers, Solorga improved plant growth better than Espartan under controlled conditions in the hydroponic system. Altogether, the results obtained are encouraging for the use of Solorga as a soil-enriching source of organic matter to help plants boost their growth and overcome abiotic stresses linked to soil fertility.
Keywords: tomato, plant growth, organic fertilizer, hydroponic system, greenhouse
Procedia PDF Downloads 139
2195 Effects of Earthquake Induced Debris to Pedestrian and Community Street Network Resilience
Authors: Al-Amin, Huanjun Jiang, Anayat Ali
Abstract:
Reinforced concrete (RC) frames, especially ordinary RC frames, are prone to structural failure and collapse during seismic events, producing a large proportion of debris from the structures, which obstructs adjacent areas, including streets. These blocked areas severely impede post-earthquake resilience. This study uses computational simulation (FEM) to investigate the amount of debris generated by the seismic collapse of an ordinary reinforced concrete moment frame building and its effects on the adjacent pedestrian and road network. A three-story ordinary reinforced concrete frame building, primarily designed for gravity load and earthquake resistance, was selected for analysis. Sixteen different ground motions were applied and scaled up until the total collapse of the tested building to evaluate the failure mode under various seismic events. Four types of collapse direction were identified through the analysis, namely aligned (positive and negative) and skewed (positive and negative), with aligned collapse being more predominant than skewed cases. The amount and distribution of debris around the collapsed building were assessed to investigate the interaction between collapsed buildings and adjacent street networks. An interaction was established between a building that collapsed in an aligned direction and the adjacent pedestrian walkway and narrow street located in an unplanned old city. The FEM model was validated against an existing shaking table test. The presented results can be utilized to simulate the interdependency between the debris generated from the collapse of seismic-prone buildings and the resilience of street networks. These findings provide insights for better disaster planning and resilient infrastructure development in earthquake-prone regions.
Keywords: building collapse, earthquake-induced debris, ORC moment resisting frame, street network
Procedia PDF Downloads 85
2194 Semantic Search Engine Based on Query Expansion with Google Ranking and Similarity Measures
Authors: Ahmad Shahin, Fadi Chakik, Walid Moudani
Abstract:
Our study is about elaborating a potential solution for a search engine that involves semantic technology to retrieve information and display it meaningfully. Semantic search engines are not widely used over the web, as the majority are still in the beta stage or under construction. Many problems face current applications in semantic search; the major problem is to analyze and calculate the meaning of a query in order to retrieve relevant information. Another problem is the ontology-based index and its updates. Ranking results according to concept meaning and its relation to the query is another challenge. In this paper, we offer a light meta-engine (QESM) which uses Google search, and therefore Google’s index, with some adaptations to its returned results by adding multi-query expansion. The mission was to find a reliable ranking algorithm that involves semantics and uses concepts and meanings to rank results. At the beginning, the engine finds synonyms of each query term entered by the user, based on a lexical database. Then, query expansion is applied to generate different semantically analogous sentences. These are generated randomly by combining the found synonyms and the original query terms. Our model suggests the use of semantic similarity measures between two sentences. Practically, we used this method to calculate the semantic similarity between each query and the description of each page’s content generated by Google. The generated sentences are sent to the Google engine one by one and ranked again all together with the adapted ranking method (QESM). Finally, our system places Google pages with higher similarities at the top of the results. We have conducted experiments with 6 different queries. We observed that most results ranked with QESM were altered with respect to Google’s originally generated pages. With the experimented queries, QESM frequently achieves better accuracy than Google; in some worst cases, it behaves like Google.
Keywords: semantic search engine, Google indexing, query expansion, similarity measures
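A minimal sketch of the QESM idea: expand the query with synonyms, score each candidate page description against the expanded queries, and re-rank. The hand-written synonym table stands in for the lexical database, the token-overlap measure stands in for the sentence-level semantic similarity used in the paper, and the static candidate pages stand in for live Google results.

```python
from itertools import product

SYNONYMS = {"cheap": ["inexpensive", "affordable"], "laptop": ["notebook"]}

def expand(query):
    """Generate semantically analogous query variants by swapping in synonyms."""
    options = [[term] + SYNONYMS.get(term, []) for term in query.lower().split()]
    return [" ".join(combo) for combo in product(*options)]

def similarity(a, b):
    """Token-overlap (Jaccard) similarity as a placeholder semantic measure."""
    sa, sb = set(a.lower().split()), set(b.lower().split())
    return len(sa & sb) / len(sa | sb) if sa | sb else 0.0

def rerank(query, pages):
    """Score each (url, description) pair by its best match to any expanded query."""
    variants = expand(query)
    scored = [(max(similarity(v, desc) for v in variants), url) for url, desc in pages]
    return sorted(scored, reverse=True)

pages = [("example.com/a", "affordable notebook deals and reviews"),
         ("example.com/b", "history of mainframe computers")]
for score, url in rerank("cheap laptop", pages):
    print(f"{score:.2f}  {url}")
```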
Procedia PDF Downloads 425
2193 Process of Analysis, Evaluation and Verification of the 'Real' Redevelopment of the Public Open Space at the Neighborhood’s Stairs: Case Study of Serres, Greece
Authors: Ioanna Skoufali
Abstract:
The present study is directed towards adaptation to climate change, closely related to the phenomenon of the urban heat island (UHI). This issue is widespread and common to different urban realities, particularly in Mediterranean cities characterized by a dense urban fabric. The attention of this work on the redevelopment of open space is focused on mitigation techniques aimed at solving local problems, such as microclimatic parameters and the conditions of thermal comfort in summer, related to urban morphology. This quantitative analysis, evaluation, and verification survey involves the methodological elaboration applied to a real case study in Serres, with the experimental support of the ENVI-met Pro V4.1 and BioMet software, developed: i) in two phases concerning the ante-operam (phase a1 # 2013) and the post-operam (phase a2 # 2016); ii) in scenario A (+25% of green # 2017). The first study identifies the main intervention strategies, namely: the application of cool pavements, the increase of green surfaces, the creation of water surfaces, and external fans; moreover, it achieves the minimum results set by the National Program 'Bioclimatic improvement project for public open space', EPPERAA (ESPA 2007-2013), related to the four environmental parameters illustrated below: TAir = 1.5 °C, TSurface = 6.5 °C, CDH = 30%, and PET = 20%. In addition, the second study proposes a greater potential for improvement than the post-operam intervention by increasing the vegetation within the district towards the SW/SE. The final objective of this in-depth design is to be transferable to homogeneous cases of urban regeneration processes, with evident effects on the efficiency of microclimatic mitigation and thermal comfort.
Keywords: cool pavements, microclimate parameters (TAir, Tsurface, Tmrt, CDH), mitigation strategies, outdoor thermal comfort (PET & UTCI)
Procedia PDF Downloads 202
2192 A Proposed Optimized and Efficient Intrusion Detection System for Wireless Sensor Network
Authors: Abdulaziz Alsadhan, Naveed Khan
Abstract:
In recent years, intrusions on computer networks have become a major security threat. Hence, it is important to impede such intrusions. Hindering such intrusions relies entirely on their detection, which is the primary concern of any security tool like an Intrusion Detection System (IDS). Therefore, it is imperative to detect network attacks accurately. Numerous intrusion detection techniques are available, but the main issue is their performance. The performance of an IDS can be improved by increasing the accurate detection rate and reducing false positives. The existing intrusion detection techniques have the limitation of using the raw data set for classification. The classifier may get confused due to redundancy, which results in incorrect classification. To minimize this problem, Principal Component Analysis (PCA), Linear Discriminant Analysis (LDA), and Local Binary Pattern (LBP) can be applied to transform raw features into a principal feature space and select the features based on their sensitivity. Eigenvalues can be used to determine the sensitivity. To further refine the selected features, greedy search, backward elimination, and Particle Swarm Optimization (PSO) can be used to obtain a subset of features with optimal sensitivity and the highest discriminatory power. This optimal feature subset is used to perform classification. For classification purposes, Support Vector Machine (SVM) and Multilayer Perceptron (MLP) are used due to their proven ability in classification. The Knowledge Discovery and Data Mining (KDD’99) cup dataset was considered as a benchmark for evaluating security detection mechanisms. The proposed approach can provide an optimal intrusion detection mechanism that outperforms the existing approaches and has the capability to minimize the number of features and maximize the detection rates.
Keywords: Particle Swarm Optimization (PSO), Principal Component Analysis (PCA), Linear Discriminant Analysis (LDA), Local Binary Pattern (LBP), Support Vector Machine (SVM), Multilayer Perceptron (MLP)
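A minimal sketch of the feature-reduction and classification stage described above, shown with PCA followed by an SVM on synthetic data. The full approach also evaluates LDA, LBP, PSO-based feature selection, and an MLP on the KDD'99 dataset; the data, feature counts, and component counts here are illustrative assumptions.

```python
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Synthetic stand-in for labelled connection records (0 = normal, 1 = attack).
X, y = make_classification(n_samples=2000, n_features=40, n_informative=12,
                           random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2,
                                                    random_state=0)

# Project raw features into a principal-feature space, then classify with an SVM.
model = make_pipeline(StandardScaler(), PCA(n_components=10), SVC(kernel="rbf"))
model.fit(X_train, y_train)
print(f"detection accuracy on held-out data: {model.score(X_test, y_test):.3f}")
```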
Procedia PDF Downloads 367
2191 Agile Software Effort Estimation Using Regression Techniques
Authors: Mikiyas Adugna
Abstract:
Effort estimation is among the activities carried out in software development processes, and an accurate estimation model leads to project success. Agile effort estimation is a complex task because of the dynamic nature of software development, and researchers are still conducting studies on agile effort estimation to enhance prediction accuracy. For these reasons, we investigated and proposed a model based on LASSO and Elastic Net regression to enhance estimation accuracy. The proposed model has major components: preprocessing, train-test split, training with default parameters, and cross-validation. During the preprocessing phase, the entire dataset is normalized. After normalization, a train-test split is performed on the dataset, setting the training set at 80% and the testing set at 20%. The two regression algorithms (Elastic Net and LASSO) are trained in two different phases following the train-test split. In the first phase, the two algorithms are trained using their default parameters and evaluated on the testing data. In the second phase, the grid search technique (the grid is used to tune and select optimum parameters) and 5-fold cross-validation are used to obtain the final trained model. Finally, the final trained model is evaluated using the testing set. The experimental work is applied to an agile story point dataset of 21 software projects collected from six firms. The results show that both Elastic Net and LASSO regression outperformed the compared methods. Of the two proposed algorithms, LASSO regression achieved better predictive performance, acquiring PRED (8%) and PRED (25%) results of 100.0, MMRE of 0.0491, MMER of 0.0551, MdMRE of 0.0593, MdMER of 0.063, and MSE of 0.0007. The results imply that the trained LASSO regression model is the most acceptable and achieves higher estimation performance than exists in the literature.
Keywords: agile software development, effort estimation, elastic net regression, LASSO
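A sketch of the two training phases described above, using synthetic data in place of the 21-project story-point dataset: default-parameter models first, then a grid search with 5-fold cross-validation. The grid values and dataset shape are illustrative assumptions.

```python
from sklearn.datasets import make_regression
from sklearn.linear_model import ElasticNet, Lasso
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.preprocessing import MinMaxScaler

X, y = make_regression(n_samples=120, n_features=6, noise=5.0, random_state=1)
X = MinMaxScaler().fit_transform(X)                 # normalise the entire dataset
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2,
                                                    random_state=1)

# Phase 1: default parameters, evaluated on the testing data.
for name, reg in [("LASSO", Lasso()), ("ElasticNet", ElasticNet())]:
    reg.fit(X_train, y_train)
    print(f"{name} (defaults) R^2 = {reg.score(X_test, y_test):.3f}")

# Phase 2: grid search with 5-fold cross-validation to obtain the final model.
grid = GridSearchCV(ElasticNet(),
                    {"alpha": [0.01, 0.1, 1.0], "l1_ratio": [0.2, 0.5, 0.8]},
                    cv=5)
grid.fit(X_train, y_train)
print(f"tuned ElasticNet R^2 = {grid.score(X_test, y_test):.3f}, "
      f"best params = {grid.best_params_}")
```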
Procedia PDF Downloads 71
2190 Mobile Application Interventions in Positive Psychology: Current Status and Recommendations for Effective App Design
Authors: Gus Salazar, Jeremy Bekker, Lauren Linford, Jared Warren
Abstract:
Positive psychology practices allow its principles to be applied to all people, regardless of their current level of functioning. To increase the dissemination of these practices, interventions are being adapted for use with digital technology, such as mobile apps. However, research regarding positive psychology mobile app interventions is still in its infancy. In an effort to facilitate progress in this important area, we 1) conducted a qualitative review to summarize the current state of the positive psychology mobile app literature and 2) developed research-supported recommendations for positive psychology app development to maximize behavior change. In our literature review, we found that while positive psychology apps varied widely in content and purpose, there was a near-complete lack of research supporting their effectiveness. Most apps provided no rationale for the behavioral change techniques (BCTs) they employed, and most were not developed with specific theoretical frameworks or design models in mind. Given this problem, we recommend four steps for effective positive psychology app design. First, developers must establish their app in a research-supported theory of change. Second, researchers must select appropriate behavioral change techniques which are consistent with their app’s goals. Third, researchers must leverage effective design principles. These steps will help mobile applications use data-driven methods for encouraging behavior change in their users. Lastly, we discuss directions for future research. In particular, researchers must investigate the effectiveness of various BCTs in positive psychology interventions. Although there is some research on this point, we do not yet clearly understand the mechanisms within the apps that lead to behavior change. Additionally, app developers must also provide data on the effectiveness of their mobile apps. As developers follow these steps for effective app development and as researchers continue to investigate what makes these apps most effective, we will provide millions of people in need with access to research-based mental health resources.
Keywords: behavioral change techniques, mobile app, mobile intervention, positive psychology
Procedia PDF Downloads 224
2189 The Impact of Oxytetracycline on the Aquaponic System, Biofilter, and Plants
Authors: Hassan Alhoujeiri, Angele Matrat, Sandra Beaufort, Claire joaniss Cassan, Jerome Silvester
Abstract:
Aquaponics is a sustainable food production technology, and its transition to industrial-scale systems has created several challenges that require further investigation in order to make it a robust process. One of the critical concerns is the potential accumulation of compounds from veterinary treatments, phytosanitary agents, fish feed, or simply from contaminated water sources. The accumulation of these substances could negatively impact fish health, microbial biofilters, and plant growth, thereby disrupting the system’s overall balance and functionality. The lack of legislation and knowledge regarding the presence of such compounds in aquaponic systems raises concerns about their potential impact on both system balance and food safety. In this study, we focused on the effects of oxytetracycline (OTC), an antibiotic commonly used in aquaculture, on both the microbial biofilter and plant growth. Although OTC is rarely applied in aquaponics today, the fish compartment may need to be isolated from the system during treatment, as OTC inhibits specific bacterial populations, which could affect the microbial biofilter's efficiency. However, questions remain about the aquaponic system's tolerance threshold, particularly in cases of treatment or residual OTC traces post-treatment. The results of this study indicated a decline in microbial biofilter activity to 20% compared to the control, potentially corresponding to treatments of 41 mg/L of OTC. Analysis of microbial populations in the biofilter, using flow cytometry and microscopy (confocal and scanning electron microscopy), revealed an increase in bacterial mortality without disruption of the microbial biofilm. Additionally, OTC exposure led to noticeable changes in plant morphology (e.g., color) and growth, though it did not fully inhibit development. However, no significant effects were observed on seed germination at the tested concentrations, despite a measurable impact on subsequent plant growth.
Keywords: aquaponic, oxytetracycline, nitrifying biofilter, plant, micropollutants, sustainability
Procedia PDF Downloads 20
2188 Establishing a Surrogate Approach to Assess the Exposure Concentrations during Coating Process
Authors: Shan-Hong Ying, Ying-Fang Wang
Abstract:
A surrogate approach was deployed for assessing exposures to multiple chemicals in the selected working area of coating processes and was applied to assess the exposure concentrations of similar exposed groups using the same chemicals but different formula ratios. For the selected area, 6 to 12 portable photoionization detectors (PIDs) were placed uniformly in the workplace to measure total VOC concentrations (CT-VOCs) for 6 randomly selected workshifts. Simultaneously, one sampling train was placed beside one of these portable PIDs, and the collected air sample was analyzed for the individual concentrations (CVOCi) of 5 VOCs (xylene, butanone, toluene, butyl acetate, and dimethylformamide). Predictive models were established by relating the CT-VOCs to the CVOCi of each individual compound via simple regression analysis. The established predictive models were employed to predict each CVOCi based on the measured CT-VOC for each similar working area using the same portable PID. Results show that the predictive models obtained from simple linear regression analyses had an R2 = 0.83~0.99, indicating that CT-VOCs were adequate for predicting CVOCi. In order to verify the validity of the exposure prediction model, sampling analysis of the above chemical substances was further carried out, and the correlation between the measured value (Cm) and the predicted value (Cp) was analyzed. A good correlation was found between the predicted and measured values of each measured chemical substance (R2 = 0.83~0.98). Therefore, the surrogate approach could be used to assess the exposure concentrations of similar exposed groups using the same chemicals but different formula ratios. However, it is recommended to establish the prediction model between the chemical substances belonging to each coater and the direct-reading PID, which is more representative of the real exposure situation and estimates the long-term exposure concentrations of operators more accurately.
Keywords: exposure assessment, exposure prediction model, surrogate approach, TVOC
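A sketch of the surrogate approach described above: fit a simple linear model relating total VOC readings (CT-VOC) from the portable PID to the analysed concentration of one compound (CVOCi), then use it to predict that compound for a similar exposed group. The numbers are illustrative, not measured values from the study.

```python
import numpy as np
from scipy import stats

ct_voc = np.array([1.2, 2.5, 3.1, 4.8, 6.0, 7.4])      # total VOCs from PID (ppm), invented
c_toluene = np.array([0.3, 0.6, 0.8, 1.2, 1.5, 1.9])    # analysed toluene (ppm), invented

fit = stats.linregress(ct_voc, c_toluene)
print(f"C_toluene = {fit.slope:.3f} * CT-VOC + {fit.intercept:.3f}, "
      f"R^2 = {fit.rvalue**2:.2f}")

# Predict the toluene concentration for a new PID reading from a similar group.
new_reading = 5.2
print(f"predicted toluene at CT-VOC = {new_reading}: "
      f"{fit.slope * new_reading + fit.intercept:.2f} ppm")
```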
Procedia PDF Downloads 150
2187 Effect of Nitrogen and/or Bio-Fertilizer on the Yield, Total Flavonoids, Carbohydrate Contents, Essential Oil Quantity and Constituents of Dill Plants
Authors: Mohammed S. Aly, Abou-Zeid N. El-Shahat, Nabila Y. Naguib, Huussie A. Said-Al Ahl, Atef M. Zakaria, Mohamed A. Abou Dahab
Abstract:
This study was conducted during the two successive seasons of 2000/2001 and 2001/2002 to evaluate the response of Anethum graveolens L. plants to nitrogen fertilizer, with or without bio-fertilizer, in terms of fruit yield, total flavonoid and carbohydrate contents, and essential oil yield and constituents. Results showed that the treatment of 60 kg N/feddan, without and with bio-fertilizer, gave the highest number of umbels per plant throughout the two seasons, and these increments were significant in comparison with control plants. Meanwhile, fruit weight (g/plant) showed a significant increase with the nitrogen fertilizer treatments, alone and combined with bio-fertilizers, compared with control plants in the first and second seasons; maximum increments resulted from the previous treatment (60 kg N/fed). Fruit yield (kg/fed) followed the same trend as fruit weight (g/plant). Total flavonoid contents were significantly increased with all of the treatments used, with the maximum increase observed with bio-fertilizers combined with 60 kg N/fed during the two seasons. Total carbohydrate contents showed a significant increase with the nitrogen fertilizer treatments applied alone, whereas they increased non-significantly with the other treatments during the two seasons in comparison with the control plants. The bio-fertilizer treatment and most of the nitrogen fertilizer levels significantly increased essential oil percentage, content, and yield, with the treatment of 60 kg N/fed, with or without bio-fertilizer, giving the best values. All identified compounds were observed in the essential oil of all treatments; the major compounds were limonene, carvone, and dillapiole. The most effective fertilization for limonene content was 40 kg N/fed and/or bio-fertilizers, while 20 kg N/fed with or without bio-fertilizers increased carvone, and most fertilization treatments, except those of bio-fertilizers and 40 kg N/fed, increased dillapiole content.
Keywords: carbohydrates, dill, essential oil, fertilizer, flavonoids
Procedia PDF Downloads 419
2186 Orange Fleshed Sweet Potato Response to Filter Cake and Macadamia Husk Compost in Two Agro-Ecologies of KwaZulu-Natal, South Africa
Authors: Kayode Fatokun, Nozipho N. Motsa
Abstract:
Field experiments were carried out during the summer/autumn (first trial) and winter/spring (second trial) seasons of 2019 and 2021 in the Dlangubo, Ngwelezane, and Mtubatuba areas of the KwaZulu-Natal Province of South Africa to study the drought amelioration effects and impact of 2 locally available organic wastes [filter cake (FC) and macadamia husk compost (MHC)] on the productivity and physiological responses of 4 orange-fleshed sweet potato (OFSP) cultivars (Buregard cv., Impilo, W-119, and 199062.1). The effects of FC and MHC were compared with those of inorganic fertilizer (IF) [2:3:2 (30)], FC+IF, MHC+IF, and a control. The soil amendments were applied in the first trials only. Climatic data such as humidity, temperature, and rainfall were obtained via remote sensing. The results of the first trial indicated that filter cake and IF performed significantly better than MHC. The strength of filter cake may be attributable to its rich array of mineral nutrients such as calcium, magnesium, potassium, sodium, zinc, copper, manganese, iron, and phosphorus, while the modest performance of MHC may be attributable to its water-holding capacity. Also, a positive correlation occurred between the yield of the test OFSP cultivars and climatic factors such as rainfall, NDVI, and NDWI values. Whereas the inorganic fertilizer did not have any significant effect on the growth and productivity of any of the tested sweet potato cultivars in the second trial, FC and MHC largely maintained their significant performance. In conclusion, the use of FC is highly recommended in the production of the test orange-fleshed sweet potato cultivars. The study also indicated that both FC and MHC may not only supply the needed plant nutrients but also have the capacity to reduce the impact of drought on the growth of the test cultivars. These findings are of great value to farmers, especially resource-poor ones.
Keywords: amendments, drought, filter cake, macadamia husk compost, sweet potato
Procedia PDF Downloads 98
2185 Non-Cognitive Skills Associated with Learning in a Serious Gaming Environment: A Pretest-Posttest Experimental Design
Authors: Tanja Kreitenweis
Abstract:
Lifelong learning is increasingly seen as essential for coping with the rapidly changing work environment. To this end, serious games can provide convenient and straightforward access to complex knowledge for all age groups. However, learning achievements depend largely on a learner’s non-cognitive skill disposition (e.g., motivation, self-belief, playfulness, and openness). With the aim of combining the fields of serious games and non-cognitive skills, this research focuses in particular on the use of a business simulation which conveys change management insights. Business simulations are a subset of serious games and are perceived as a non-traditional learning method. The objectives of this work are threefold: (1) developing a scale which measures learners’ knowledge and skill level before and after a business simulation is played, (2) investigating the influence of non-cognitive skills on learning in this business simulation environment, and (3) exploring the moderating role of team preference in this type of learning setting. First, expert interviews were conducted to develop an appropriate measure for assessing learners’ skills and knowledge. A pretest-posttest experimental design with German management students was then implemented to address the remaining objectives. Using the newly developed, reliable measure, it was found that students’ skills and knowledge state was higher after the simulation had been played than before. A hierarchical regression analysis revealed two positive predictors of this outcome: motivation and self-esteem. Unexpectedly, playfulness had a negative impact. Team preference strengthened the links between grit and playfulness, respectively, and learners’ skills and knowledge state after completing the business simulation. Overall, the data underlined the potential of business simulations to improve learners’ skills and knowledge state. In addition, motivational factors were found to predict benefitting most from the applied business simulation. Recommendations are provided for how pedagogues can use these findings.
Keywords: business simulations, change management, (experiential) learning, non-cognitive skills, serious games
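A sketch of the kind of regression and moderation analysis named above, run on simulated pretest-posttest data; variable names, effect sizes, and the simulated data are invented and do not reproduce the study's results.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 150
df = pd.DataFrame({
    "motivation": rng.normal(size=n),
    "self_esteem": rng.normal(size=n),
    "playfulness": rng.normal(size=n),
    "grit": rng.normal(size=n),
    "team_pref": rng.integers(0, 2, size=n),          # 0 = solo, 1 = team preference
})
# Simulated post-simulation knowledge state with an invented moderation effect.
df["post_score"] = (0.5 * df.motivation + 0.4 * df.self_esteem
                    - 0.2 * df.playfulness + 0.3 * df.grit * df.team_pref
                    + rng.normal(scale=0.5, size=n))

# Non-cognitive predictors of the post-simulation outcome.
main = smf.ols("post_score ~ motivation + self_esteem + playfulness", data=df).fit()
# Moderation: does team preference strengthen the grit -> outcome link?
moderation = smf.ols("post_score ~ grit * team_pref", data=df).fit()
print(main.params.round(2))
print(moderation.params.round(2))
```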
Procedia PDF Downloads 108
2184 Wasting Human and Computer Resources
Authors: Mária Csernoch, Piroska Biró
Abstract:
The legends about “user-friendly” and “easy-to-use” birotical tools (computer-related office tools) have been spreading and misleading end-users. This approach has led to an extremely high number of incorrect documents, causing serious financial losses in the creating, modifying, and retrieving processes. Our research proved that there are at least two sources of this underachievement: (1) The lack of a definition of correctly edited, formatted documents. Consequently, end-users do not know whether their methods and results are correct or not; they are not aware of their ignorance, and that ignorance does not allow them to realize their lack of knowledge. (2) The end-users’ problem-solving methods. We have found that in non-traditional programming environments end-users apply, almost exclusively, surface approach metacognitive methods to carry out their computer-related activities, which have proved less effective than deep approach methods. Based on these findings, we have developed deep approach methods which are based on and adapted from traditional programming languages. In this study, we focus on the most popular type of birotical documents, text-based documents. We have provided the definition of correctly edited text and, based on this definition, adapted the debugging method known from programming. According to the method, before real text editing is carried out, a thorough debugging of already existing texts and a categorization of errors are performed. With this method, in advance of real text editing, users learn the requirements of text-based documents and of correctly formatted text. The method has proved much more effective than the previously applied surface approach methods. The advantages of the method are that real text handling requires far fewer human and computer resources than clicking aimlessly in the GUI (Graphical User Interface), and data retrieval is much more effective than from error-prone documents.
Keywords: deep approach metacognitive methods, error-prone birotical documents, financial losses, human and computer resources
Procedia PDF Downloads 382
2183 Passenger Preferences on Airline Check-In Methods: Traditional Counter Check-In Versus Common-Use Self-Service Kiosk
Authors: Cruz Queen Allysa Rose, Bautista Joymeeh Anne, Lantoria Kaye, Barretto Katya Louise
Abstract:
The study presents the preferences of passengers regarding the quality of service provided by the two airline check-in methods currently present in airports: traditional counter check-in and common-use self-service kiosks. A previous study has shown that airlines perceive self-service kiosks alone as sufficient to ensure adequate service and customer satisfaction, while, in contrast, agents and passengers stated that kiosks alone are not enough and that human interaction is essential. In reference to former studies that established opposing ideas about which airline check-in method is more favorable to employ, it is the purpose of this study to present a recommendation that fills in the gap between these conflicting ideas by comparing the perceived quality of service through the RATER model. Furthermore, this study discusses the major competencies present in each method, which are supported by two theories: the FIRO Theory of Needs, upholding the importance of inclusion, control, and affection, and the Queueing Theory, which points out the discipline of passengers and the length of the queue line as important factors affecting quality service. The findings of the study were based on data gathered by the researchers from selected Thomasian third-year and fourth-year college students enrolled in the first semester of the academic year 2014-2015 who had already experienced both airline check-in methods, selected through stratified probability sampling. The statistical treatments applied to interpret the data were mean, frequency, standard deviation, t-test, logistic regression, and chi-square test. The study ultimately revealed a greater effect on passenger preference concerning the satisfaction experienced with common-use self-service kiosks in comparison with the traditional counter check-in.
Keywords: traditional counter check-in, common-use self-service Kiosks, airline check-in methods
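A small sketch of two of the statistical treatments named above (t-test and chi-square), applied to simulated RATER-style satisfaction ratings and preference counts; all values are invented for illustration.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
counter_ratings = rng.normal(loc=3.8, scale=0.6, size=120)   # traditional counter check-in
kiosk_ratings = rng.normal(loc=4.1, scale=0.6, size=120)     # common-use self-service kiosk

t_stat, p_val = stats.ttest_ind(counter_ratings, kiosk_ratings)
print(f"t-test: t = {t_stat:.2f}, p = {p_val:.4f}")

# Chi-square on stated preference (rows: year level, columns: counter vs. kiosk).
preference_table = np.array([[45, 75],    # third-year students
                             [50, 70]])   # fourth-year students
chi2, p, dof, _ = stats.chi2_contingency(preference_table)
print(f"chi-square: chi2 = {chi2:.2f}, dof = {dof}, p = {p:.4f}")
```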
Procedia PDF Downloads 406
2182 The Effectiveness of Synthesizing A-Pillar Structures in Passenger Cars
Authors: Chris Phan, Yong Seok Park
Abstract:
The Toyota Camry is one of the best-selling cars in America. It is economical, reliable, and, most importantly, safe. These attributes allowed the Camry to be a trustworthy choice when choosing a dependable vehicle. However, a new finding brought the Camry’s safety into question. Since 1997, the Camry had received a “good” rating on its moderate overlap front crash test from the Insurance Institute for Highway Safety. In 2012, the Insurance Institute for Highway Safety introduced a frontal small overlap crash test into its overall evaluation of vehicle occupant safety. The 2012 Camry received a “poor” rating on this new test, while the 2015 Camry redeemed itself with a “good” rating once again. This study aims to find a possible solution that Toyota implemented to reduce the severity of a frontal small overlap crash in the Camry during a mid-cycle update. The purpose of this study is to analyze and evaluate the performance of various A-pillar shapes as energy-absorbing structures in improving passenger safety in a frontal crash. First, the A-pillar structures of the 2012 and 2015 Camry were modeled using CAD software, namely SolidWorks. Then, a crash test simulation using ANSYS software was applied to the A-pillars to analyze the behavior of the structures under similar conditions. Finally, the results were compared to safety values of cabin intrusion to determine the crashworthy behaviors of both A-pillar structures by measuring total deformation. This study highlights that it is possible that Toyota improved the shape of the A-pillar in the 2015 Camry in order to receive a “good” rating from the IIHS safety evaluation once again. These findings can possibly be used to increase safety performance in future vehicles and decrease passenger injury or fatality.
Keywords: A-pillar, Crashworthiness, Design Synthesis, Finite Element Analysis
Procedia PDF Downloads 119
2181 Investigation of the Effect of Eye Exercises and Convergence Exercise on Visual Acuity in School-Age Children with Hypermetropia
Authors: Gulay Aras, Isil Kutluturk Karagoz, Z. Candan Algun
Abstract:
Background: Hypermetropia in school-age children is a pathology that responds to treatment. In the literature, there has been no study of exercise practice in hypermetropia treatment. Objective: The purpose of this study was to investigate the effects of eye exercises and convergence exercise on visual acuity in school-age children with hypermetropia. Methods: Forty volunteer school-age children with hypermetropia (30 girls, 30 boys, between 7-17 years of age) were included in the study. Sociodemographic information and clinical characteristics were evaluated. The 40 participants were randomly divided into two groups: eye exercises and convergence exercises. Home exercise protocols were given to both groups for six weeks, and regular phone calls were made once a week. Individuals performed eye exercises 10 times and convergence exercises for 5 min, in two sessions per day, for six weeks. The right and left eyes of all subjects participating in the study were assessed separately by the eye doctor with a Snellen chart. The participants' quality of life was assessed using the Pediatric Quality of Life Inventory Version 4.0; the physical health total score (PHTS), the psychosocial health total score (PSHTS), obtained by evaluating school, emotional, and social functioning, and the scale total score (STS) were calculated separately. At the end of the exercise program, the assessment tests applied at the beginning of the study were reapplied to all individuals. Results: There was no statistically significant difference between the pre- and post-intervention Snellen chart measurements and quality of life in the eye exercises group (p > 0.05). There was a statistically significant difference in the visual acuity of the right and left eyes (p = 0.004, p = 0.014) and in quality of life, in terms of PHTS, PSHTS, and STS, in the convergence exercise group (p = 0.001, p = 0.017, p = 0.001). Conclusions: In school-age children, convergence exercises were found to be effective on visual acuity and health-related quality of life. Convergence exercises are recommended for the treatment of school-aged children with hypermetropia.
Keywords: convergence exercise, eye exercises, hypermetropia, school-age children
Procedia PDF Downloads 250
2180 Evaluating Gene-Gene Interaction among Nicotine Dependence Genes on the Risk of Oral Clefts
Authors: Mengying Wang, Dongjing Liu, Holger Schwender, Ping Wang, Hongping Zhu, Tao Wu, Terri H Beaty
Abstract:
Background: Maternal smoking is a recognized risk factor for nonsyndromic cleft lip with or without cleft palate (NSCL/P). It has been reported that the effect of maternal smoking on oral clefts is mediated through genes that influence nicotine dependence. The polymorphisms of cholinergic receptor nicotinic alpha (CHRNA) and beta (CHRNB) subunit genes have previously shown strong associations with nicotine dependence. Here, we attempted to investigate whether the above genes are associated with clefting risk by testing for potential gene-gene (G×G) and gene-environment (G×E) interaction. Methods: We selected 120 markers in 14 genes associated with nicotine dependence to conduct transmission disequilibrium tests among 806 Chinese NSCL/P case-parent trios ascertained in an international consortium which conducted a genome-wide association study (GWAS) of oral clefts. We applied Cordell’s method using the “TRIO” package in R to explore G×G as well as G×E interaction involving environmental tobacco smoke (ETS), based on a conditional logistic regression model. Results: While no SNP showed significant association with NSCL/P after Bonferroni correction, we found signals for G×G interaction between 10 pairs of SNPs in CHRNA3, CHRNA5, and CHRNB4 (p < 10-8), among which the most significant interaction was found between RS3743077 (CHRNA3) and RS11636753 (CHRNB4, p < 8.2×10-12). Linkage disequilibrium (LD) analysis revealed only a low level of LD between these markers. However, there were no significant results for G×ETS interaction. Conclusion: This study fails to detect association between nicotine dependence genes and NSCL/P but illustrates the importance of taking potential G×G interaction into account in genetic association analysis of NSCL/P. This study also suggests nicotine dependence genes should be considered important candidate genes for NSCL/P in future studies.
Keywords: Gene-Gene Interaction, Maternal Smoking, Nicotine Dependence, Non-Syndromic Cleft Lip with or without Cleft Palate
Procedia PDF Downloads 337
2179 Identifying and Prioritizing Critical Success Factors (CSFs) in Retaining and Developing Knowledge Workers in Oil and Gas Project-Based Companies
Authors: Ehsan Samimi, Mohammaa Ali Shahosseeni, Ali Abasltian, Shahriar Shafaghi
Abstract:
Background/Objectives: Voluntary turnover and early retirement requests by specialists and experienced people in project-based organizations (PBOs) have caused many problems in finding suitable experts to execute projects. Methods/Statistical analysis: The present study is a descriptive and applied research. The research population consists of knowledge workers (KWs) in oil and gas PBOs, and the engineers in these organizations were considered as the research sample. Interviews and questionnaires were used to gather information: interviews with experts were used to identify factors, and questionnaires were utilized to identify their importance and prioritization. 72 factors were identified and categorized into 9 groups at the organizational and HR-initiative levels. Results: The results of the research indicate the priority of each group of factors according to the proposed model in the view of KWs in the oil, gas, and petrochemical industries. On this basis, the following factors have the highest effect ratio from the respondents’ point of view: 1. knowledge management, 2. performance appraisal system, 3. communication, 4. training and development, 5. job design and analysis, 6. employment policies, 7. career planning, 8. project/organizational factors, 9. salary and rewards. Additionally, within each group, the priority of effective sub-factors has been identified as a result of the research. The results support the definitions of KWs and the influence of the factors examined and specified by similar studies in the retention and development of KWs; the high importance of knowledge management and the low rank of salary and rewards can be mentioned as examples in this regard. Despite the priority ranking of each group of factors, the uniqueness of the result lies in the identification of effective factors in the specific industry (oil and gas) and type of organization (PBO). Conclusion/Application: The findings of the present study can be used to devise plans for retaining and developing KWs in PBOs, especially in the oil and gas industry.
Keywords: project–based organizations, knowledge workers, HR management, turnover, retaining and developing employees
Procedia PDF Downloads 291
2178 Development of Cobalt Doped Alumina Hybrids for Adsorption of Textile Effluents
Authors: Uzaira Rafique, Kousar Parveen
Abstract:
The discharge volume and composition of textile effluents are of scientific concern due to the hazards and biotoxicity of azo dyes. Azo dyes are non-biodegradable due to their complex molecular structure and recalcitrant nature. Serious attempts have been made to synthesize and develop new materials to combat these environmental problems. The present study is designed for the removal of a range of azo dyes (methyl orange, Congo red, and basic fuchsine) from synthetic aqueous solutions and real textile effluents. For this purpose, metal (cobalt) doped alumina hybrids are synthesized and applied as adsorbents in batch experiments. Two different aluminium precursors (aluminium nitrate and spent aluminium foil) and glucose are mixed following the sol-gel method to obtain the hybrids. The synthesized materials are characterized for surface and bulk properties using FTIR, SEM-EDX, and XRD techniques. The FTIR characterization revealed that –OH (3487-3504 cm-1), C-H (2935-2985 cm-1), Al-O (~800 cm-1), Al-O-C (~1380 cm-1), and Al-O-Al (659-669 cm-1) groups participate in the binding of dyes onto the surface of the hybrids. Amorphous-shaped particles and an elemental composition of carbon (23%-44%), aluminium (29%-395%), and oxygen (11%-20%) are demonstrated in the SEM-EDX micrographs. Time-dependent batch experiments under identical experimental parameters showed maximum removal of 74% for Congo red, 68% for methyl orange, and 85% for basic fuchsine onto the surface of the cobalt doped alumina hybrids, probably through an ion-exchange mechanism. The experimental data, when treated with adsorption models, are found to be in good agreement with pseudo-second-order kinetics and the Freundlich isotherm for the adsorption process. The present study concludes the successful synthesis of novel and efficient cobalt doped alumina hybrids, providing an environmentally friendly and economical alternative to commercial adsorbents for the treatment of industrial effluents.
Keywords: alumina hybrid, adsorption, dopant, isotherm, kinetic
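A sketch of fitting the two models named above, the pseudo-second-order kinetic model and the Freundlich isotherm, to adsorption data with scipy; the data points and initial guesses are illustrative only, not measurements from the study.

```python
import numpy as np
from scipy.optimize import curve_fit

def pseudo_second_order(t, qe, k2):
    """Integrated pseudo-second-order model: qt = (k2 * qe^2 * t) / (1 + k2 * qe * t)."""
    return (k2 * qe**2 * t) / (1 + k2 * qe * t)

def freundlich(ce, kf, n):
    """Freundlich isotherm: qe = Kf * Ce^(1/n)."""
    return kf * ce**(1.0 / n)

t = np.array([5, 10, 20, 40, 60, 90], dtype=float)         # contact time (min), invented
qt = np.array([12, 19, 26, 31, 33, 34], dtype=float)        # dye adsorbed (mg/g), invented
(qe_fit, k2_fit), _ = curve_fit(pseudo_second_order, t, qt, p0=[35, 0.01])
print(f"pseudo-second-order: qe = {qe_fit:.1f} mg/g, k2 = {k2_fit:.4f} g/mg/min")

ce = np.array([2, 5, 10, 20, 40], dtype=float)               # equilibrium conc. (mg/L), invented
qe = np.array([8, 14, 20, 28, 39], dtype=float)              # adsorbed at equilibrium (mg/g)
(kf_fit, n_fit), _ = curve_fit(freundlich, ce, qe, p0=[5, 2])
print(f"Freundlich: Kf = {kf_fit:.2f}, n = {n_fit:.2f}")
```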
Procedia PDF Downloads 193
2177 Ectoine: A Compatible Solute in Radio-Halophilic Stenotrophomonas sp. WMA-LM19 Strain to Prevent Ultraviolet-Induced Protein Damage
Authors: Wasim Sajjad, Manzoor Ahmad, Sundas Qadir, Muhammad Rafiq, Fariha Hasan, Richard Tehan, Kerry L. McPhail, Aamer Ali Shah
Abstract:
Aim: This study aims to investigate the possible radiation-protective role of a compatible solute in the tolerance of a radio-halophilic bacterium against stresses like desiccation and exposure to ionizing radiation. Methods and Results: Nine different radio-resistant bacteria were isolated from desert soil, and strain WMA-LM19 was chosen for detailed studies on the basis of its high tolerance for ultraviolet radiation among all these isolates. 16S rRNA gene sequencing indicated that the bacterium was closely related to Stenotrophomonas sp. (KT008383). A bacterial milking strategy was applied for the extraction of intracellular compatible solutes in 70% (v/v) ethanol, which were purified by high-performance liquid chromatography (HPLC). The compound was characterized as ectoine by 1H and 13C nuclear magnetic resonance (NMR) and mass spectrometry (MS). Ectoine demonstrated more efficient preventive activity (54.80%) toward erythrocyte membranes and also inhibited oxidative damage to proteins and lipids in comparison to the standard, ascorbic acid. Furthermore, a high level of ectoine-mediated protection of bovine serum albumin against ionizing radiation (1500-2000 Jm-2) was observed, as indicated by sodium dodecyl sulfate polyacrylamide gel electrophoresis (SDS-PAGE) analysis. Conclusion: The results indicated that ectoine can be used as a potential mitigator and radio-protective agent to overcome radiation- and salinity-mediated oxidative damage in extreme environments. Significance and Impact of the Study: This study shows that ectoine from radio-halophiles can be used as a potential ingredient in topical sunscreen creams. The investigation of ectoine as a UV protectant also changes the perspective that radiation resistance is specific only to molecular adaptation.
Keywords: ectoine, anti-oxidant, stenotrophomonas sp., ultraviolet radiation
Procedia PDF Downloads 2092176 Chassis Level Control Using Proportional Integrated Derivative Control, Fuzzy Logic and Deep Learning
Authors: Atakan Aral Ormancı, Tuğçe Arslantaş, Murat Özcü
Abstract:
This study presents the design and implementation of an experimental chassis-level system for various control applications. Specifically, the height level of the chassis is controlled using proportional-integral-derivative (PID), fuzzy logic, and deep learning control methods. Real-time data obtained from height and pressure sensors installed in a 6x2 truck chassis, in combination with pulse-width modulation signal values, are utilized during the tests. A prototype pneumatic system of a 6x2 truck is added to the setup, which enables the Smart Pneumatic Actuators to function as if they were in a real-world setting. To obtain real-time signal data from the height sensors, an Arduino Nano is utilized, while a Raspberry Pi processes the data using MATLAB/Simulink and provides the correct output signals to control the Smart Pneumatic Actuator in the truck chassis. The objective of this research is to optimize the time it takes for the chassis to level down and up under various loads. To achieve this, PID control, fuzzy logic control, and deep learning techniques are applied to the system. The results show that the deep learning method is superior in optimizing time for a non-linear system. Fuzzy logic control with a triangular membership function as the rule base achieves better outcomes than PID control. Traditional PID control improves the time it takes to level the chassis down and up compared to an uncontrolled system. The findings highlight the superiority of deep learning techniques in optimizing the time for a non-linear system, and the potential of fuzzy logic control. The proposed approach and the experimental results provide a valuable contribution to the field of control, automation, and systems engineering.Keywords: automotive, chassis level control, control systems, pneumatic system control
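As a rough illustration of the first of these control methods, the sketch below implements a discrete PID loop that maps a chassis-height error to a signed actuator command. The gains, sampling period, target height, and the toy plant model are assumptions made for illustration only, not values or models from the study.

```python
# Minimal discrete PID sketch for chassis height levelling (illustrative only).
# Gains, sampling period, target height, and the toy plant are assumptions,
# not values from the study.

class PID:
    def __init__(self, kp, ki, kd, dt, out_min=-100.0, out_max=100.0):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.out_min, self.out_max = out_min, out_max
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measurement):
        error = setpoint - measurement
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        command = self.kp * error + self.ki * self.integral + self.kd * derivative
        return max(self.out_min, min(self.out_max, command))  # clamp to actuator range


if __name__ == "__main__":
    dt = 0.05                          # control period in seconds (assumed)
    pid = PID(kp=8.0, ki=0.3, kd=0.4, dt=dt)
    height = 120.0                     # measured chassis height in mm (assumed)
    target = 150.0                     # desired chassis height in mm (assumed)
    for _ in range(400):
        command = pid.update(target, height)   # signed command mapped to valve PWM
        height += 0.05 * command * dt          # toy linear plant response
    print(f"final height: {height:.1f} mm")
```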
Procedia PDF Downloads 812175 The Changes of Chemical Composition of Rice Straw Treated by a Biodecomposer Developed from Rumen Bacterial of Buffalo
Authors: A. Natsir, M. Nadir, S. Syahrir, A. Mujnisa
Abstract:
In tropical countries such as Indonesia, rice straw plays an important role in fulfilling the need for ruminant feed, especially during the dry season, when the availability of forage is very limited. However, the main problem with using rice straw as a feedstuff is its low digestibility, due to the linkages between lignin and cellulose or hemicellulose and the imbalance of its mineral content. One alternative to solve this problem is the application of a biodecomposer (BS) derived from rumen bacteria of ruminants. This study was designed to assess the effects of BS application on the changes in the chemical composition of rice straw. Four adult local buffaloes raised under typical feeding conditions were used as the source of inoculum for BS development. The animals were fed for a month with a diet consisting of rice straw and elephant grass before rumen fluid samples were taken. Samples of rumen fluid were inoculated into carboxymethyl cellulose (CMC) medium under anaerobic conditions for 48 hours at 37°C. The mixture of CMC medium and microbes was ready to be used as a biodecomposer following incubation under anaerobic conditions for 7 days at 45°C. The effectiveness of the BS was then assessed by applying it to the straw according to a completely randomized design consisting of four treatments and three replications. One hundred g of ground coarse rice straw was used as the substrate. The BS was applied to the rice straw substrate with the following composition: rice straw without BS (P0), rice straw + 5% BS (P1), rice straw + 10% BS (P2), and rice straw + 15% BS (P3). The mixture of rice straw and BS was then fermented under anaerobic conditions for four weeks. Following the fermentation, the chemical composition of the rice straw was evaluated. The results indicated that the crude protein content of the rice straw significantly increased (P < 0.05) as the level of BS increased. On the other hand, the concentration of crude fiber in the rice straw significantly decreased (P < 0.05) as the level of BS increased. Other nutrients, such as minerals, did not change (P > 0.05) due to the treatments. In conclusion, application of a BS developed from rumen bacteria of buffalo has promising prospects as a biological agent to improve the quality of rice straw as feed for ruminants.Keywords: biodecomposer, local buffalo, rumen microbial, chemical composition
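As an illustration of how such a completely randomized design (four treatments, three replications) can be tested at the 5% level, the sketch below runs a one-way ANOVA; the crude-protein values are hypothetical placeholders, not data from the study.

```python
# Illustrative analysis of a completely randomized design (4 treatments x 3 replications).
# The crude-protein values below are hypothetical placeholders, not data from the study.
from scipy import stats

crude_protein = {                # % crude protein per replication (hypothetical)
    "P0 (no BS)":  [4.1, 4.3, 4.2],
    "P1 (5% BS)":  [4.8, 5.0, 4.9],
    "P2 (10% BS)": [5.6, 5.4, 5.7],
    "P3 (15% BS)": [6.2, 6.0, 6.3],
}

f_stat, p_value = stats.f_oneway(*crude_protein.values())
print(f"one-way ANOVA: F = {f_stat:.2f}, p = {p_value:.4f}")
if p_value < 0.05:
    print("treatment effect significant at the 5% level (P < 0.05)")
```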
Procedia PDF Downloads 2082174 The High Quality Colored Wind Chimes by Anodization on Aluminum Alloy
Authors: Chia-Chih Wei, Yun-Qi Li, Ssu-Ying Chen, Hsuan-Jung Chen, Hsi-Wen Yang, Chih-Yuan Chen, Chien-Chon Chen
Abstract:
In this paper, a high-quality anodization technique is used to make colored wind chimes with a nano-tube-structured anodic film, by controlling the length-to-diameter ratio of an aluminum rod and the oxide film structure on the surface of the aluminum rod through the anodizing method. The research experiment used hard anodization to grow an anodic film of controllable thickness on the aluminum alloy surface. The hard anodization film has high hardness, high insulation, high temperature resistance, good corrosion resistance, color, and mass-production properties, and can be further applied to transportation, electronic products, biomedical fields, or energy industry applications. This study also provides in-depth research and detailed discussion of the processes involved in hard anodizing of aluminum alloy surfaces, including pre-anodization, anodization, and post-anodization. The experimental parameters of anodization include using a mixed acid solution of sulfuric acid and oxalic acid as the anodization electrolyte, and controlling the temperature, time, current density, and final voltage to obtain the anodic film. In the experimental results, the properties of the anodic film, including thickness, hardness, insulation, corrosion characteristics, and microstructure, were measured, and the hard anodization efficiency was calculated. In this way, different transmission speeds of sound in the aluminum rod are obtained, and different tones can be produced by the aluminum rod. Another feature of the present invention is the use of anodic-film dyeing, laser engraving patterning, and electrophoresis methods to make colored aluminum wind chimes.Keywords: anodization, colored, high quality, wind chime, nano-tube
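As a rough, generic illustration of how current density and time relate to anodic film thickness, the sketch below applies Faraday's law for anodic alumina growth; the current density, duration, current efficiency, and film density are assumed illustrative values, not parameters reported in this study.

```python
# Rough estimate of anodic alumina film thickness from charge passed (Faraday's law).
# All numbers below (current density, time, efficiency, film density) are assumed
# illustrative values, not parameters reported in the study.

F = 96485.0          # Faraday constant, C/mol
M_AL2O3 = 101.96     # molar mass of Al2O3, g/mol
Z = 6                # electrons transferred per formula unit of Al2O3
RHO = 3.0            # density of anodic alumina, g/cm^3 (typical assumed value)

def film_thickness_um(current_density_a_cm2, time_s, efficiency=0.6):
    """Thickness in micrometres of oxide grown at a given current density and time."""
    charge = current_density_a_cm2 * time_s             # C per cm^2
    mass = efficiency * charge * M_AL2O3 / (Z * F)      # g of Al2O3 per cm^2
    thickness_cm = mass / RHO
    return thickness_cm * 1e4                           # cm -> um

# Example: 0.05 A/cm^2 for 30 minutes at an assumed 60% current efficiency
print(f"estimated film thickness: {film_thickness_um(0.05, 30 * 60):.1f} um")
```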
Procedia PDF Downloads 2452173 Analyzing the Performance of Different Cost-Based Methods for the Corrective Maintenance of a System in Thermal Power Plants
Authors: Demet Ozgur-Unluakin, Busenur Turkali, S. Caglar Aksezer
Abstract:
Since the age of industrialization, maintenance has always been a crucial element for all kinds of factories and plants. With today’s increasingly developing technology, the system structure of such facilities has become more complicated, and even a small operational disruption may result in huge losses in profits for the companies. In order to reduce these costs, effective maintenance planning is crucial, but at the same time, it is a difficult task because of the complexity of systems. The most important aspect of correct maintenance planning is to understand the structure of the system, not to ignore the dependencies among the components, and, as a result, to model the system correctly. In this way, it becomes clearer which component improves the system most when it is maintained. Undoubtedly, proactive maintenance at a scheduled time reduces costs because scheduled maintenance prevents high losses in profits. But the necessity of corrective maintenance, which directly affects the state of the system and provides direct intervention when the system fails, should not be ignored. When a fault occurs in the system, if the problem is not solved immediately and the next proactive maintenance time is awaited instead, this may result in increased costs. This study proposes various maintenance methods with different efficiency measures under a corrective maintenance strategy on a subsystem of a thermal power plant. To model the dependencies between the components, a dynamic Bayesian network approach is employed. The proposed maintenance methods aim to minimize the total maintenance cost over a planning horizon, as well as to find the most appropriate component to act on, i.e., the one that improves system reliability the most. The performances of the methods are compared under the corrective maintenance strategy. Furthermore, sensitivity analysis is also applied under different cost values. Results show that all fault-effect methods perform better than the replacement-effect methods, and this conclusion is also valid under different downtime cost values.Keywords: dynamic Bayesian networks, maintenance, multi-component systems, reliability
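As a much-simplified illustration (not the dynamic Bayesian network model of the study), the sketch below compares two hypothetical component-selection rules under a corrective maintenance strategy by Monte Carlo simulation of total cost over a planning horizon; all failure probabilities and costs are made-up placeholders.

```python
# Simplified Monte Carlo sketch of corrective maintenance over a planning horizon.
# This is NOT the dynamic Bayesian network model of the study; it only illustrates
# comparing component-selection rules by average total cost. Failure probabilities
# and costs are hypothetical placeholders.
import random

FAIL_PROB = {"pump": 0.05, "valve": 0.08, "fan": 0.03}   # per-period failure probability
REPAIR_COST = {"pump": 500, "valve": 200, "fan": 300}
DOWNTIME_COST = 1000                                       # cost per period with a failed component
HORIZON = 52                                               # planning periods

def simulate(select_rule, runs=2000, seed=42):
    rng = random.Random(seed)
    total = 0.0
    for _ in range(runs):
        failed, cost = set(), 0.0
        for _ in range(HORIZON):
            for comp, p in FAIL_PROB.items():
                if comp not in failed and rng.random() < p:
                    failed.add(comp)
            if failed:                     # corrective action: repair one component per period
                cost += DOWNTIME_COST
                comp = select_rule(failed)
                cost += REPAIR_COST[comp]
                failed.remove(comp)
        total += cost
    return total / runs

cheapest_first = lambda failed: min(failed, key=lambda c: REPAIR_COST[c])
most_failure_prone_first = lambda failed: max(failed, key=lambda c: FAIL_PROB[c])

print("cheapest-first:          ", round(simulate(cheapest_first)))
print("most-failure-prone-first:", round(simulate(most_failure_prone_first)))
```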
Procedia PDF Downloads 128