Search results for: process modeling advancements
12141 Material Handling Equipment Selection Using Fuzzy AHP Approach
Authors: Priyanka Verma, Vijaya Dixit, Rishabh Bajpai
Abstract:
This research paper aims at selecting the most appropriate material handling equipment among the given choices so that the level of automation in material handling can be enhanced. The work is a practical case study of material handling systems in a consumer electronic appliances manufacturing organization. The choices of material handling equipment among which the decision has to be made are Automated Guided Vehicles (AGVs), Autonomous Mobile Robots (AMRs), Overhead Conveyors (OCs), and Battery Operated Trucks/Vehicles (BOTs). The organization needs to attain a certain level of automation in order to reduce human intervention, and this requirement can be met by the material handling equipment listed above. The main motive for selecting this equipment for study was the corporate financial strategy of investment and the return obtained on that investment within a stipulated time frame. Since low-cost automation of material handling had to be achieved, the equipment was restricted to units costing less than 20 lakh rupees (INR) each, with a payback period of less than five years. The fuzzy analytic hierarchy process (FAHP) is applied here to select equipment: the four choices are evaluated against four major criteria and 13 sub-criteria and are prioritized on the basis of the weights obtained. The FAHP used here makes use of triangular fuzzy numbers (TFNs). The inability of traditional AHP to deal with the subjectiveness and imprecision of the pairwise comparison process is improved upon in FAHP. The range of values for rating all decision-making parameters is kept between 0 and 1, based on expert opinions captured on the shop floor; these experts were familiar with the operating environment and shop floor activity control.
Instead of generating an exact value, the FAHP generates ranges of values to accommodate the uncertainty in the decision-making process. The four major criteria selected for evaluating the available material handling equipment are materials, technical capabilities, cost, and other features. The thirteen sub-criteria listed under these four major criteria are weighing capacity, load per hour, material compatibility, capital cost, operating cost, maintenance cost, speed, distance moved, space required, frequency of trips, control required, safety, and reliability issues. The key finding is that, among the four major criteria, cost emerges as the most important and is one of the key aspects on which material handling equipment selection is based. On further evaluating the available equipment against each sub-criterion, it is found that the AGV scores the highest weight on most sub-criteria. The complete analysis shows that the AGV is the best material handling equipment against all decision criteria selected in the FAHP, and it is therefore beneficial for the organization to automate material handling in the facility using AGVs.
Keywords: fuzzy analytic hierarchy process (FAHP), material handling equipment, subjectiveness, triangular fuzzy number (TFN)
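As a minimal sketch of the FAHP weighting step described above, the snippet below derives crisp criterion weights from a pairwise-comparison matrix of triangular fuzzy numbers using Buckley's geometric-mean method (one common FAHP variant; the abstract does not state which variant was used). The three criteria and the judgment values are hypothetical illustrations, not the study's actual expert data.

```python
# Sketch of FAHP criterion weighting with triangular fuzzy numbers (TFNs),
# via Buckley's geometric-mean method. Judgments below are hypothetical.

def tfn_geomean(tfns):
    """Element-wise geometric mean of a list of TFNs (l, m, u)."""
    n = len(tfns)
    prod = [1.0, 1.0, 1.0]
    for l, m, u in tfns:
        prod[0] *= l; prod[1] *= m; prod[2] *= u
    return tuple(p ** (1.0 / n) for p in prod)

def fahp_weights(matrix):
    """matrix[i][j] is the TFN comparing criterion i to criterion j."""
    geo = [tfn_geomean(row) for row in matrix]
    total_l = sum(g[0] for g in geo)
    total_m = sum(g[1] for g in geo)
    total_u = sum(g[2] for g in geo)
    # Fuzzy weight = row geomean / sum of geomeans (bounds invert on division)
    fuzzy_w = [(g[0] / total_u, g[1] / total_m, g[2] / total_l) for g in geo]
    # Defuzzify by the centroid (mean of l, m, u), then normalise
    crisp = [sum(w) / 3.0 for w in fuzzy_w]
    s = sum(crisp)
    return [c / s for c in crisp]

# Hypothetical 3-criterion comparison (cost vs. technical vs. other features):
ONE = (1, 1, 1)
m = [
    [ONE,             (2, 3, 4),      (4, 5, 6)],
    [(1/4, 1/3, 1/2), ONE,            (1, 2, 3)],
    [(1/6, 1/5, 1/4), (1/3, 1/2, 1),  ONE],
]
w = fahp_weights(m)
print([round(x, 3) for x in w])  # "cost" (first criterion) gets the largest weight
```

With the judgments above, cost dominates, mirroring the abstract's finding that cost emerged as the most important criterion.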
Procedia PDF Downloads 436
12140 Micropollutant Carbamazepine: Its Occurrences, Toxicological Effects, and Possible Degradation Methods (Review)
Authors: Azad Khalid, Sifa Dogan
Abstract:
Because of its persistence in conventional treatment plants and its broad prevalence in water bodies, the pharmaceutical chemical carbamazepine (CBZ) has been suggested as an anthropogenic marker for evaluating water quality. This study provides a thorough examination of the origins and occurrences of CBZ in water bodies, as well as the drug's toxicological effects and the relevant regulations. Given CBZ's well-documented negative effects on the human body when used medicinally, careful monitoring in water is advised. CBZ residues in drinking water may reach embryos and newborns via intrauterine exposure or breast-feeding, causing congenital abnormalities and/or neurodevelopmental issues over time. An in-depth technical review of conventional and advanced treatment technologies shows the insufficiency of stand-alone solutions. Nanofiltration and reverse osmosis membranes are more successful at removing CBZ than conventional activated sludge and membrane bioreactor techniques. Recent research has shown that the severe chemical cleaning essential to prevent membrane fouling may lower long-term removal efficiency. Furthermore, despite the efficacy of activated carbon adsorption and advanced oxidation processes, issues such as chemical cost and activated carbon regeneration must be carefully examined. The constraints of individual technologies point to the benefits of combined and hybrid systems, namely the heterogeneous advanced oxidation process.
Keywords: carbamazepine, occurrence, toxicity, conventional treatment, advanced oxidation processes (AOPs)
Procedia PDF Downloads 100
12139 Insights of Interaction Studies between HSP-60, HSP-70 Proteins and HSF-1 in Bubalus bubalis
Authors: Ravinder Singh, C Rajesh, Saroj Badhan, Shailendra Mishra, Ranjit Singh Kataria
Abstract:
Heat shock proteins 60 and 70 are crucial chaperones that guide the appropriate folding of denatured proteins under heat stress conditions, assisting in the correct folding of a multitude of denatured proteins. The heat shock factors are a family of transcription factors that control the regulation of gene expression of proteins involved in the folding of damaged or improperly folded proteins during stress conditions. Under normal conditions, heat shock proteins bind to HSF-1, acting as its repressor and helping maintain HSF-1 in its inactive, monomeric conformation. The experimental structures of these proteins in Bubalus bubalis are not known to date; therefore, a computational approach was explored to determine their three-dimensional structures. In this study, an extensive in silico analysis has been performed, from sequence comparison among species to comparative modeling of the Bubalus bubalis HSP60, HSP70, and HSF-1 proteins. The stereochemical properties of the models were assessed using several bioinformatics scrutiny tools to ensure model accuracy. A docking approach was then used to study the interactions between the heat shock proteins and HSF-1.
Keywords: Bubalus bubalis, comparative modelling, docking, heat shock protein
Procedia PDF Downloads 324
12138 Rising Velocity of Non-Newtonian Liquids in Capillary Tubes
Authors: Reza Sabbagh, Linda Hasanovich, Aleksey Baldygin, David S. Nobes, Prashant R. Waghmare
Abstract:
The capillary filling process is important to study for numerous applications, such as the underfilling of material in electronic packaging or the seepage of liquid hydrocarbons through porous structures. The approximation of the fluid as Newtonian, i.e., a linear relationship between shear stress and deformation rate, cannot be justified in cases where the extent of the liquid's non-Newtonian behavior governs surface-driven transport, i.e., capillary action. In this study, the capillary action of a non-Newtonian fluid is not only analyzed, but a modified, generalized theoretical analysis for capillary transport is also proposed. The three commonly observed regimes are identified: surface-force-dominant (travelling air-liquid interface), developing flow (viscous force dominant), and developed flow (interfacial, inertial, and viscous forces comparable). The velocity field in each regime is quantified for Newtonian and non-Newtonian fluids in a square, vertically oriented channel. The theoretical understanding of the capillary imbibition process, particularly for Newtonian fluids, relies on the simplifying assumption of a fully developed velocity profile; this assumption is revisited here to develop a modified theory for the capillary transport of non-Newtonian fluids. Furthermore, the development of the velocity profile from the entrance regime to the developed regime is investigated theoretically and experimentally for different power-law fluids.
Keywords: capillary, non-Newtonian flow, shadowgraphy, rising velocity
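For orientation, the Newtonian baseline that the abstract's generalized theory modifies is Washburn's solution for viscous-regime imbibition, h(t) = sqrt(gamma R cos(theta) t / (2 mu)). The sketch below evaluates it with illustrative water-like parameters; the paper's power-law (non-Newtonian) extension replaces the linear viscous term and is not reproduced here.

```python
# Newtonian baseline for capillary imbibition: Washburn's sqrt(t) law.
# Parameters (surface tension gamma, tube radius R, contact angle theta,
# viscosity mu) are illustrative water-like values, not the study's data.
import math

def washburn_height(t, gamma=0.072, R=50e-6, theta=0.0, mu=1.0e-3):
    """Penetration depth (m) of a Newtonian liquid at time t (s), viscous regime."""
    return math.sqrt(gamma * R * math.cos(theta) * t / (2.0 * mu))

heights = [washburn_height(t) for t in (0.0, 0.1, 0.4, 1.6)]
# Depth grows as sqrt(t): quadrupling the elapsed time doubles the depth.
```

A power-law fluid alters the time exponent, which is precisely why the fully-developed-profile assumption has to be revisited in the non-Newtonian case.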
Procedia PDF Downloads 209
12137 The Influence of Superordinate Identity and Group Size on Group Decision Making through Discussion
Authors: Lin Peng, Jin Zhang, Yuanyuan Miao, Quanquan Zheng
Abstract:
Group discussion and group decision making have long been topics of research interest. Traditional research on group decision making typically focuses on strategies or functional models for combining members' preferences to reach an optimal consensus. In this research, we explore the natural process of group decision making through discussion and examine two relevant, influential factors: a common superordinate identity shared by the group, and group size. We manipulated the social identity of the groups into either a shared superordinate identity or distinct subgroup identities, and manipulated size to form either big (6-8 person) or small (3-person) groups. Using experimental methods, we found that members of a superordinate-identity group tend to modify more of their own opinions through discussion than those identifying only with their subgroups. Members of superordinate-identity groups also formed stronger identification with the group decision, i.e., the result of the group discussion, than their subgroup peers. We also found greater member modification in bigger groups than in smaller ones. Evaluations of decisions before and after discussion, as well as of the group decisions themselves, are strongly linked to group identity: members of superordinate groups feel more confident about, and satisfied with, both the results and the decision-making process. Members' opinions are more similar and homogeneous in smaller groups than in bigger groups. This research has many implications for further research and applied behaviors in organizations.
Keywords: group decision making, group size, identification, modification, superordinate identity
Procedia PDF Downloads 309
12136 Role of Vitamin D in Osseointegration of Dental Implant
Authors: Pouya Khaleghi
Abstract:
Dental implants are a successful treatment modality for restoring both function and aesthetics. Implant treatment has predictable results in replacing lost teeth and a high success rate even in the long term. The most important factor responsible for a positive course of implant treatment is the process of osseointegration between the implant structure and the host's bone tissue. In recent years, many studies have focused on surgical, prosthetic, and implant-related factors. However, implant failure still occurs despite the improvements that have increased the survival rate of dental implants, which suggests a possible role for host-related risk factors. Vitamin D is a fat-soluble vitamin regulating calcium and phosphorus metabolism in tissues. Its role in bone healing has been under investigation for several years: vitamin D deficiency has been associated with impaired and delayed callus formation and fracture healing, although the underlying mechanism has not been clarified. It is therefore important to study the connection formed between bone tissue and the surface of a titanium implant and to look for correlations between the 25-hydroxycholecalciferol concentration in blood serum and the course of osseointegration. Because bone remodeling processes are very dynamic during the period of active osseointegration, it is necessary to obtain the correct concentration of vitamin D3 metabolites in blood serum. In conclusion, a correct level of 25-hydroxycholecalciferol on the day of surgery, together with treatment of vitamin D deficiency, has a significant influence on the increase in bone level at the implant site during osseointegration as assessed radiologically.
Keywords: implant, osseointegration, vitamin D, dental
Procedia PDF Downloads 178
12135 User Acceptance Criteria for Digital Libraries
Authors: Yu-Ming Wang, Jia-Hong Jian
Abstract:
The Internet and digital publication technologies have had dramatic impacts on how people collect, organize, disseminate, access, store, and use information. More and more governments, schools, and organizations have spent huge funds to develop digital libraries. A digital library can be regarded as a web extension of a traditional physical library: people can search diverse publications, locate knowledge resources, and borrow or buy publications through it, and students or employees can use it to gain knowledge and complete their reports. Since considerable funds and energy have been invested in implementing digital libraries, it is important to understand the evaluative criteria from the users' viewpoint in order to enhance user acceptance. This study develops a list of user acceptance criteria for digital libraries. An initial criteria list was developed from previously validated instruments related to digital libraries, and data were collected on user experiences of digital libraries. Exploratory and confirmatory factor analyses were adopted to purify the criteria list, and its reliability and validity were tested. After validating the list, a user survey was conducted to collect the comparative importance of the criteria, and the analytic hierarchy process (AHP) method was utilized to derive the importance of each criterion. The results of this study contribute to a better understanding of the criteria, and their relative importance, that users apply when evaluating digital libraries.
Keywords: digital library, user acceptance, analytic hierarchy process, factor analysis
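The AHP step mentioned above reduces to extracting the principal eigenvector of a pairwise-comparison matrix. A minimal sketch via power iteration follows; the three criteria and the judgment values are hypothetical stand-ins, not the study's survey data.

```python
# Sketch of deriving AHP priority weights from a pairwise-comparison matrix
# using the principal-eigenvector (power-iteration) method.

def ahp_weights(A, iters=100):
    """Return normalised priority weights for comparison matrix A."""
    n = len(A)
    w = [1.0 / n] * n
    for _ in range(iters):
        v = [sum(A[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(v)
        w = [x / s for x in v]  # renormalise each iteration
    return w

# Hypothetical comparisons among three acceptance criteria
# (content quality vs. ease of use vs. system reliability):
A = [
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
]
w = ahp_weights(A)
print([round(x, 3) for x in w])  # first criterion carries the largest weight
```

A consistency check (the consistency ratio) would normally accompany this step before the weights are accepted.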
Procedia PDF Downloads 257
12134 The Latency-Amplitude Binomial of Waves Resulting from the Application of Evoked Potentials for the Diagnosis of Dyscalculia
Authors: Maria Isabel Garcia-Planas, Maria Victoria Garcia-Camba
Abstract:
Recent advances in cognitive neuroscience have allowed a step forward in understanding the processes involved in learning, from the point of view of the acquisition of new information or the modification of existing mental content. The evoked potentials technique reveals how basic brain processes interact to achieve adequate and flexible behaviours. The objective of this work is to study, using evoked potentials, whether it is possible to distinguish if a patient suffers from a specific type of learning disorder, in order to decide which therapies to follow. The methodology used is the analysis of the dynamics of different areas of the brain during a cognitive activity, looking for relationships between the areas analyzed in order to better understand the functioning of neural networks. The latest advances in neuroscience have also revealed distinct brain activity in the learning process that can be highlighted through non-invasive, innocuous, low-cost, and easily accessible techniques such as evoked potentials, which can help detect possible neurodevelopmental difficulties early for subsequent assessment and treatment. From the study of the amplitudes and latencies of evoked potentials, it is possible to detect brain alterations in the learning process, specifically in dyscalculia, and so to define specific corrective measures in personalized psychopedagogical plans that allow an optimal, integral development of the people affected.
Keywords: dyscalculia, neurodevelopment, evoked potentials, learning disabilities, neural networks
Procedia PDF Downloads 147
12133 Noise Reduction in Web Data: A Learning Approach Based on Dynamic User Interests
Authors: Julius Onyancha, Valentina Plekhanova
Abstract:
One of the significant issues facing web users is the amount of noise in web data, which hinders the process of finding information that is useful in relation to their dynamic interests. Current research considers noise to be any data that does not form part of the main web page, and proposes noise reduction tools that mainly focus on eliminating noise related to the content and layout of web data. This paper argues that not all data forming part of the main web page is of interest to a given user, and that not all noise data is actually noise to that user. Therefore, learning the noise web data allocated to user requests ensures not only a reduction of the noisiness level in a web user profile but also a decrease in the loss of useful information, hence improving the quality of the profile. A Noise Web Data Learning (NWDL) tool/algorithm capable of learning noise web data in a web user profile is proposed, considering the elimination of noise data in relation to dynamic user interest. To validate the performance of the proposed work, an experimental design is presented, and the results obtained are compared with current algorithms applied in the noise web data reduction process. The experimental results show that the proposed work takes the dynamic change of user interest into account before eliminating noise data. The proposed work contributes towards improving the quality of a web user profile by reducing the amount of useful information eliminated as noise.
Keywords: web log data, web user profile, user interest, noise web data learning, machine learning
Procedia PDF Downloads 268
12132 Role of NaCl and Temperature in Glycerol Mediated Rapid Growth of Silver Nanostructures
Authors: L. R. Shobin, S. Manivannan
Abstract:
One-dimensional silver nanowires and nanoparticles have gained increasing interest for developing transparent conducting films, catalysis, and biological and chemical sensors. Silver nanostructures can be synthesized in the polyol process by varying reaction conditions such as the precursor concentration, the molar ratio of the surfactant, and the injection speed of silver ions; however, the reaction takes more than 2 hours to form silver nanowires. The introduction of an etchant into the medium promotes the growth of silver nanowires from silver nanoparticles along the [100] direction. Rapid growth of silver nanowires is accomplished using Cl- ions from NaCl, with polyvinyl pyrrolidone (PVP) as the surfactant. The role of the Cl- ion in the growth of nanostructured silver was investigated. Silver nanoparticles (<100 nm) were harvested from the glycerol medium in the absence of Cl- ions. A trace amount of Cl- ions (2.5 mM NaCl) produced edge-joined nanowires of length up to 2 μm and width ranging from 40 to 65 nm. Formation and rapid growth (within 25 minutes) of long, uniform silver nanowires (up to 5 μm) with good yield were realized in the presence of 5 mM NaCl at 200 °C. The growth of the nanostructures was monitored by UV-vis-NIR spectroscopy, and scanning and transmission electron microscopy reveal the morphology of the silver nano-harvests. The role of temperature in the reduction of silver ions and the growth mechanisms for nanoparticles, edge-joined nanowires, and straight nanowires will be discussed.
Keywords: silver nanowires, glycerol mediated polyol process, scanning electron microscopy, UV-vis-NIR spectroscopy, transmission electron microscopy
Procedia PDF Downloads 305
12131 Compartmental Model Approach for Dosimetric Calculations of ¹⁷⁷Lu-DOTATOC in Adenocarcinoma Breast Cancer Based on Animal Data
Authors: M. S. Mousavi-Daramoroudi, H. Yousefnia, S. Zolghadri, F. Abbasi-Davani
Abstract:
Dosimetry is an indispensable and precious factor in patient treatment planning, used to minimize the absorbed dose in vital tissues. In this study, in accordance with the suitable characteristics of DOTATOC and ¹⁷⁷Lu, ¹⁷⁷Lu-DOTATOC was prepared under optimal conditions for the first time in Iran, and the radionuclidic and radiochemical purity of the solution was investigated using an HPGe spectrometer and the ITLC method, respectively. The biodistribution of the compound was assayed for the treatment of adenocarcinoma breast cancer in tumor-bearing BALB/c mice. The results demonstrate that ¹⁷⁷Lu-DOTATOC is a profitable choice for the therapy of these tumors. Because of the vital role of internal dosimetry before and during therapy, effort to improve the accuracy and speed of dosimetric calculations is necessary. For this reason, a new method was developed to calculate the absorbed dose by combining a compartmental model, animal dosimetry, and data extrapolated from animal to human, using the MIRD method. Since the compartmental model is grounded in experimental data, this approach may increase the accuracy of the dosimetric results.
Keywords: ¹⁷⁷Lu-DOTATOC, biodistribution modeling, compartmental model, internal dosimetry
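The MIRD-style calculation referred to above reduces, in the simplest single-compartment case, to integrating the activity-time curve to obtain the cumulated (time-integrated) activity, which is then multiplied by a tabulated S value to give the absorbed dose. The sketch below assumes mono-exponential effective clearance; the half-lives and initial activity are illustrative placeholders, not the study's fitted parameters.

```python
# Sketch of a MIRD-style cumulated-activity calculation for one compartment
# with mono-exponential effective clearance. Parameter values are illustrative.
import math

T_PHYS = 6.647 * 24.0   # 177Lu physical half-life, hours
T_BIO = 48.0            # hypothetical biological half-life in the tumour, hours
A0 = 10.0               # hypothetical initial activity in the compartment, MBq

lam_eff = math.log(2) / T_PHYS + math.log(2) / T_BIO   # effective decay constant, 1/h
a_cum = A0 / lam_eff    # cumulated activity, MBq*h: integral of A0*exp(-lam_eff*t)

# Numerical check of the same integral by the trapezoidal rule:
dt, t_end = 0.1, 2000.0
ts = [i * dt for i in range(int(t_end / dt) + 1)]
vals = [A0 * math.exp(-lam_eff * t) for t in ts]
a_num = dt * (sum(vals) - 0.5 * (vals[0] + vals[-1]))

# The absorbed dose would then follow as D = a_cum * S for a tabulated S value.
```

Multi-compartment biodistributions replace the single exponential with the solution of the compartmental ODE system, but the integration step is the same.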
Procedia PDF Downloads 224
12130 The Models of Character Development Bali Police to Improve Quality of Moral Members in Bali Police Headquarters
Authors: Agus Masrukhin
Abstract:
This research aims to find and analyze the model of character building in the Bali Police Headquarters, with a case study of Muslim members, for improving the quality of the morality of its members. The formation of the officers' patterns of thinking, behavior, mentality, and noble character can later be used as a solution to reduce the hedonistic tendencies that are a challenge in the era of globalization. The expected benefit of this study is a positive recommendation of a constructive character-building model for police officers in the Republic of Indonesia, especially the Bali Police; in the long term, the model discovered can be developed for the entire police force in Indonesia. The researchers mixed qualitative methods, based on narratives of the subjects' concrete experience gathered in field research, with quantitative methods applied to 92 respondents from the Bali regional police. The research used descriptive and SWOT analyses, the results of which were then presented in a focus group discussion (FGD). The results indicate that the variables of police leadership modeling and police office culture have a significant influence on the implementation of spiritual development.
Keywords: positive constructive, hedonistic, character models, morality
Procedia PDF Downloads 370
12129 Optimization of Multistage Extractor for the Butanol Separation from Aqueous Solution Using Ionic Liquids
Authors: Dharamashi Rabari, Anand Patel
Abstract:
n-Butanol can be regarded as a potential biofuel. Being resistant to corrosion and having a high calorific value, butanol is a very attractive energy source compared to ethanol. Bio-butanol can be produced by the fermentation process called ABE (acetone, butanol, ethanol), carried out mostly by the bacterium Clostridium acetobutylicum. The major drawback of the process is that butanol concentrations higher than 10 g/L retard the growth of the microbes, resulting in a low yield; this indicates the need for simultaneous separation of butanol from the fermentation broth. Two hydrophobic ionic liquids (ILs), 1-butyl-1-methylpiperidinium bis(trifluoromethylsulfonyl)imide [bmPIP][Tf₂N] and 1-hexyl-3-methylimidazolium bis(trifluoromethylsulfonyl)imide [hmim][Tf₂N], were chosen. The binary interaction parameters for both ternary systems, i.e., [bmPIP][Tf₂N] + water + n-butanol and [hmim][Tf₂N] + water + n-butanol, were taken from the literature, where they were generated with the NRTL model. Particle swarm optimization (PSO) with the isothermal sum rate (ISR) method was used to optimize the cost of the liquid-liquid extractor. For the [hmim][Tf₂N] + water + n-butanol system, PSO shows an 84% success rate, with the number of stages equal to eight and a solvent flow rate of 461 kmol/hr; for the [bmPIP][Tf₂N] + water + n-butanol system, the number of stages was three, with a 269.95 kmol/hr solvent flow rate. Moreover, both ILs were very efficient, as the loss of IL to the raffinate phase was negligible.
Keywords: particle swarm optimization, isothermal sum rate method, success rate, extraction
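A minimal PSO sketch of the kind used to tune the extractor (stage count, solvent flow) is shown below. The real objective would be the ISR column-costing model; here a simple quadratic stand-in cost surface is minimised, with its optimum placed at (3 stages, 270 kmol/hr) purely to echo the [bmPIP][Tf₂N] result quoted above. The function and bounds are illustrative assumptions.

```python
# Minimal particle swarm optimisation sketch over (stages, solvent flow).
# The cost surface is a hypothetical quadratic stand-in, not the ISR model.
import random

def pso(cost, bounds, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5):
    random.seed(1)  # deterministic demo run
    dim = len(bounds)
    pos = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_f = [cost(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_f[i])
    gbest, gbest_f = pbest[g][:], pbest_f[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            f = cost(pos[i])
            if f < pbest_f[i]:
                pbest[i], pbest_f[i] = pos[i][:], f
                if f < gbest_f:
                    gbest, gbest_f = pos[i][:], f
    return gbest, gbest_f

# Stand-in cost with its minimum at (stages, flow) = (3, 270):
cost = lambda x: (x[0] - 3.0) ** 2 + ((x[1] - 270.0) / 100.0) ** 2
best, f = pso(cost, [(1.0, 20.0), (50.0, 600.0)])
```

In practice the stage count would be rounded to an integer and the swarm re-evaluated, since a column cannot have a fractional number of stages.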
Procedia PDF Downloads 128
12128 Application of Powder Metallurgy Technologies for Gas Turbine Engine Wheel Production
Authors: Liubov Magerramova, Eugene Kratt, Pavel Presniakov
Abstract:
A detailed analysis has been performed of several schemes for gas turbine wheel production based on additive and powder technologies, including metal, ceramic, and stereolithography 3D printing. During the development and debugging of gas turbine engine components, different versions of these components must be manufactured and tested; the cooled blades of the turbine are among these components. They are usually produced by traditional casting methods, which require the long and costly design and manufacture of casting molds. Moreover, traditional manufacturing methods limit the design possibilities for complex, critical engine parts, so the capabilities of powder metallurgy techniques (PMT) were analyzed for manufacturing a turbine wheel with air-cooled blades. PMT dramatically reduce the time needed for such production and allow new complex design solutions aimed at improving the technical characteristics of the engine: better fuel efficiency and environmental performance, increased reliability, and reduced weight. To accelerate and simplify the blade manufacturing process, several options based on additive technologies were used, implemented in the form of various casting equipment for the manufacture of the blades, while powder metallurgy methods were applied to join the blades to the disc. The optimal production scheme and a set of technologies for manufacturing the blades, the turbine wheel, and other engine parts can be selected on the basis of the options considered.
Keywords: additive technologies, gas turbine engine, powder technology, turbine wheel
Procedia PDF Downloads 323
12127 Virtual 3D Environments for Image-Based Navigation Algorithms
Authors: V. B. Bastos, M. P. Lima, P. R. G. Kurka
Abstract:
This paper addresses the creation of virtual 3D environments for the study and development of image-based navigation algorithms and techniques for mobile robots, which need to operate robustly and efficiently. These algorithms can be tested physically, by conducting experiments on a prototype, or by numerical simulation. Current simulation platforms for robotic applications do not have flexible, up-to-date models for image rendering and are unable to reproduce complex light effects and materials. Thus, it is necessary to create a test platform that integrates sophisticated simulations of real navigation environments with data and image processing. This work proposes the development of a high-level platform for building 3D model environments and testing image-based navigation algorithms for mobile robots. Texture and lighting effects were applied so that the generated rendered images accurately represent their real-world counterparts. The application will integrate image processing scripts, trajectory control, dynamic modeling, and simulation techniques for physics representation and picture rendering with the open-source 3D creation suite Blender.
Keywords: simulation, visual navigation, mobile robot, data visualization
Procedia PDF Downloads 260
12126 Effect Analysis of an Improved Adaptive Speech Noise Reduction Algorithm in Online Communication Scenarios
Authors: Xingxing Peng
Abstract:
With the development of society, there are more and more online communication scenarios, such as teleconferencing and online education. In conference communication, voice quality is a very important part, and noise may greatly reduce the communication experience of the participants; voice noise reduction therefore has an important impact on scenarios such as voice calls. This research focuses on the key technologies of the sound transmission process, with the purpose of maintaining audio quality as far as possible so that the listener hears clearer, smoother sound. To address the problem that traditional speech enhancement algorithms are not ideal when dealing with non-stationary noise, an adaptive speech noise reduction algorithm is studied. Traditional noise estimation methods are mainly designed for stationary noise; here, the spectral characteristics of different noise types, especially non-stationary burst noise, are studied, and a noise estimator module is designed to deal with non-stationary noise. Noise features are extracted from non-speech segments, and the noise estimation module is adjusted in real time according to the noise characteristics. This adaptive algorithm can enhance speech according to different noise characteristics, improving the performance of traditional algorithms on non-stationary noise and achieving a better enhancement effect. The experimental results show that the proposed algorithm is effective and adapts better to different types of noise, yielding a better speech enhancement effect.
Keywords: speech noise reduction, speech enhancement, self-adaptation, Wiener filter algorithm
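The basic scheme that the adaptive estimator described above builds on is frequency-domain Wiener-style filtering, with the noise spectrum estimated from a noise-only (non-speech) segment. The sketch below demonstrates it on a synthetic tone-plus-noise frame; the signal parameters are illustrative, and a plain O(N²) DFT is used to stay dependency-free.

```python
# Sketch of Wiener-style spectral gain with the noise spectrum estimated
# from a non-speech segment. Synthetic single-frame demo, stdlib only.
import cmath, math, random

def dft(x):
    N = len(x)
    return [sum(x[n] * cmath.exp(-2j * math.pi * k * n / N) for n in range(N))
            for k in range(N)]

def idft(X):
    N = len(X)
    return [sum(X[k] * cmath.exp(2j * math.pi * k * n / N) for k in range(N)).real / N
            for n in range(N)]

random.seed(0)
N = 128
clean = [math.sin(2 * math.pi * 8 * n / N) for n in range(N)]   # "speech" tone
noise_ref = [random.gauss(0, 0.3) for _ in range(N)]            # noise-only segment
noisy = [c + random.gauss(0, 0.3) for c in clean]               # observed frame

# Noise power spectrum estimated from the non-speech segment
noise_psd = [abs(v) ** 2 for v in dft(noise_ref)]
Y = dft(noisy)
# Per-bin gain: suppress bins dominated by the estimated noise power
G = [max(0.0, 1.0 - noise_psd[k] / max(abs(Y[k]) ** 2, 1e-12)) for k in range(N)]
enhanced = idft([G[k] * Y[k] for k in range(N)])

mse = lambda a, b: sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)
```

An adaptive variant, as described in the abstract, would re-estimate `noise_psd` frame by frame whenever non-speech segments are detected, instead of using one fixed noise-only segment.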
Procedia PDF Downloads 63
12125 A More Sustainable Decellularized Plant Scaffold for Lab Grown Meat with Ocean Water
Authors: Isabella Jabbour
Abstract:
The world's population is expected to reach over 10 billion by 2050, creating a significant demand for food production, particularly in the agricultural industry. Cellular agriculture presents a solution to this challenge by producing meat that resembles traditionally produced meat but requires significantly less land. Decellularized plant scaffolds, such as spinach leaves, have been shown to be suitable edible scaffolds for growing animal muscle, enabling cultured cells to grow and organize into three-dimensional structures that mimic the texture and flavor of conventionally produced meat. However, the use of freshwater to remove the intact extracellular material from these plants remains a concern, particularly when scaling up the production process. In this study, two protocols, 1X SDS and Boom Sauce, were used to decellularize spinach leaves with both distilled water and ocean water. The decellularization was confirmed by histology, which showed an absence of cell nuclei, and by DNA and protein quantification. Results showed that spinach decellularized with ocean water contained 9.9 ± 1.4 ng DNA/mg tissue, comparable to the 9.2 ± 1.1 ng DNA/mg tissue obtained with DI water. These findings suggest that spinach leaves decellularized using ocean water hold promise as an eco-friendly and cost-effective scaffold for laboratory-grown meat, which could ultimately transform the meat industry by providing a sustainable alternative to traditional animal farming while reducing freshwater use.
Keywords: cellular agriculture, plant scaffold, decellularization, ocean water usage
Procedia PDF Downloads 99
12124 Modeling of Nitrogen Solubility in Stainless Steel
Authors: Saeed Ghali, Hoda El-Faramawy, Mamdouh Eissa, Michael Mishreky
Abstract:
A scale-resistant austenitic stainless steel, X45CrNiW 18-9, has been developed, and modified steels were produced through partial and total replacement of nickel by nitrogen. These modified steels were produced in a 10 kg induction furnace under different nitrogen pressures and cast into ingots, then forged and air cooled. The phases of the modified stainless steels were investigated using the Schaeffler diagram, dilatometry, and microstructure observations. Both partial and total replacement of nickel using 0.33-0.50% nitrogen are effective in producing fully austenitic stainless steels. The nitrogen contents were determined and compared with those calculated using the Institute of Metal Science (IMS) equation; the results showed large deviations between the actual nitrogen contents and the values predicted by the IMS equation. An equation was therefore derived, based on chemical composition, pressure, and temperature at 1600 °C: [N%] = 0.0078 + 0.0406*X, where X is a function of the chemical composition and nitrogen pressure. The derived equation was used to calculate the nitrogen content of different steels using published data. The results reveal the difficulty of deriving a general equation for the prediction of nitrogen content covering different steel compositions, so it is necessary to use a narrow composition range.
Keywords: solubility, nitrogen, stainless steel, Schaeffler
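The fitted relation above is a simple linear model and can be evaluated directly. Since the abstract does not give the functional form of X (it bundles chemical composition and nitrogen pressure), the input values in this sketch are hypothetical placeholders.

```python
# The derived relation [N%] = 0.0078 + 0.0406*X as a function.
# X combines chemical composition and nitrogen pressure; its exact form is
# not given in the abstract, so the sample inputs below are hypothetical.

def nitrogen_pct(X):
    """Predicted nitrogen content (mass %) at 1600 C from the fitted relation."""
    return 0.0078 + 0.0406 * X

samples = [0.0, 5.0, 10.0]
preds = [nitrogen_pct(x) for x in samples]
# e.g. X = 10 gives 0.0078 + 0.406 = 0.4138 % N, inside the 0.33-0.50 % range cited above
```

The intercept 0.0078 is the predicted nitrogen content at X = 0, and each unit of X adds 0.0406 mass % nitrogen.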
Procedia PDF Downloads 242
12123 Assessing Image Quality in Mobile Radiography: A Phantom-Based Evaluation of a New Lightweight Mobile X-Ray Equipment
Authors: May Bazzi, Shafik Tokmaj, Younes Saberi, Mats Geijer, Tony Jurkiewicz, Patrik Sund, Anna Bjällmark
Abstract:
Mobile radiography, employing portable X-ray equipment, has become a routine procedure within hospital settings, with chest X-rays in intensive care units standing out as the most prevalent mobile X-ray examinations. This approach is not limited to hospitals alone, as it extends its benefits to imaging patients in various settings, particularly those too frail to be transported, such as elderly care residents in nursing homes. Moreover, the utility of mobile X-ray isn't confined solely to traditional healthcare recipients; it has proven to be a valuable resource for vulnerable populations, including the homeless, drug users, asylum seekers, and patients with multiple co-morbidities. Mobile X-rays reduce patient stress, minimize costly hospitalizations, and offer cost-effective imaging. While studies confirm its reliability, further research is needed, especially regarding image quality. Recent advancements in lightweight equipment with enhanced battery and detector technology provide the potential for nearly handheld radiography. The main aim of this study was to evaluate a new lightweight mobile X-ray system with two different detectors and compare the image quality with a modern stationary system. Methods: A total of 74 images of the chest (chest anterior-posterior (AP) views and chest lateral views) and pelvic/hip region (AP pelvis views, hip AP views, and hip cross-table lateral views) were acquired on a whole-body phantom (Kyotokagaku, Japan), utilizing varying image parameters. These images were obtained using a stationary system - 18 images (Mediel, Sweden), a mobile X-ray system with a second-generation detector - 28 images (FDR D-EVO II; Fujifilm, Japan) and a mobile X-ray system with a third-generation detector - 28 images (FDR D-EVO III; Fujifilm, Japan). Image quality was assessed by visual grading analysis (VGA), which is a method to measure image quality by assessing the visibility and accurate reproduction of anatomical structures within the images. 
A total of 33 image criteria were used in the analysis. A panel of two experienced radiologists, two experienced radiographers, and two final-term radiographer students evaluated the image quality on a 5-grade ordinal scale using the software Viewdex 3.0 (Viewer for Digital Evaluation of X-ray images, Sweden). Data were analyzed using visual grading characteristics analysis. The dose was measured by the dose-area product (DAP) reported by the respective systems. Results: The mobile X-ray equipment (both detectors) showed significantly better image quality than the stationary equipment for the pelvis, hip AP, and hip cross-table lateral images, with AUC_VGA values ranging from 0.64 to 0.92, while chest images showed mixed results. The number of images rated as having sufficient quality for diagnostic use was significantly higher for mobile X-ray generations 2 and 3 than for the stationary X-ray system. The DAP values were higher for the stationary system than for the mobile system. Conclusions: The new lightweight radiographic equipment had image quality at least as good as that of a fixed system at a lower radiation dose. Future studies should focus on clinical images and consider radiographers' viewpoints for a comprehensive assessment.
Keywords: mobile x-ray, visual grading analysis, radiographer, radiation dose
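Visual grading characteristics analysis reduces paired ordinal ratings to an area under a curve. A minimal sketch of the underlying nonparametric (Mann-Whitney) AUC estimator, using made-up 5-grade ratings rather than the study's data:

```python
import numpy as np

def vgc_auc(ratings_a, ratings_b):
    """Nonparametric AUC: the probability that a randomly chosen rating
    for system A exceeds one for system B, counting ties as 1/2
    (the Mann-Whitney estimator underlying VGC analysis)."""
    a = np.asarray(ratings_a)[:, None]
    b = np.asarray(ratings_b)[None, :]
    wins = (a > b).sum() + 0.5 * (a == b).sum()
    return wins / (a.size * b.size)

# Hypothetical 5-grade ratings (illustrative, not the study's data):
mobile = [4, 5, 4, 3, 5]
stationary = [3, 3, 4, 2, 4]
print(vgc_auc(mobile, stationary))  # → 0.8
```

A value of 0.5 means no preference between systems; values above 0.5 favour system A, matching the 0.64-0.92 range reported above.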
Procedia PDF Downloads 69
12122 Kinetic Evaluation of Sterically Hindered Amines under Partial Oxy-Combustion Conditions
Authors: Sara Camino, Fernando Vega, Mercedes Cano, Benito Navarrete, José A. Camino
Abstract:
Carbon capture and storage (CCS) technologies should play a relevant role in the move towards low-carbon energy systems in the European Union by 2030. Partial oxy-combustion emerges as a promising CCS approach to mitigate anthropogenic CO₂ emissions. Its advantage with respect to other CCS technologies lies in the production of a more CO₂-concentrated flue gas than that provided by conventional air-firing processes. The presence of more CO₂ in the flue gas increases the driving force in the separation process and hence might lead to further reductions in the energy requirements of the overall CO₂ capture process. A more CO₂-concentrated flue gas should enhance CO₂ capture by chemical absorption in terms of both solvent kinetics and CO₂ cyclic capacity. Both affect the performance of the overall CO₂ absorption process by reducing the solvent flow rate required for a specific CO₂ removal efficiency. Lower solvent flow rates decrease the reboiler duty during the regeneration stage and also reduce equipment size and pumping costs. Moreover, R&D activities in this field are focused on novel solvents and blends that provide lower CO₂ absorption enthalpies and therefore lower energy penalties associated with solvent regeneration. In this respect, sterically hindered amines are considered potential solvents for CO₂ capture. They provide a low energy requirement during the regeneration process due to their molecular structure. However, their absorption kinetics are slow and must be promoted by blending with faster solvents such as monoethanolamine (MEA) and piperazine (PZ). In this work, the kinetic behavior of two sterically hindered amines was studied under partial oxy-combustion conditions and compared with MEA. A lab-scale semi-batch reactor was used. The CO₂ composition of the synthetic flue gas varied from 15%v/v (conventional coal combustion) to 60%v/v (the maximum CO₂ concentration allowable for optimal partial oxy-combustion operation).
Firstly, 2-amino-2-methyl-1-propanol (AMP) showed a hybrid behavior with fast kinetics and a low enthalpy of CO₂ absorption. The second solvent was isophoronediamine (IF), which has a steric hindrance on one of its amino groups; its free amino group increases its cyclic capacity. In general, a higher CO₂ concentration in the flue gas accelerated the CO₂ absorption phenomena, producing higher CO₂ absorption rates. In addition, the evolution of the CO₂ loading also exhibited higher values in the experiments using the more CO₂-concentrated flue gas. The steric hindrance gives these solvents a hybrid behavior, between fast and slow kinetic solvents. The kinetic rates observed in all the experiments carried out using AMP were higher than those of MEA but lower than those of IF. The kinetic enhancement experienced by AMP at high CO₂ concentration is slightly over 60%, compared with 70%-80% for IF. AMP also improved its CO₂ absorption capacity by 24.7% from 15%v/v to 60%v/v, almost double the improvement achieved by MEA. In the IF experiments, the CO₂ loading increased by around 10% from 15%v/v to 60%v/v CO₂, changing from 1.10 to 1.34 mol CO₂ per mol solvent, an increase of more than 20%. This hybrid kinetic behavior makes AMP and IF promising solvents for partial oxy-combustion applications.
Keywords: absorption, carbon capture, partial oxy-combustion, solvent
Procedia PDF Downloads 194
12121 Bayesian Value at Risk Forecast Using a Realized Conditional Autoregressive Expectile Model with an Application to Cryptocurrency
Authors: Niya Chen, Jennifer Chan
Abstract:
In the financial market, risk management helps to minimize potential loss and maximize profit. There are two ways to assess risk: the first is to calculate it directly based on the volatility. The most common risk measurements are Value at Risk (VaR), the Sharpe ratio, and beta. Alternatively, we could look at the quantile of the return to assess the risk. Popular return models such as GARCH and stochastic volatility (SV) focus on modeling the return distribution by capturing its volatility dynamics; the quantile/expectile approach, by contrast, characterizes the part of the distribution where extreme return values occur. This allows VaR to be forecast directly from returns. The advantage of these non-parametric methods is that they are not bound by the distributional assumptions of parametric methods. The difference between them is that the expectile uses a second-order loss function, while quantile regression uses a first-order loss function. We consider several quantile functions, different volatility measures, and estimates from some volatility models. To estimate the expectile of the model, we use the Realized Conditional Autoregressive Expectile (CARE) model with a Bayesian estimation method. We would like to see whether our proposed models outperform existing models for cryptocurrency, testing them mainly on Bitcoin, as well as on Ethereum.
Keywords: expectile, CARE model, CARR model, quantile, cryptocurrency, Value at Risk
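The second-order loss that distinguishes expectiles from quantiles can be made concrete with a few lines of code. Below is a minimal sketch of a single (unconditional) expectile computed by iteratively reweighted least squares on hypothetical returns; the CARE model itself adds autoregressive dynamics and Bayesian estimation on top of this building block.

```python
import numpy as np

def expectile(y, tau=0.5, tol=1e-10, max_iter=200):
    """tau-expectile via iteratively reweighted least squares.

    The expectile minimizes the asymmetrically weighted *squared*
    (second-order) loss sum(|tau - 1(y < mu)| * (y - mu)**2), in
    contrast to the first-order absolute loss of quantile regression.
    """
    y = np.asarray(y, dtype=float)
    mu = y.mean()  # the 0.5-expectile coincides with the mean
    for _ in range(max_iter):
        w = np.where(y < mu, 1.0 - tau, tau)   # asymmetric weights
        mu_new = float(np.sum(w * y) / np.sum(w))
        if abs(mu_new - mu) < tol:
            break
        mu = mu_new
    return mu

# Hypothetical daily returns (illustrative, not Bitcoin data):
returns = np.array([-0.05, -0.02, 0.0, 0.01, 0.03, 0.04])
low, high = expectile(returns, 0.05), expectile(returns, 0.95)
```

A VaR-style tail measure can then be read off the low-tau expectile; for tau = 0.5 the routine simply returns the sample mean.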
Procedia PDF Downloads 115
12120 Sparse Modelling of Cancer Patients’ Survival Based on Genomic Copy Number Alterations
Authors: Khaled M. Alqahtani
Abstract:
Copy number alterations (CNA) are variations in the structure of the genome, where certain regions deviate from the typical two chromosomal copies. These alterations are pivotal in understanding tumor progression and are indicative of patients' survival outcomes. However, effectively modeling patients' survival based on their genomic CNA profiles while identifying relevant genomic regions remains a statistical challenge. Various methods, such as the Cox proportional hazard (PH) model with ridge, lasso, or elastic net penalties, have been proposed but often overlook the inherent dependencies between genomic regions, leading to results that are hard to interpret. In this study, we enhance the elastic net penalty by incorporating an additional penalty that accounts for these dependencies. This approach yields smooth parameter estimates and facilitates variable selection, resulting in a sparse solution. Our findings demonstrate that this method outperforms other models in predicting survival outcomes, as evidenced by our simulation study. Moreover, it allows for a more meaningful interpretation of genomic regions associated with patients' survival. We demonstrate the efficacy of our approach using both real data from a lung cancer cohort and simulated datasets.
Keywords: copy number alterations, Cox proportional hazard, lung cancer, regression, sparse solution
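The penalty described above, elastic net plus a dependence term, can be written as a single function. The sketch below uses a squared-difference (fused-type) term over adjacent genomic regions as the dependence penalty; this specific form is an assumption, since the abstract does not state the exact weighting used.

```python
import numpy as np

def smooth_elastic_net_penalty(beta, lam1, lam2, lam3):
    """Elastic net (lasso + ridge) plus a smoothness term that penalizes
    differences between coefficients of adjacent genomic regions.
    The squared-difference form of the third term is an assumption."""
    beta = np.asarray(beta, dtype=float)
    lasso = lam1 * np.abs(beta).sum()              # sparsity / variable selection
    ridge = lam2 * np.square(beta).sum()           # shrinkage / stability
    fused = lam3 * np.square(np.diff(beta)).sum()  # adjacent-region smoothness
    return lasso + ridge + fused

print(smooth_elastic_net_penalty([0.0, 1.0, 1.0, 0.0], 1.0, 1.0, 1.0))  # → 6.0
```

In a penalized Cox fit this term would be added to the negative partial log-likelihood; the fused component encourages neighbouring regions to receive similar coefficients, which is what yields the smooth, interpretable estimates.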
Procedia PDF Downloads 51
12119 The Effect of Market Orientation on Marketing Performance through Product Adaptation Strategy
Authors: Hotlan Siagian, Hatane Semuel, Wilma Laura Sahetapy
Abstract:
This study aims at examining the effect of market orientation on marketing performance through product adaptation strategy. The population of the research is domestic leather craft companies located in five regions that form the center of the leather craft industry in Indonesia, i.e., Central Java, East Java, South Sulawesi, Bali, and West Kalimantan. The respondents are manager-level employees, one from each company. Data collection used a questionnaire designed with a five-point Likert scale. The collected data were analyzed using the structural equation modeling (SEM) technique with SmartPLS software version 3.0 to examine the hypotheses. The result of the study shows that all hypotheses are supported. Market orientation affects marketing performance. Market orientation affects product adaptation strategy. Product adaptation strategy influences marketing performance. The research has also revealed the main finding that product adaptation strategy plays a mediating role in the relationship between market orientation and marketing performance. The leather craft companies in Indonesia, therefore, may refer to this result in improving their marketing performance.
Keywords: leather craft industry, market orientation, marketing performance, product adaptation strategy
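The mediation claim at the heart of the abstract (market orientation → product adaptation → marketing performance) can be illustrated with a stripped-down indirect-effect calculation. This is a sketch on synthetic standardized scores, not the study's PLS-SEM procedure, and the path values are assumptions chosen for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200

# Hypothetical standardized scores (synthetic, not the survey data):
# X = market orientation, M = product adaptation, Y = marketing performance.
X = rng.normal(size=n)
M = 0.6 * X + rng.normal(scale=0.8, size=n)
Y = 0.4 * M + 0.2 * X + rng.normal(scale=0.8, size=n)

def slope(x, y):
    """OLS slope of y on x (both centered)."""
    xc = x - x.mean()
    yc = y - y.mean()
    return float(xc @ yc / (xc @ xc))

a = slope(X, M)                                      # path X -> M
# Path M -> Y controlling for X, via residualization (Frisch-Waugh):
b = slope(M - slope(X, M) * X, Y - slope(X, Y) * X)
indirect = a * b                                     # mediated effect of X on Y
```

A PLS-SEM tool such as SmartPLS estimates the same product-of-paths quantity and adds bootstrap significance testing on top.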
Procedia PDF Downloads 367
12118 Regenerating Habitats. A Housing Based on Modular Wooden Systems
Authors: Rui Pedro de Sousa Guimarães Ferreira, Carlos Alberto Maia Domínguez
Abstract:
Despite the ambitions to achieve climate neutrality by 2050, to fulfill the Paris Agreement's goals, the building and construction sector remains one of the most resource-intensive and greenhouse gas-emitting industries in the world, accounting for 40% of worldwide CO₂ emissions. Over the past few decades, globalization and population growth have led to an exponential rise in demand in the housing market and, by extension, in the building industry. Considering this housing crisis, it is obvious that we will not stop building in the near future. However, the transition, which has already started, is challenging and complex because it calls for the worldwide participation of numerous organizations in altering how building systems, which have been a part of our everyday existence for over a century, are used. Wood is one of the alternatives that is most frequently used nowadays (under responsible forestry conditions) because of its physical qualities and, most importantly, because it produces fewer carbon emissions during manufacturing than steel or concrete. Furthermore, as wood retains its capacity to store CO₂ after application and throughout the life of the building, working as a natural carbon filter, it helps to reduce greenhouse gas emissions. After a century-long focus on other materials, in the last few decades, technological advancements have made it possible to innovate systems centered around the use of wood. However, there are still some questions that require further exploration. It is necessary to standardize production and manufacturing processes based on prefabrication and modularization principles to achieve greater precision and optimization of the solutions, decreasing building time, prices, and waste from raw materials. In addition, this approach will make it possible to develop new architectural solutions to solve the rigidity and irreversibility of buildings, two of the most important issues facing housing today.
Most current models are still created as inflexible, fixed, monofunctional structures that discourage any kind of regeneration; they are based on matrices that sustain the traditional model of the conventional family and are founded on rigid, impenetrable compartmentalization. Adaptability and flexibility in housing are, and always have been, necessities and key components of architecture. People today need to constantly adapt to their surroundings and themselves because of the fast-paced, disposable, and quickly obsolescent nature of modern items. Migrations on a global scale, different kinds of co-housing, and even personal changes are some of the new questions that buildings have to answer. Designing with the reversibility of construction systems and materials in mind not only allows for the concept of "looping" in construction, with environmental advantages that enable the development of a circular economy in the sector, but also unleashes multiple social benefits. In this sense, it is imperative to develop prefabricated and modular construction systems able to address the formalization of a reversible proposition that adjusts to the scale of time and its multiple reformulations, many of which are unpredictable. We must allow buildings to change, grow, or shrink over their lifetime, respecting their nature and, finally, the nature of the people living in them. The ability to anticipate the unexpected, adapt to social factors, and take account of demographic shifts in society so as to stabilize communities is the foundation of truly innovative sustainability.
Keywords: modular, timber, flexibility, housing
Procedia PDF Downloads 87
12117 Machine Learning Approach for Automating Electronic Component Error Classification and Detection
Authors: Monica Racha, Siva Chandrasekaran, Alex Stojcevski
Abstract:
Engineering programs focus on promoting students' personal and professional development by ensuring that students acquire technical and professional competencies during their four-year studies. The traditional engineering laboratory provides an opportunity for students to "practice by doing," and laboratory facilities aid them in obtaining insight into and understanding of their discipline. Due to rapid technological advancements and the COVID-19 outbreak, traditional labs are transforming into virtual learning environments. Aim: To address the limitations of the physical laboratory, this research study uses a Machine Learning (ML) algorithm that interfaces with the Augmented Reality HoloLens and predicts image behavior to classify and detect electronic components. The automated system detects and classifies the position of all components on a breadboard using the ML algorithm. This research will assist first-year undergraduate engineering students in conducting laboratory practices without any supervision. With the help of the HoloLens and the ML algorithm, students will reduce component placement errors on a breadboard and increase the efficiency of simple laboratory practices performed virtually. Method: Images of breadboards, resistors, capacitors, transistors, and other electrical components will be collected using the HoloLens 2 and stored in a database. The collected image dataset will then be used to train a machine learning model. The raw images will be cleaned, processed, and labeled to facilitate further analysis for component error classification and detection. For instance, when students conduct laboratory experiments, the HoloLens captures images of students placing different components on a breadboard. The images are forwarded to the server for detection in the background.
A hybrid Convolutional Neural Network (CNN) and Support Vector Machine (SVM) algorithm will be trained on the dataset for object recognition and classification. The convolution layers extract image features, which are then classified using an SVM. With adequately labeled training data, the model will predict and categorize components and assess whether students have placed them correctly. The data acquired through the HoloLens thus includes images of students assembling electronic components, and the system constantly checks whether students have positioned components appropriately on the breadboard and connected them into a functioning circuit. When students misplace components, the HoloLens predicts the error before the components are fixed in the incorrect position and prompts students to correct their mistakes. This hybrid CNN-SVM approach to automating electronic component error classification and detection eliminates component connection problems and minimizes the risk of component damage. Conclusion: These augmented reality smart glasses powered by machine learning provide a wide range of benefits to supervisors, professionals, and students. They help customize the learning experience, which is particularly beneficial in large classes with limited time, and they make it possible to assess how accurately machine learning algorithms can forecast whether students are making the correct decisions and completing their laboratory tasks.
Keywords: augmented reality, machine learning, object recognition, virtual laboratories
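The convolution-features-into-SVM pipeline can be sketched in miniature. The following is a pure-NumPy stand-in, assuming a single convolution/ReLU/pooling stage as the "CNN" and a linear SVM trained with the Pegasos sub-gradient method in place of a full SVC; the images, classes, and filter choices are all synthetic and illustrative, not the paper's architecture.

```python
import numpy as np

rng = np.random.default_rng(42)

def conv_feature(img, kernel):
    """Stand-in for a CNN stage: one valid 2-D convolution,
    ReLU, then global average pooling to a scalar feature."""
    kh, kw = kernel.shape
    h, w = img.shape
    out = np.array([[np.sum(img[i:i + kh, j:j + kw] * kernel)
                     for j in range(w - kw + 1)]
                    for i in range(h - kh + 1)])
    return np.maximum(out, 0).mean()

def make_image(cls):
    """Synthetic stand-in for a component photo: class 0 carries a
    horizontal stripe, class 1 a vertical one (illustrative only)."""
    img = rng.normal(0.0, 0.1, size=(8, 8))
    if cls == 0:
        img[3:5, :] += 1.0
    else:
        img[:, 3:5] += 1.0
    return img

kernels = [np.ones((1, 3)), np.ones((3, 1))]   # orientation filters
X, y = [], []
for cls in (0, 1):
    for _ in range(40):
        X.append([conv_feature(make_image(cls), k) for k in kernels])
        y.append(-1 if cls == 0 else 1)        # SVM labels in {-1, +1}
X, y = np.array(X), np.array(y)

# Linear SVM trained with the Pegasos sub-gradient method on the
# pooled convolutional features (standing in for a full SVC).
lam, w = 0.01, np.zeros(2)
for t in range(1, 2001):
    i = int(rng.integers(len(y)))
    eta = 1.0 / (lam * t)
    if y[i] * (X[i] @ w) < 1:                  # margin violation
        w = (1 - 1.0 / t) * w + eta * y[i] * X[i]
    else:
        w = (1 - 1.0 / t) * w

accuracy = float((np.sign(X @ w) == y).mean())
```

Accuracy here is measured on the training set, which is acceptable only for a toy demonstration; a real deployment would hold out a test split and use a proper CNN backbone.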
Procedia PDF Downloads 142
12116 The Role Collagen VI Plays in Heart Failure: A Tale Untold
Authors: Summer Hassan, David Crossman
Abstract:
Myocardial fibrosis (MF) has been loosely defined as the process occurring in the pathological remodeling of the myocardium due to excessive production and deposition of extracellular matrix (ECM) proteins, including collagen. This reduces tissue compliance and accelerates progression to heart failure, and it also affects the electrical properties of the myocytes, resulting in arrhythmias. Microscopic interrogation of MF is key to understanding the molecular orchestrators of disease. It is well established that recruitment and stimulation of myofibroblasts result in collagen deposition and the consequent expansion of the ECM. Many types of collagen have been identified and implicated in tissue scarring. In a series of experiments conducted in our lab, we aim to elucidate the role collagen VI plays in the development of myocardial fibrosis and its direct impact on myocardial function. This was investigated through an animal experiment comparing collagen VI knockout rats with wild-type rats, each group including both diseased and healthy animals. Echocardiographic assessments of these rats ensued at four time points, followed by microscopic interrogation of the myocardium aiming to relate collagen VI to myocardial function. Our results demonstrate a deterioration in cardiac function, as represented by the ejection fraction, in both the healthy and diseased knockout rats. This points to a potential protective role that collagen VI plays following a myocardial insult. Current work is dedicated to the microscopic characterisation of the fibrotic process in all rat groups, with the results to follow.
Keywords: heart failure, myocardial fibrosis, collagen, echocardiogram, confocal microscopy
Procedia PDF Downloads 85
12115 Teaching, Learning and Evaluation Enhancement of Information Communication Technology Education in Schools through Pedagogical and E-Learning Techniques in the Sri Lankan Context
Authors: M. G. N. A. S. Fernando
Abstract:
This study uses a research framework to improve the quality of ICT education and the Teaching, Learning and Assessment/Evaluation (TLA/TLE) process. It utilizes existing resources while improving the methodologies, along with the pedagogical techniques and e-learning approaches used in the secondary schools of Sri Lanka. The study was carried out in two phases. Phase I focused on investigating the factors which affect the quality of ICT education. Based on the key factors identified in Phase I, Phase II focused on the design of an experimental application model with six activity levels. Each level in the activity model covers one or more levels of the Revised Bloom's Taxonomy. To further enhance the activity levels, other pedagogical techniques (activity-based learning, e-learning techniques, problem-solving activities, peer discussions, etc.) were incorporated into each level as appropriate. The application model was validated by a panel of teachers, including a domain expert, and was also tested in the school environment. The validity of performance was established using six hypothesis tests and other methods. The analysis shows that student performance in problem-solving activities increased by 19.5% due to the different treatment levels used. Compared with the existing process, it was also shown that the embedded techniques (a mixture of traditional and modern pedagogical methods and their applications) are more effective for the skills development of teachers and students.
Keywords: activity models, Bloom's taxonomy, ICT education, pedagogies
Procedia PDF Downloads 169
12114 Challenge of Baseline Hydrology Estimation at Large-Scale Watersheds
Authors: Can Liu, Graham Markowitz, John Balay, Ben Pratt
Abstract:
Baseline or natural hydrology is commonly employed for hydrologic modeling and for quantifying hydrologic alteration due to manmade activities. It can inform planning and policy-related efforts by various state and federal water resource agencies to restore natural streamflow regimes. A common challenge faced by hydrologists is how to replicate unaltered streamflow conditions, particularly in large watershed settings prone to development and regulation. Three different methods were employed to estimate baseline streamflow conditions for six major subbasins of the Susquehanna River Basin: 1) incorporating consumptive water use and reservoir operations back into regulated gaged records; 2) using a map correlation method and flow duration (exceedance probability) regression equations; and 3) extending the pre-regulation streamflow records based on the relationship between concurrent streamflows at unregulated and regulated gage locations. Parallel analyses were performed across the three methods, and the limitations associated with each are presented. Results from these analyses indicate that generating baseline streamflow records for large-scale watersheds remains challenging, even when long-term continuous stream gage records are available.
Keywords: baseline hydrology, streamflow gage, subbasin, regression
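The flow duration (exceedance probability) machinery behind method 2 is straightforward to sketch. Below is a minimal implementation assuming the common Weibull plotting position; the regression equations themselves, which map basin characteristics to these exceedance flows, are site-specific and not reproduced here.

```python
import numpy as np

def flow_duration(flows):
    """Flow duration curve: flows sorted in descending order, paired with
    exceedance probabilities from the Weibull plotting position m/(n+1)."""
    q = np.sort(np.asarray(flows, dtype=float))[::-1]
    p = np.arange(1, q.size + 1) / (q.size + 1.0)
    return p, q

def flow_at_exceedance(flows, prob):
    """Flow equalled or exceeded `prob` of the time, by interpolation."""
    p, q = flow_duration(flows)
    return float(np.interp(prob, p, q))

probs, sorted_q = flow_duration(range(1, 100))
```

For example, `flow_at_exceedance(record, 0.95)` gives a low-flow statistic of the kind regressed against basin attributes when transferring baseline curves to ungaged or regulated sites.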
Procedia PDF Downloads 327
12113 Mining Coupled to Agriculture: Systems Thinking in Scalable Food Production
Authors: Jason West
Abstract:
Low profitability in agricultural production, along with increasing scrutiny of environmental effects, is limiting food production at scale. In contrast, the mining sector offers access to resources, including energy, water, transport, and chemicals, for food production at low marginal cost. Scalable agricultural production can benefit from the nexus of resources (water, energy, transport) offered by mining activity in remote locations. A decision-support bioeconomic model for controlled-environment vertical farms was used, comprising four submodels: crop structure, nutrient requirements, resource-crop integration, and economics. These aggregate into a macro-level mathematical model. A demonstrable dynamic systems framework is needed to show that productive outcomes are feasible. We demonstrate a generalized bioeconomic macro model for controlled-environment production systems at mine sites using system dynamics modeling methodology. Despite the complexity of bioeconomic modelling of the dynamic processes and interactions between resources and agriculture, the economic potential is greater than general economic models would assume. Scalability of production as an input becomes a key success feature.
Keywords: crop production systems, mathematical model, mining, agriculture, dynamic systems
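The stock-and-flow core of a system dynamics submodel can be sketched in a few lines. The example below is a deliberately minimal, Euler-integrated water balance, a single stock replenished by mine dewatering and drawn down by crop demand; all parameter values are assumptions for illustration, not figures from the model described above.

```python
def simulate(days=365, dt=1.0,
             inflow=120.0,         # ML/day of mine dewatering water (assumed)
             demand_per_ha=0.05,   # ML/day per hectare of crop (assumed)
             area_ha=1000.0,       # cropped area, hectares (assumed)
             storage0=5000.0):     # initial water storage, ML (assumed)
    """Euler-integrated stock-and-flow model: one water stock,
    one inflow (dewatering), one outflow (irrigation demand)."""
    storage = storage0
    history = []
    for _ in range(int(days / dt)):
        outflow = min(demand_per_ha * area_ha, storage / dt)  # cannot overdraw
        storage += (inflow - outflow) * dt
        history.append(storage)
    return history

trajectory = simulate()
```

Coupling several such stocks (water, energy, nutrients, cash) with shared flows is what escalates the submodels into the macro model; dedicated system dynamics tools perform the same integration with graphical stock-flow diagrams.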
Procedia PDF Downloads 80
12112 Virtual Customer Integration in Innovation Development: A Systematic Literature Review
Authors: Chau Nguyen Pham Minh
Abstract:
The aim of this study is to answer the following research question: what do we know about virtual customer integration in innovation development, based on existing empirical research? The paper is based on a systematic review of 136 articles published in the past 16 years. The analysis focuses on three areas: what forms of virtual customer integration (e.g., netnography, online co-creation, virtual experience) have been applied in innovation development; how virtual customer integration methods have been utilized effectively by firms; and what the influences of virtual customer integration are on innovation development activities. Through the detailed analysis, the study provides researchers with a broad understanding of virtual customer integration in innovation development. The study shows that practitioners and researchers pay increasing attention to virtual customer integration methods in developing innovations, since those methods have clear advantages for interacting with customers in order to generate the best ideas for innovation development. Additionally, the findings indicate that netnography has been the most common method for integrating customers in idea generation, while virtual product experience has mainly been used in product testing. Moreover, the analysis also reveals the positive and negative influences of virtual customer integration on innovation development from both process and strategic perspectives. Most of the reviewed studies examined the phenomenon from the company's perspective to understand the process of applying virtual customer integration methods and their impacts; however, the customers' perspective on participating in the virtual interaction has been inadequately studied, which creates many potentially interesting research paths for future studies.
Keywords: innovation, virtual customer integration, co-creation, netnography, new product development
Procedia PDF Downloads 339