Search results for: design parameter
8578 A Normalized Non-Stationary Wavelet Based Analysis Approach for a Computer Assisted Classification of Laryngoscopic High-Speed Video Recordings
Authors: Mona K. Fehling, Jakob Unger, Dietmar J. Hecker, Bernhard Schick, Joerg Lohscheller
Abstract:
Voice disorders originate from disturbances of the vibration patterns of the two vocal folds located within the human larynx. Consequently, the visual examination of vocal fold vibrations is an integral part of the clinical diagnostic process. For an objective analysis of the vocal fold vibration patterns, the two-dimensional vocal fold dynamics are captured during sustained phonation using an endoscopic high-speed camera. In this work, we present an approach allowing a fully automatic analysis of the high-speed video data, including a computerized classification of healthy and pathological voices. The approach is based on a wavelet analysis of so-called phonovibrograms (PVG), which are extracted from the high-speed videos and comprise the entire two-dimensional vibration pattern of each vocal fold individually. Using a principal component analysis (PCA) strategy, a low-dimensional feature set is computed from each phonovibrogram. From the PCA space, clinically relevant measures can be derived that objectively quantify vibration abnormalities. In the first part of the work it is shown that, using a machine learning approach, the derived measures are suitable for automatically distinguishing between healthy and pathological voices. Within the approach, the formation of the PCA space, and consequently the extracted quantitative measures, depend on the clinical data used to compute the principal components. Therefore, in the second part of the work we propose a strategy to achieve a normalization of the PCA space by registering it to a coordinate system using a set of synthetically generated vibration patterns. The results show that, owing to the normalization step, potential ambiguity of the parameter space can be eliminated. The normalization further allows a direct comparison of research results that are based on PCA spaces obtained from different clinical subjects.
Keywords: wavelet-based analysis, multiscale product, normalization, computer assisted classification, high-speed laryngoscopy, vocal fold analysis, phonovibrogram
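As a minimal sketch of the PCA-plus-classifier pipeline described here, the snippet below reduces flattened phonovibrogram feature vectors with PCA and cross-validates a classifier. The array shapes, component count, and the choice of an SVM are illustrative assumptions, not the authors' exact configuration.

```python
# Minimal sketch: PCA feature reduction followed by a classifier, as in the
# healthy-vs-pathological pipeline above. Shapes and parameters are
# illustrative placeholders, not the study's data or settings.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(120, 2048))   # 120 flattened phonovibrogram feature vectors
y = rng.integers(0, 2, size=120)   # 0 = healthy, 1 = pathological (placeholder labels)

clf = make_pipeline(PCA(n_components=10), SVC(kernel="rbf"))
scores = cross_val_score(clf, X, y, cv=5)
print("mean CV accuracy:", scores.mean())
```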
Procedia PDF Downloads 265
8577 A Study on ZnO Nanoparticles Properties: An Integration of Rietveld Method and First-Principles Calculation
Authors: Kausar Harun, Ahmad Azmin Mohamad
Abstract:
Zinc oxide (ZnO) has been extensively used in optoelectronic devices, with recent interest as a photoanode material in dye-sensitized solar cells. Numerous methods have been employed to synthesize ZnO experimentally, while others model it theoretically. Both approaches provide information on ZnO properties, but theoretical calculation has proved to be more accurate and time-effective. Thus, integration between these two methods is essential to closely resemble the properties of synthesized ZnO. In this study, experimentally grown ZnO nanoparticles were prepared by the sol-gel storage method with zinc acetate dihydrate and methanol as precursor and solvent. A 1 M sodium hydroxide (NaOH) solution was used as stabilizer. The optimum time to produce ZnO nanoparticles was recorded as 12 hours. Phase and structural analysis showed that single-phase ZnO was produced with a wurtzite hexagonal structure. Further quantitative analysis was done via the Rietveld refinement method to obtain structural and crystallite parameters such as lattice dimensions, space group, and atomic coordinates. The lattice dimensions were a = b = 3.2498 Å and c = 5.2068 Å, which were later used as the main input in first-principles calculations. By applying density functional theory (DFT) as implemented in the CASTEP code, the structure of the synthesized ZnO was built and optimized using several exchange-correlation functionals. The generalized gradient approximation functional with Perdew-Burke-Ernzerhof and Hubbard U corrections (GGA-PBE+U) yielded the structure with the lowest energy and lattice deviations. In this study, emphasis was also given to the modification of the valence electron energy levels to overcome the underestimation in DFT calculations. The Zn and O valence energies were fixed at Ud = 8.3 eV and Up = 7.3 eV, respectively. The electronic and optical properties of the synthesized ZnO were then calculated based on the GGA-PBE+U functional within the ultrasoft pseudopotential method. In conclusion, the incorporation of Rietveld analysis into first-principles calculation was valid, as the resulting properties were comparable with those reported in the literature. The time taken to evaluate certain properties via physical testing was thus eliminated, as the simulation could be done through computational methods.
Keywords: density functional theory, first-principles, Rietveld refinement, ZnO nanoparticles
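The Hubbard corrections quoted above (Ud = 8.3 eV, Up = 7.3 eV) enter the total energy through a +U term. The abstract does not state which formulation was applied; as a reference sketch only, a commonly used rotationally invariant (Dudarev-type) form is

$$E_{\mathrm{DFT}+U} = E_{\mathrm{DFT}} + \sum_{\sigma}\frac{U_{\mathrm{eff}}}{2}\left[\operatorname{Tr}\rho^{\sigma} - \operatorname{Tr}\!\left(\rho^{\sigma}\rho^{\sigma}\right)\right],$$

where $U_{\mathrm{eff}} = U - J$ and $\rho^{\sigma}$ is the on-site occupation matrix for spin $\sigma$; the correction penalizes fractional occupation of the corrected Zn d and O p states.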
Procedia PDF Downloads 309
8576 Examining the Design of a Scaled Audio Tactile Model for Enhancing Interpretation of Visually Impaired Visitors in Heritage Sites
Authors: A. Kavita Murugkar, B. Anurag Kashyap
Abstract:
With the Rights of Persons with Disabilities Act (RPWD Act) 2016, the Indian government has made it mandatory for all establishments, including heritage sites, to be accessible to people with disabilities. However, recent access audit surveys done under the Accessible India Campaign by the Ministry of Culture indicate that very few accessibility measures are provided in heritage sites for people with disabilities. Though there are some measures for the mobility impaired, the surveys revealed that there are almost no provisions for people with vision impairment (PwVI) in heritage sites, thus depriving them of reasonable physical and intellectual access that facilitates an enjoyable experience and enriching interpretation of the heritage site. There is a growing need to develop multisensory interpretative tools that can help PwVI perceive heritage sites in the absence of vision. The purpose of this research was to examine the usability of an audio-tactile model as a haptic and sound-based strategy for augmenting the perception and experience of PwVI in a heritage site. The first phase of the project was a multi-stage phenomenological experimental study with visually impaired users to investigate the design parameters for developing an audio-tactile model for PwVI. The findings from this phase included user preferences related to the physical design of the model, such as the size, scale, materials, details, etc., and the information that it will carry, such as braille, audio output, tactile text, etc. This was followed by a second phase in which a working prototype of an audio-tactile model was designed and developed for a heritage site based on the findings from the first phase of the study. A nationally listed heritage site from the authors' city was selected for making the model. The model was finally tested by visually impaired users for refinements and validation. The prototype developed empowers people with vision impairment to navigate independently in heritage sites. Such a model, if installed in every heritage site, can serve as a technological guide for the person with vision impairment, giving information on the architecture, details, planning and scale of the buildings, the entrances, and the location of important features, lifts, staircases, and available accessible facilities. The model was constructed using 3D modeling and digital printing technology. Though designed for the Indian context, this assistive technology for the blind can be explored for wider applications across the globe. Such an accessible solution can change the otherwise 'incomplete' perception of the disabled visitor, in this case a visually impaired visitor, and augment the quality of their experience in heritage sites.
Keywords: accessibility, architectural perception, audio tactile model, inclusive heritage, multi-sensory perception, visual impairment, visitor experience
Procedia PDF Downloads 106
8575 Sexual Health And Male Fertility: Improving Sperm Health With Focus On Technology
Authors: Diana Peninger
Abstract:
Over 10% of couples in the U.S. have infertility problems, with roughly 40% traceable to the male partner. Yet, little attention has been given to improving men's contribution to the conception process. One solution that is showing promise in increasing conception rates for IVF and other assisted reproductive technology treatments is a first-of-its-kind semen collection device that has been engineered to mitigate sperm damage caused by traditional collection methods. Patients are able to collect semen at home and deliver it to clinics within 48 hours for use in fertility analysis and treatment, with less stress and improved specimen viability. This abstract will share these findings along with expert insight and tips to help attendees understand the key role sperm collection plays in addressing and treating reproductive issues, while helping to improve patient outcomes and success. Our research aimed to determine whether male reproductive outcomes can be improved by improving sperm specimen health with a focus on technology. We utilized a redesigned semen collection cup (patented as the Device for Improved Semen Collection/DISC, U.S. Patent 6864046, known commercially as ProteX) that met a series of physiological parameters. Previous research demonstrated significant improvement in semen parameters (forward motility, progression, viability, and longevity) and overall sperm biochemistry when the DISC is used for collection. Animal studies have also shown dramatic increases in pregnancy rates. Our current study compares samples collected in the DISC, the next-generation DISC (DISCng), and a standard specimen cup (SSC), dry, with a measured amount of media (1 mL) and with media in excess (5 mL). Both human and animal testing will be included. With sperm counts declining at alarming rates due to environmental, lifestyle, and other health factors, accurate evaluations of sperm health are critical to understanding reproductive health and the origins and treatments of infertility. An increase in sperm health, as measured by extensive semen parameter analysis, was demonstrated, with semen parameters remaining stable for 48 hours, expanding the processing time from 1 hour to 48 hours.
Keywords: reproductive, sperm, male, infertility
Procedia PDF Downloads 129
8574 Control of a Quadcopter Using Genetic Algorithm Methods
Authors: Mostafa Mjahed
Abstract:
This paper concerns the control of a nonlinear system using two different methods, reference model and genetic algorithm. The quadcopter is a nonlinear unstable system and a part of aerial robots. It consists of four rotors placed at the ends of a cross, whose center is occupied by the control circuit. Its motions are governed by six degrees of freedom: three rotations around the three axes (roll, pitch, and yaw) and the three spatial translations. The control of such a system is complex because of the nonlinearity of its dynamic representation and the number of parameters it involves. Numerous studies have been developed to model and stabilize such systems. The classical PID and LQ correction methods are widely used. While these methods have the advantage of simplicity because they are linear, they have the drawback of requiring a linear model for synthesis. This also complicates the resulting control laws, because they must be extended over the whole flight envelope of the quadcopter. Note that, while classical design methods are widely used to control aeronautical systems, artificial intelligence methods such as the genetic algorithm technique receive little attention. In this paper, we compare two PID design methods. First, the parameters of the PID are calculated according to a reference model. In a second phase, these parameters are established using genetic algorithms. By reference model, we mean that the corrected system behaves according to a reference system imposed by some specifications: settling time, zero overshoot, etc. Inspired by Darwin's theory of natural evolution advocating the survival of the fittest, John Holland developed this evolutionary algorithm. The genetic algorithm (GA) possesses three basic operators: selection, crossover, and mutation. We start the iterations with an initial population, and each member of this population is evaluated through a fitness function. Our purpose is to correct the behavior of the quadcopter around the three axes (roll, pitch, and yaw) with three PD controllers. For the altitude, we adopt a PID controller.
Keywords: quadcopter, genetic algorithm, PID, fitness, model, control, nonlinear system
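As a minimal sketch of the GA loop described above, with its three operators (selection, crossover, mutation) driving PD gains toward a fitness optimum, the following uses a double-integrator stand-in for one attitude axis. The plant, gain bounds, population size, and ITAE-like fitness are illustrative assumptions, not the paper's actual quadcopter model.

```python
# GA-tuned PD sketch for one attitude axis. The double-integrator plant,
# gain ranges, and cost are illustrative stand-ins for the paper's model.
import numpy as np

rng = np.random.default_rng(1)
dt, steps = 0.01, 500

def step_response_cost(gains):
    """Unit-step response of a double integrator under PD control;
    returns an ITAE-like cost (time-weighted absolute error)."""
    kp, kd = gains
    x, v, cost = 0.0, 0.0, 0.0
    for k in range(steps):
        err = 1.0 - x
        u = kp * err - kd * v          # PD law on angle error and rate
        v += u * dt                    # double-integrator plant
        x += v * dt
        cost += (k * dt) * abs(err)
    return cost

def ga(pop_size=40, gens=60, bounds=(0.0, 20.0)):
    lo, hi = bounds
    pop = rng.uniform(lo, hi, size=(pop_size, 2))
    for _ in range(gens):
        fit = np.array([step_response_cost(ind) for ind in pop])
        parents = pop[np.argsort(fit)[: pop_size // 2]]   # selection
        kids = []
        while len(kids) < pop_size - len(parents):
            a, b = parents[rng.integers(len(parents), size=2)]
            alpha = rng.uniform()                          # blend crossover
            child = alpha * a + (1 - alpha) * b
            child += rng.normal(0.0, 0.5, size=2)          # mutation
            kids.append(np.clip(child, lo, hi))
        pop = np.vstack([parents, np.array(kids)])
    fit = np.array([step_response_cost(ind) for ind in pop])
    return pop[np.argmin(fit)]

kp, kd = ga()
print(f"GA-tuned gains: Kp={kp:.2f}, Kd={kd:.2f}")
```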
Procedia PDF Downloads 431
8573 Physics-Based Earthquake Source Models for Seismic Engineering: Analysis and Validation for Dip-Slip Faults
Authors: Percy Galvez, Anatoly Petukhin, Paul Somerville, Ken Miyakoshi, Kojiro Irikura, Daniel Peter
Abstract:
Physics-based dynamic rupture modelling is necessary for estimating parameters such as rupture velocity and slip rate function that are important for ground motion simulation but poorly resolved by observations, e.g. by seismic source inversion. In order to generate a large number of physically self-consistent rupture models, whose rupture process is consistent with the spatio-temporal heterogeneity of past earthquakes, we use multicycle simulations under the heterogeneous rate-and-state (RS) friction law for a 45° dip-slip fault. We performed a parametrization study by fully dynamic rupture modeling, and then a set of spontaneous source models was generated over a large magnitude range (Mw > 7.0). In order to validate the rupture models, we compare the scaling relations of the modeled rupture area S, the average slip Dave, and the slip asperity area Sa versus seismic moment Mo with similar scaling relations from source inversions. Ground motions were also computed from our models; their peak ground velocities (PGV) agree well with the GMPE values. We obtained good agreement of the permanent surface offset values with empirical relations. From the heterogeneous rupture models, we analyzed parameters which are critical for ground motion simulations, i.e. distributions of slip, slip rate, rupture initiation points, rupture velocities, and source time functions. We studied cross-correlations between them and with the friction weakening distance Dc, the only initial heterogeneity parameter in our modeling. The main findings are: (1) high slip-rate areas coincide with or are located on an outer edge of the large slip areas, (2) ruptures have a tendency to initiate in small Dc areas, and (3) high slip-rate areas correlate with areas of small Dc, large rupture velocity, and short rise time.
Keywords: earthquake dynamics, strong ground motion prediction, seismic engineering, source characterization
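For reference, the abstract does not state the exact RS formulation used; a commonly used Dieterich-Ruina rate-and-state law, written here with the aging law for the state variable $\theta$ as a sketch, is

$$\mu = \mu_0 + a\ln\frac{V}{V_0} + b\ln\frac{V_0\,\theta}{D_c}, \qquad \frac{d\theta}{dt} = 1 - \frac{V\theta}{D_c},$$

where $V$ is slip rate, $\mu_0$ and $V_0$ are reference values, $a$ and $b$ control the direct and evolution effects, and $D_c$ is the characteristic weakening distance, the heterogeneous parameter analyzed above.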
Procedia PDF Downloads 144
8572 Experimental and Modal Determination of the State-Space Model Parameters of a Uni-Axial Shaker System for Virtual Vibration Testing
Authors: Jonathan Martino, Kristof Harri
Abstract:
In some cases, the increase in computing resources makes simulation methods more affordable. The increase in processing speed also allows real-time analysis, or even faster test analysis, offering a real tool for test prediction and design process optimization. Vibration tests are no exception to this trend. So-called 'virtual vibration testing' offers solutions, among others, to study the influence of specific loads, to better anticipate the boundary conditions between the exciter and the structure under test, and to study the influence of small changes in the structure under test. This article will first present virtual vibration test modeling, with a main focus on the shaker model, and will afterwards present the experimental determination of the model parameters. The classical way of modeling a shaker is to consider the shaker as a simple mechanical structure augmented by an electrical circuit that makes the shaker move. The shaker is modeled as a two- or three-degree-of-freedom lumped-parameter model, while the electrical circuit takes the coil impedance and the dynamic back-electromotive force into account. The establishment of the equations of this model, describing the dynamics of the shaker, is presented in this article and is strongly related to the internal physical quantities of the shaker. Those quantities will be reduced to global parameters which will be estimated through experiments. Different experiments will be carried out in order to design an easy and practical method for the identification of the shaker parameters, leading to a fully functional shaker model. An experimental modal analysis will also be carried out to extract the modal parameters of the shaker and to combine them with the electrical measurements. Finally, this article will conclude with an experimental validation of the model.
Keywords: lumped parameters model, shaker modeling, shaker parameters, state-space, virtual vibration
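As a sketch of the structure such a model takes (the article derives the full two- or three-DOF equations; the single-DOF version below is a simplified reference, not the paper's model), an electrodynamic shaker with force constant Bl, coil resistance R and inductance L, and moving mass m on a suspension (c, k) obeys

$$m\ddot{x} + c\dot{x} + kx = Bl\,i, \qquad L\frac{di}{dt} + Ri + Bl\,\dot{x} = u(t),$$

which in state-space form, with state $[x\;\dot{x}\;i]^{\mathsf T}$ and input voltage $u$, reads

$$\frac{d}{dt}\begin{bmatrix} x\\ \dot{x}\\ i \end{bmatrix} = \begin{bmatrix} 0 & 1 & 0\\ -k/m & -c/m & Bl/m\\ 0 & -Bl/L & -R/L \end{bmatrix}\begin{bmatrix} x\\ \dot{x}\\ i \end{bmatrix} + \begin{bmatrix} 0\\ 0\\ 1/L \end{bmatrix} u(t).$$

The term $Bl\,\dot{x}$ in the electrical equation is the back-EMF that couples the mechanical motion back into the circuit.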
Procedia PDF Downloads 270
8571 The Future of Insurance: P2P Innovation versus Traditional Business Model
Authors: Ivan Sosa Gomez
Abstract:
Digitalization has impacted the entire insurance value chain, and the growing movement towards P2P platforms and the collaborative economy is also beginning to have a significant impact. P2P insurance is defined as an innovation enabling policyholders to pool their capital, self-organize, and self-manage their own insurance. In this context, new InsurTech start-ups are emerging as peer-to-peer (P2P) providers, based on a model that differs from traditional insurance. As a result, although P2P platforms do not change the fundamental basis of insurance, they do enable potentially more efficient business models to be established in terms of ensuring the coverage of risk. It is therefore relevant to determine whether P2P innovation can have substantial effects on the future of the insurance sector. For this purpose, it is considered necessary to develop P2P innovation from a business perspective, as well as to build a comparison between a traditional model and a P2P model from an actuarial perspective. Objectives: The objectives are (1) to represent P2P innovation in the business model compared to the traditional insurance model and (2) to establish a comparison between a traditional model and a P2P model from an actuarial perspective. Methodology: The research design is defined as action research, in the sense of understanding and solving the problems of a collectivity linked to an environment, applying theory and best practices. For this purpose, the study is carried out through the participatory variant, which involves the collaboration of the participants, given that in this design participants are considered experts. Prolonged immersion in the field is carried out as the main instrument for data collection. Finally, an actuarial model for the calculation of premiums is developed that allows projections of future scenarios to be established and conclusions to be drawn between the two models. Main Contributions: From an actuarial and business perspective, we aim to contribute by developing a comparison of the two models in the coverage of risk, in order to determine whether P2P innovation can have substantial effects on the future of the insurance sector.
Keywords: InsurTech, innovation, business model, P2P, insurance
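The abstract does not specify the premium principle used; as a reference sketch only, a standard expected-value principle under the collective risk model, against which a P2P pooling variant could be compared, is

$$P = (1+\theta)\,\mathbb{E}[S], \qquad \mathbb{E}[S] = \mathbb{E}[N]\,\mathbb{E}[X],$$

where $S$ is the aggregate claims, $N$ the claim count, $X$ the claim severity, and $\theta$ a safety/expense loading. A P2P design would typically aim to reduce $\theta$ and redistribute part of the surplus $P - S$ to the pool when claims are low.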
Procedia PDF Downloads 92
8570 The Design of a Computer Simulator to Emulate Pathology Laboratories: A Model for Optimising Clinical Workflows
Authors: M. Patterson, R. Bond, K. Cowan, M. Mulvenna, C. Reid, F. McMahon, P. McGowan, H. Cormican
Abstract:
This paper outlines the design of a simulator to allow for the optimisation of clinical workflows through a pathology laboratory and to improve the laboratory's efficiency in the processing, testing, and analysis of specimens. Often pathologists have difficulty in pinpointing and anticipating issues in the clinical workflow until tests are running late or in error. It can be difficult to pinpoint the cause and even more difficult to predict any issues which may arise. For example, they often have no indication of how many samples are going to be delivered to the laboratory that day or at a given hour. If we could model scenarios using past information and known variables, it would be possible for pathology laboratories to initiate resource preparations, e.g. the printing of specimen labels, or to activate a sufficient number of technicians. This would expedite the clinical workload and clinical processes and improve the overall efficiency of the laboratory. The simulator design visualises the workflow of the laboratory, i.e. the clinical tests being ordered, the specimens arriving, current tests being performed, results being validated, and reports being issued. The simulator depicts the movement of specimens through this process, as well as the number of specimens at each stage. This movement is visualised using an animated flow diagram that is updated in real time. A traffic-light colour-coding system will be used to indicate the level of flow through each stage (green for normal flow, orange for slow flow, and red for critical flow). This would allow pathologists to clearly see where there are issues and bottlenecks in the process. Graphs would also be used to indicate the status of specimens at each stage of the process. For example, a graph could show the percentage of specimen tests that are on time, potentially late, running late, and in error. Clicking on potentially late samples will display more detailed information about those samples, the tests that still need to be performed on them, and their urgency level. This would allow any issues to be resolved quickly. In the case of potentially late samples, this could help to ensure that critically needed results are delivered on time. The simulator will be created as a single-page web application. Various web technologies will be used to create the flow diagram showing the workflow of the laboratory. JavaScript will be used to program the logic, animate the movement of samples through each of the stages, and generate the status graphs in real time. This live information will be extracted from an Oracle database. As well as being used in a real laboratory situation, the simulator could also be used for training purposes. 'Bots' would be used to control the flow of specimens through each step of the process. Like existing software-agent technology, these bots would be configurable in order to simulate different situations which may arise in a laboratory, such as an emerging epidemic. The bots could then be turned on and off to allow trainees to complete the tasks required at that step of the process, for example validating test results.
Keywords: laboratory process, optimization, pathology, computer simulation, workflow
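The traffic-light coding lends itself to a very small decision rule. The sketch below (in Python for brevity, though the authors plan a JavaScript implementation) shows one plausible mapping from stage load to the green/orange/red states; the utilisation metric and thresholds are illustrative assumptions, not the paper's specification.

```python
# Minimal sketch of the traffic-light status logic described above.
# The utilisation metric and the 0.7/0.9 thresholds are assumptions.
def stage_status(current_load: int, capacity: int) -> str:
    """Map a stage's specimen load to the green/orange/red coding."""
    utilisation = current_load / capacity
    if utilisation < 0.7:
        return "green"    # normal flow
    if utilisation < 0.9:
        return "orange"   # slow flow
    return "red"          # critical flow

# Example: 46 specimens queued at a validation stage sized for 50
print(stage_status(46, 50))  # -> "red"
```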
Procedia PDF Downloads 286
8569 Comparison of Yb and Tm-Fiber Laser Cutting Processes of Fiber Reinforced Plastics
Authors: Oktay Celenk, Ugur Karanfil, Iskender Demir, Samir Lamrini, Jorg Neumann, Arif Demir
Abstract:
Due to their favourable material characteristics, fiber-reinforced plastics are amongst the main topics of all current lightweight construction megatrends. Especially in transportation, in trends ranging from aeronautics over the automotive industry to naval transportation (yachts, cruise liners), the expected economic and environmental impact is huge. In naval transportation, components like yacht bodies, antenna masts, and decorative structures like deck lamps, lighthouses, and pool areas represent cheap and robust solutions. Commercially available laser tools like carbon dioxide (CO₂) gas lasers, frequency-tripled solid-state UV lasers, and Neodymium-YAG (Nd:YAG) lasers can be used. These tools have emission wavelengths of 10 µm, 0.355 µm, and 1.064 µm, respectively. The scientific goal is first of all the generation of a parameter matrix for laser processing of each material used, for a Tm-fiber laser system (wavelength 2 µm). These parameters are the heat-affected zone, process gas pressure, workpiece feed velocity, intensity, irradiation time, etc. The results are compared with results obtained with well-known material-processing lasers, such as Yb-fiber lasers (wavelength 1 µm). Compared to the CO₂ laser, the Tm-laser offers essential advantages for future laser processes like cutting, welding, ablating for repair, and drilling in composite part manufacturing (components of cruise liners, marine pipelines). Some of these are the possibility of beam delivery in a standard fused-silica fiber, which enables hand-guided processing; eye safety, which results from the wavelength; and excellent beam quality and brilliance due to the fiber nature. One more feature that is economically important for boat, automotive, and military manufacturing projects is that the wavelength of 2 µm is highly absorbed by the plastic matrix and thus enables its selective removal for repair procedures.
Keywords: Thulium (Tm) fiber laser, laser processing of fiber-reinforced plastics (FRP), composite, heat affected zone
Procedia PDF Downloads 193
8568 Impact of Heat Moisture Treatment on the Yield of Resistant Starch and Evaluation of Functional Properties of Modified Mung Bean (Vigna radiata) Starch
Authors: Sreejani Barua, P. P. Srivastav
Abstract:
The formulation of new functional food products for diabetes patients and obese people remains a challenge for food industries. Starch is a naturally occurring, ecological, inexpensive, and abundantly available polysaccharide in plant material. In the present scenario, there is great interest in modifying the functional properties of starch without destroying its granular structure, using different modification techniques. Resistant starch (RS) contains almost zero calories and can control blood glucose levels to prevent diabetes. The current study focused on the modification of mung bean starch, a good source of legume carbohydrate, for the production of functional food. Heat moisture treatment (HMT) of mung starch was conducted at a moisture content of 10-30%, a temperature of 80-120 °C, and a time of 8-24 h. The content of resistant starch after modification was significantly increased from the native starch, which contained 7.6% RS. The design combinations for HMT were completed through a central composite rotatable design (CCRD). The effects of the HMT process variables on the yield of resistant starch were studied through response surface methodology (RSM). The highest increase in resistant starch, up to 34.39%, was found when the native starch was treated at 30% moisture content and 120 °C for 24 h. The functional properties of both native and modified mung bean starches showed a reduction in the swelling power and swelling volume of the HMT starches. However, the solubility of the HMT starches was higher than that of the untreated native starch, and changes were also observed in structural (scanning electron microscopy), X-ray diffraction (XRD), blue value, and thermal (differential scanning calorimetry) properties. Therefore, replacing native mung bean starch with heat-moisture-treated mung bean starch leads to the development of new products with higher resistant starch levels and improved functional properties.
Keywords: mung bean starch, heat moisture treatment, functional properties, resistant starch
Procedia PDF Downloads 202
8567 Optimization of Ultrasound Assisted Extraction of Polysaccharides from Plant Waste Materials: Selected Model Material is Hazelnut Skin
Abstract:
In this study, the optimization of ultrasound-assisted extraction (UAE) of hemicellulose-based polysaccharides from plant waste material is investigated; the selected material is hazelnut skin. The extraction variables are extraction time, amplitude, and application temperature. Optimum conditions were evaluated based on responses such as the amount of wet crude polysaccharide, total carbohydrate content, and dried sample. Pretreated hazelnut skin powders were used for the experiments. Samples of 10 grams were suspended in 100 ml of water in a jacketed vessel with additional magnetic stirring, and the mixture was sonicated with an immersed ultrasonic probe processor. After the extraction procedures, the ethanol-soluble and insoluble fractions were separated for further examination. The experimental data were analyzed by analysis of variance (ANOVA). Second-order polynomial models were developed using multiple regression analysis, and the individual and interactive effects of the applied variables were evaluated by a Box-Behnken design. The models developed from the experimental design were predictive and fitted the experimental data well, with a high correlation coefficient (R² more than 0.95). The polysaccharides extracted from hazelnut skin are assumed to be pectic polysaccharides, according to a literature survey of Fourier transform infrared (FTIR) analysis results. No further change was observed between spectra for different sonication times. Application of UAE at the optimized conditions has an important effect on the extraction of hemicellulose from plant material by providing partial hydrolysis to break the bonds with other components in the plant cell wall. This effect can be attributed to the varied intensity of microjets and microstreaming under varied sonication conditions.
Keywords: hazelnut skin, optimization, polysaccharide, ultrasound assisted extraction
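The second-order polynomial response-surface model described here can be fitted by ordinary least squares. The sketch below builds the quadratic design matrix for three coded factors over Box-Behnken runs; the factor levels and response values are placeholders, not the study's data.

```python
# Minimal sketch of fitting the second-order (quadratic) response-surface
# model described above by multiple regression. Data are placeholders.
import numpy as np

def quadratic_design_matrix(X):
    """Columns: intercept, linear, squared, and pairwise interaction terms."""
    x1, x2, x3 = X.T
    return np.column_stack([
        np.ones(len(X)), x1, x2, x3,
        x1**2, x2**2, x3**2,
        x1*x2, x1*x3, x2*x3,
    ])

# Coded Box-Behnken runs for (time, amplitude, temperature) plus centre points
X = np.array([[-1,-1,0],[1,-1,0],[-1,1,0],[1,1,0],[0,0,0],
              [-1,0,-1],[1,0,-1],[-1,0,1],[1,0,1],[0,0,0],
              [0,-1,-1],[0,1,-1],[0,-1,1],[0,1,1],[0,0,0]], dtype=float)
y = np.array([4.1,5.0,4.6,5.9,5.5,3.8,4.9,4.4,5.7,5.4,4.0,4.7,4.5,5.6,5.5])

A = quadratic_design_matrix(X)
beta, *_ = np.linalg.lstsq(A, y, rcond=None)   # regression coefficients
y_hat = A @ beta
r2 = 1 - np.sum((y - y_hat)**2) / np.sum((y - y.mean())**2)
print("R^2 =", round(r2, 3))
```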
Procedia PDF Downloads 331
8566 Analysis of Friction Stir Welding Process for Joining Aluminum Alloy
Authors: A. M. Khourshid, I. Sabry
Abstract:
Friction stir welding (FSW), a solid-state joining technique, is widely used for joining Al alloys for aerospace, marine, automotive, and many other applications of commercial importance. FSW was carried out using a vertical milling machine on Al 5083 alloy pipe. These pipe sections are relatively small in diameter, 5 mm, and relatively thin-walled, 2 mm. In this study, 5083 aluminum alloy pipes were welded as similar-alloy joints using the FSW process in order to investigate mechanical and microstructural properties, at a rotation speed of 1400 rpm and weld speeds of 10, 40, and 70 mm/min. In order to investigate the effect of welding speed on mechanical properties, metallographic and mechanical tests were carried out on the welded areas, including Vickers hardness profiles and tensile tests of the joints. As a metallurgical feasibility study of friction stir welding for joining Al 6061 aluminum alloy, welding was performed on pipes with different thicknesses of 2, 3, and 4 mm, at five rotational speeds (485, 710, 910, 1120, and 1400 rpm) and traverse speeds of 4, 8, and 10 mm/min. This work applies two methods, artificial neural networks (using the Pythia software) and response surface methodology (RSM), to predict the tensile strength, percentage elongation, and hardness of friction-stir-welded 6061 aluminum alloy. An artificial neural network (ANN) model was developed for the analysis of the friction stir welding parameters of 6061 pipe. The tensile strength, percentage elongation, and hardness of the weld joints were predicted as functions of tool rotation speed, material thickness, and travel speed. A comparison was made between measured and predicted data. An RSM model was also developed, and the values obtained for the responses (tensile strength, percentage elongation, and hardness) were compared with measured values. The effect of the FSW process parameters on the mechanical properties of 6061 aluminum alloy has been analyzed in detail.
Keywords: friction stir welding (FSW), Al alloys, mechanical properties, microstructure
Procedia PDF Downloads 462
8565 A 0-1 Goal Programming Approach to Optimize the Layout of Hospital Units: A Case Study in an Emergency Department in Seoul
Authors: Farhood Rismanchian, Seong Hyeon Park, Young Hoon Lee
Abstract:
This paper proposes a method to optimize the layout of an emergency department (ED) based on real executions of care processes, considering several planning objectives simultaneously. Recently, demand for healthcare services has increased dramatically. As the demand for healthcare services increases, so does the need for new healthcare buildings as well as the need to redesign and renovate existing ones. The importance of implementing a standard set of engineering facilities planning and design techniques has already been proved in both the manufacturing and service industries, with many significant functional efficiencies. However, the high complexity of care processes remains a major challenge to applying these methods in healthcare environments. Process mining techniques are applied in this study to tackle the problem of complexity and to enhance care process analysis. Process-related information, such as clinical pathways, was extracted from the information system of an ED. A 0-1 goal programming approach is then proposed to find a single layout that simultaneously satisfies several goals. The proposed model was solved by the optimization software CPLEX 12. The solution reached using the proposed method shows a 42.2% improvement in the walking distance of normal patients and a 47.6% improvement in the walking distance of critical patients, at minimum relocation cost. It has been observed that many patients must unnecessarily walk long distances during their visit to the emergency department because of an inefficient design. A carefully designed layout can significantly decrease patient walking distance and related complications.
Keywords: healthcare operation management, goal programming, facility layout problem, process mining, clinical processes
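A 0-1 goal program of the kind described can be sketched in a few lines. The toy instance below (using the open-source PuLP modeller rather than CPLEX) assigns two hypothetical units to two candidate locations while penalising deviation above a walking-distance goal; the units, distances, and goal value are invented for illustration and are not the case-study data.

```python
# Toy 0-1 goal-programming sketch: binary assignment of units to locations
# with a soft walking-distance goal. All numbers are illustrative.
import pulp

units, locs = ["triage", "imaging"], ["A", "B"]
dist = {("triage", "A"): 30, ("triage", "B"): 55,
        ("imaging", "A"): 60, ("imaging", "B"): 25}   # metres walked per visit
goal = 60  # target total walking distance

x = pulp.LpVariable.dicts("x", (units, locs), cat="Binary")
d_plus = pulp.LpVariable("over_goal", lowBound=0)     # deviation above goal

prob = pulp.LpProblem("ed_layout", pulp.LpMinimize)
prob += d_plus                                        # minimise overshoot

# each unit gets exactly one location; each location holds at most one unit
for u in units:
    prob += pulp.lpSum(x[u][l] for l in locs) == 1
for l in locs:
    prob += pulp.lpSum(x[u][l] for u in units) <= 1

total = pulp.lpSum(dist[u, l] * x[u][l] for u in units for l in locs)
prob += total - d_plus <= goal                        # soft goal constraint

prob.solve(pulp.PULP_CBC_CMD(msg=False))
for u in units:
    for l in locs:
        if x[u][l].value() == 1:
            print(u, "->", l)
```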
Procedia PDF Downloads 295
8564 Effect of Volute Tongue Shape and Position on Performance of Turbo Machinery Compressor
Authors: Anuj Srivastava, Kuldeep Kumar
Abstract:
This paper proposes a numerical study of volute tongue design, which affects the centrifugal compressor operating range and pressure recovery. Increased efficiency has traditionally been the focus of compressor design. However, increased operating range has become important in an age of ever-increasing productivity and energy costs in the turbomachinery industry. Efficiency and overall operating range are the two most important parameters studied to evaluate the aerodynamic performance of a centrifugal compressor, and the volute is one of the components that have a significant effect on both. The choice of volute tongue geometry plays a major role in compressor performance and also affects the performance map. The authors evaluate the trade-offs of using a pull-back tongue geometry on centrifugal compressor performance. In the present paper, three different tongue positions and shapes are discussed. These designs are compared in terms of pressure recovery coefficient, pressure loss coefficient, and stable operating range. The detailed flow structures for the various volute geometries and pull-back angles near the tongue are studied extensively to explore the fluid behavior. The viscous Navier-Stokes equations are used to simulate the flow inside the volute, and the numerical calculations are compared with one-dimensional thermodynamic calculations. The authors conclude that the increase in compression ratio is accompanied by a more uniform pressure distribution for the modified tongue shape and location: a uniform static pressure around the circumference builds a more uniform flow in the impeller and diffuser. In addition, the blockage at the tongue of the volute was causing circumferentially non-uniform pressure along the volute; this non-uniformity may lead the impeller and diffuser to operate unstably. However, it is not the volute that directly controls the stall.
Keywords: centrifugal compressor volute, tongue geometry, pull-back, compressor performance, flow instability
Procedia PDF Downloads 163
8563 Digital Twins: Towards an Overarching Framework for the Built Environment
Authors: Astrid Bagireanu, Julio Bros-Williamson, Mila Duncheva, John Currie
Abstract:
Digital Twins (DTs) have entered the built environment from more established industries like aviation and manufacturing, although there has never been a common goal for utilising DTs at scale. Defined as the cyber-physical integration of data between an asset and its virtual counterpart, the DT has been identified in the literature from an operational standpoint, in addition to monitoring the performance of a built asset. However, this has never been translated into how DTs should be implemented in a project and what responsibilities each project stakeholder holds in the realisation of a DT. What is needed is an approach to translate these requirements into actionable DT dimensions. This paper presents a foundation for an overarching framework specific to the built environment. For the purposes of this research, the widely used UK Royal Institute of British Architects (RIBA) Plan of Work 2020 is used as a basis for itemising project stages. The RIBA Plan of Work consists of eight stages designed to inform the definition, briefing, design, coordination, construction, handover, and use of a built asset. Similar project stages are utilised in other countries; therefore, the recommendations from the interviews presented in this paper are applicable internationally. At the same time, there is not a single mainstream software resource that leverages DT abilities. This ambiguity meets an unparalleled ambition from governments and industries worldwide to achieve a national grid of interconnected DTs. For the construction industry to access these benefits, a defined starting point is necessary. This research aims to provide a comprehensive understanding of the potential applications and ramifications of DTs in the context of the built environment. This paper is an integral part of a larger research effort aimed at developing a conceptual framework for the Architecture, Engineering, and Construction (AEC) sector following a conventional project timeline. Therefore, this paper plays a pivotal role in providing practical insights and a tangible foundation for developing a stage-by-stage approach to assimilating the potential of DTs within the built environment. First, the research focuses on a review of relevant literature, albeit acknowledging the inherent constraint of limited sources available. Second, a qualitative study compiling the views of 14 DT experts is presented, concluding with an inductive analysis of the interview findings, ultimately highlighting the barriers and strengths of DTs in the context of framework development. As parallel developments aim to progress net-zero-centred design and improve project efficiencies across the built environment, the limited resources available to support DTs should be leveraged to propel the industry into its digitalisation era, in which AEC stakeholders have a fundamental role in understanding this from the earliest stages of a project.
Keywords: digital twins, decision-making, design, net-zero, built environment
Procedia PDF Downloads 123
8562 Teachers’ Protective Factors of Resilience Scale: Factorial Structure, Validity and Reliability Issues
Authors: Athena Daniilidou, Maria Platsidou
Abstract:
Recently developed scales have specifically addressed teachers’ resilience. Although they benefited the field, they do not include some of the critical protective factors of teachers’ resilience identified in the literature. To address this limitation, we aimed at designing a more comprehensive scale for measuring teachers’ resilience, which encompasses various personal and environmental protective factors. To this end, two studies were carried out. In Study 1, 407 primary school teachers were tested with the new scale, the Teachers’ Protective Factors of Resilience Scale (TPFRS). Similar scales, such as the Multidimensional Teachers’ Resilience Scale and the Teachers’ Resilience Scale, were used to test the convergent validity, while the Maslach Burnout Inventory and the Teachers’ Sense of Efficacy Scale were used to assess the discriminant validity of the new scale. The factorial structure of the TPFRS was checked with confirmatory factor analysis, and a good fit of the model to the data was found. Next, an item response theory analysis using a two-parameter logistic model (2PL) was applied to check the items within each factor; it revealed that 9 items did not fit their corresponding factors well, and they were removed. The final version of the TPFRS includes 29 items, which assess six protective factors of teachers’ resilience: values and beliefs (5 items, α=.88), emotional and behavioral adequacy (6 items, α=.74), physical well-being (3 items, α=.68), relationships within the school environment (6 items, α=.73), relationships outside the school environment (5 items, α=.84), and the legislative framework of education (4 items, α=.83). The results show that it presents satisfactory convergent and discriminant validity. Study 2, in which 964 primary and secondary school teachers were tested, confirmed the factorial structure of the TPFRS as well as its discriminant validity, which was tested with the Schutte Emotional Intelligence Scale-Short Form. In conclusion, our results confirmed that the TPFRS is a valid multi-dimensional instrument for assessing teachers’ protective factors of resilience, and it can be safely used in future research and interventions in the teaching profession.
Keywords: resilience, protective factors, teachers, item response theory
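For reference, the two-parameter logistic (2PL) model used in this item analysis gives the probability that respondent $j$ endorses item $i$ as

$$P(X_{ij}=1\mid\theta_j) = \frac{1}{1+e^{-a_i(\theta_j-b_i)}},$$

where $\theta_j$ is the latent trait level (here, a resilience protective factor), $a_i$ the item discrimination, and $b_i$ the item difficulty; items showing poor fit under this model were the ones removed.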
Procedia PDF Downloads 99
8561 Effectiveness of Breathing Training Program on Quality of Life and Depression Among Hemodialysis Patients: Quasi‐Experimental Study
Authors: Hayfa Almutary, Noof Eid Al Shammari
Abstract:
Aim: The management of depression in patients undergoing hemodialysis remains challenging. The aim of this study was to evaluate the effectiveness of a breathing training program on quality of life and depression among patients on hemodialysis. Design: A one-group pretest-posttest quasi-experimental design was used. Methods: Data were collected from hemodialysis units at three dialysis centers. Initial baseline data were collected, and a breathing training program was implemented. The breathing training program included three types of breathing exercises. The impact of the intervention on outcomes was measured using both the Kidney Disease Quality of Life Short Version and the Beck Depression Inventory-Second Edition with the same participants. The participants were asked to perform the breathing training program three times a day for 30 days. Results: The mean age of the patients was 52.1 years (SD: 15.0), with nearly two-thirds of them being male (63.4%). Participants who had been undergoing hemodialysis for 1–4 years constituted the largest group in the sample (46.3%), and 17.1% of participants had visited a psychiatric clinic 1-3 times. The results show that the breathing training program improved overall quality of life and reduced symptoms and problems. In addition, a significant decrease in the overall depression score was observed after implementing the intervention. Conclusions: The breathing training program is a non-pharmacological intervention with demonstrated effectiveness in hemodialysis. This study showed that using breathing exercises reduced depression levels and improved quality of life. Integrating this intervention into dialysis units to manage psychological issues offers a simple, safe, easy, and inexpensive option. Future research should compare the effectiveness of various breathing exercises in hemodialysis patients using longitudinal studies. Impact: As a safety precaution, nurses should initially use non-pharmacological interventions, such as a breathing training program, to treat depression in those undergoing hemodialysis.
Keywords: breathing training program, depression, exercise, quality of life, hemodialysis
Procedia PDF Downloads 86
8560 Review on the Role of Sustainability Techniques in Development of Green Building
Authors: Ubaid Ur Rahman, Waqar Younas, Sooraj Kumar Chhabira
Abstract:
Environmentally sustainable building construction has experienced significant growth during the past 10 years at the international level. This paper shows that the conceptual framework adopts sustainability techniques in construction to develop environmentally friendly buildings, called green buildings. Waste occurring during the different construction phases causes environmental problems: the deposition of waste on the ground surface creates major problems such as bad smells, gives rise to different health diseases, and produces toxic waste agents that are specifically responsible for making soil infertile. Old recycled building material can be used in the construction of new buildings. Sustainable construction is economical and saves energy resources. Sustainable construction is the major responsibility of the designer and the project manager. The designer has to fulfil the client's demands while keeping the design environmentally friendly, and the project manager has to deliver and execute sustainable construction according to the sustainable design. Steel is a highly appropriate sustainable construction material: it is more durable and easily recyclable, occupies less area, and has greater tensile and compressive strength than concrete, making it a better option for sustainable construction compared to other building materials. New technologies like green roofs have made the environment more pleasant and have reduced construction costs, minimizing economic, social, and environmental issues. This paper presents an overview of research related to material use in green building, and based on this research, recommendations are made which can be followed in the construction industry. The paper includes a detailed analysis of construction materials. By making suitable adjustments to project management practices, it is shown that a green building improves the cost efficiency of the project, makes it environmentally friendly, and also meets future generations' demands.
Keywords: sustainable construction, green building, recycled waste material, environment
Procedia PDF Downloads 245
8559 Building Information Management Advantages, Adaptation, and Challenges of Implementation in Kabul Metropolitan Area
Authors: Mohammad Rahim Rahimi, Yuji Hoshino
Abstract:
Building Information Management (BIM) has in recent years received widespread consideration in Architecture, Engineering and Construction (AEC). BIM has been bringing innovation to the AEC industry and has the ability to improve construction with higher quality and reductions in project time and budget. BIM supports models and processes in the AEC industry; these processes include, but are not limited to, the project life cycle, estimating, delivery, and project management in general. This research examines the advantages of BIM and the adaptation and challenges of its implementation in the Kabul region. The Capital Region Independent Development Authority (CRIDA) is responsible for implementing development projects in the Kabul region. The study considered the advantages of and reasons for BIM performance in Afghanistan on the basis of an online survey and data. In addition, five projects were studied; they were selected because of their many design revisions and changes. Most of the projects had problems in the design and implementation stages, so the canal project was discussed in detail, the main reasons for its problems being repeated changes and revisions due to lack of information, planning, and management. Two projects based on BIM utilization in Japan, the Shinsuizenji Station and Oita River dam projects, were also discussed; these have been implemented, or are being implemented, in accordance with BIM requirements. The investigation focused on BIM usage and the project implementation process. Eventually, the CRIDA projects were compared with BIM utilization in Japan, focusing on the use of the model and the way problems are solved based on BIM. In conclusion, BIM has the capacity to prevent repeated design changes and revisions. Achieving those objectives requires a focus on data management and sharing, BIM training, and the use of new technology.
Keywords: construction information management, implementation and adaptation of BIM, project management, developing countries
Procedia PDF Downloads 129
8558 Candida antarctica Lipase Assisted Enrichment of n-3 PUFA in Indian Sardine Oil
Authors: Prasanna Belur, P. R. Ashwini, Sampath Charanyaa, I. Regupathi
Abstract:
The Indian oil sardine (Sardinella longiceps) is one of the richest and cheapest sources of n-3 polyunsaturated fatty acids (n-3 PUFA) such as eicosapentaenoic acid (EPA) and docosahexaenoic acid (DHA). The health benefits conferred by n-3 PUFA upon consumption, in the prevention and treatment of coronary, neuromuscular, and immunological disorders and allergic conditions, are well documented. Natural refined Indian sardine oil generally contains about 25% (w/w) n-3 PUFA along with various unsaturated and saturated fatty acids in the form of mono-, di-, and triglycerides. A high concentration of n-3 PUFA in the glyceride form is most desirable for human consumption in order to obtain maximum health benefits. Thus, enhancing the n-3 PUFA content while retaining it in the glyceride form, using green technology, is the need of the hour. In this study, refined Indian sardine oil was subjected to selective hydrolysis by Candida antarctica lipase to enhance the n-3 PUFA content. The degree of hydrolysis and the enhancement of n-3 PUFA content were estimated by determining the acid value, iodine value, and EPA and DHA content (by gas chromatographic methods after derivatization) before and after hydrolysis. Reaction parameters such as pH, temperature, enzyme load, lipid-to-aqueous-phase volume ratio, and incubation time were optimized by conducting trials with a one-parameter-at-a-time approach. Incubating the enzyme solution with refined sardine oil at a volume ratio of 1:1, at pH 7.0, for 60 minutes at 50 °C, with an enzyme load of 60 mg/ml, was found to be optimum. After the enzymatic treatment, the oil was refined to remove free fatty acids and moisture using previously optimized refining technology. Enzymatic treatment at the optimal conditions resulted in a 12.11% enhancement in the degree of hydrolysis. The iodine number increased by 9.7%, and the n-3 PUFA content was enhanced by 112% (w/w). Selective enhancement of n-3 PUFA glycerides, eliminating saturated and unsaturated fatty acids from the oil using an enzyme, is an interesting proposition, as this technique is environment-friendly and cost-effective and provides a natural source of n-3 PUFA-rich oil.
Keywords: Candida antarctica, lipase, n-3 polyunsaturated fatty acids, sardine oil
Procedia PDF Downloads 233
8557 Design of a Standard Weather Data Acquisition Device for the Federal University of Technology, Akure Nigeria
Authors: Isaac Kayode Ogunlade
Abstract:
Data acquisition (DAQ) is the process by which physical phenomena from the real world are transformed into electrical signals that are measured and converted into a digital format for processing, analysis, and storage by a computer. The DAQ was designed using a PIC18F4550 microcontroller communicating with a personal computer (PC) through USB (Universal Serial Bus). The research combined knowledge of data acquisition systems and embedded systems to develop a weather data acquisition device, using an LM35 sensor to measure weather parameters, and used artificial intelligence (an artificial neural network, ANN) and a statistical approach (autoregressive integrated moving average, ARIMA) to predict precipitation (rainfall). The device was placed beside a standard device in the Department of Meteorology, Federal University of Technology, Akure (FUTA) for performance evaluation. Both devices (standard and designed) were operated for 180 days under the same atmospheric conditions for data mining (temperature, relative humidity, and pressure). The acquired data were used to train ANN and ARIMA models in the MATLAB R2012b environment to predict precipitation (rainfall). Root mean square error (RMSE), mean absolute error (MAE), the coefficient of determination (R²), and mean percentage error (MPE) were deployed as standardized evaluation metrics to assess the performance of the models in predicting precipitation. The results from the developed device show that it has an efficiency of 96% and is compatible with personal computers (PCs) and laptops. The simulation results for the acquired data show that the ANN model's precipitation (rainfall) prediction for two months (May and June 2017) revealed a disparity error of 1.59%, while ARIMA's was 2.63%. The device will be useful in research, practical laboratories, and industrial environments.
Keywords: data acquisition system, design device, weather development, predict precipitation and (FUTA) standard device
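The four evaluation metrics named above are straightforward to compute; a minimal sketch follows, with placeholder rainfall values standing in for the observed and predicted series from the study.

```python
# Minimal sketch of the four evaluation metrics named above (RMSE, MAE,
# R^2, MPE). The observed/predicted rainfall arrays are placeholders.
import numpy as np

def evaluate(obs, pred):
    obs, pred = np.asarray(obs, float), np.asarray(pred, float)
    err = obs - pred
    rmse = np.sqrt(np.mean(err**2))                      # root mean square error
    mae = np.mean(np.abs(err))                           # mean absolute error
    r2 = 1 - np.sum(err**2) / np.sum((obs - obs.mean())**2)
    mpe = np.mean(err / obs) * 100                       # mean percentage error
    return rmse, mae, r2, mpe

obs = [12.0, 30.5, 8.2, 45.1, 20.3]    # placeholder rainfall (mm)
pred = [11.4, 32.0, 9.0, 43.2, 21.1]
print("RMSE=%.2f MAE=%.2f R2=%.3f MPE=%.2f%%" % evaluate(obs, pred))
```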
Procedia PDF Downloads 92
8556 Molecular Design and Synthesis of Heterocycles Based Anticancer Agents
Authors: Amna J. Ghith, Khaled Abu Zid, Khairia Youssef, Nasser Saad
Abstract:
Background: Multikinase and vascular endothelial growth factor (VEGF) receptor inhibitors interrupt the pathway by which angiogenesis becomes established and promulgated, resulting in the inadequate nourishment of metastatic disease. VEGFR-2 has been the principal target of anti-angiogenic therapies. We disclose new thienopyrimidines as inhibitors of VEGFR-2, designed by a molecular modeling approach, with increased synergistic activity and decreased side effects. Purpose: 2-substituted thienopyrimidines were designed and synthesized with anticipated anticancer activity, based on an in silico molecular docking study that supports the initial pharmacophoric hypothesis with the same binding mode of interaction at the ATP-binding site of VEGFR-2 (PDB 2QU5) and a high docking score. Methods: A series of compounds was designed using Discovery Studio 4.1/CDOCKER with a rationale that mimics the pharmacophoric features present in reported active compounds targeting VEGFR-2. An in silico ADMET study was also performed to validate the bioavailability of the newly designed compounds. Results: The compounds to be synthesized showed interaction energies comparable to, or within the range of, the benzimidazole inhibitor ligand when docked with VEGFR-2. The ADMET study showed comparable results; most of the compounds showed absorption within the 95-99 zone, varying according to the different substituents attached to the thienopyrimidine ring system. Conclusions: A series of 2-substituted thienopyrimidines is to be synthesized with anticipated anticancer activity, according to the docking study and the structural requirements for the design of VEGFR-2 inhibitors, which can act as powerful anticancer agents.
Keywords: docking, Discovery Studio 4.1/CDOCKER, heterocycles based anticancer agents, 2-substituted thienopyrimidines
Procedia PDF Downloads 246
8555 A Novel Epitope Prediction for Vaccine Designing against Ebola Viral Envelope Proteins
Authors: Manju Kanu, Subrata Sinha, Surabhi Johari
Abstract:
Ebola viruses are among the best-studied viruses; however, no effective prevention against EBOV has been developed. Epitope-based vaccines provide a new strategy for prophylactic and therapeutic application of pathogen-specific immunity. A critical requirement of this strategy is the identification and selection of T-cell epitopes that act as vaccine targets. This study describes current methodologies for the selection process, with Ebola virus as a model system; a great challenge in the field of Ebola virus research is thus to design a universal vaccine. A combination of publicly available bioinformatics algorithms and computational tools was used to screen and select antigen sequences as potential T-cell epitopes for supertype Human Leukocyte Antigen (HLA) alleles. The MUSCLE and MOTIF tools were used to find the most conserved peptide sequences of the viral proteins, and immunoinformatics tools were used for the prediction of immunogenic peptides of viral proteins in Zaire strains of Ebola virus. Putative epitopes for the viral proteins (VP) were predicted from their conserved peptide sequences. Three tools, NetCTL 1.2, BIMAS, and SYFPEITHI, were used to predict the Class I putative epitopes, while three tools, ProPred, IEDB-SMM-align, and NetMHCII 2.2, were used to predict the Class II putative epitopes. B-cell epitopes were predicted by BCPREDS 1.0. Immunogenic peptides were identified and selected manually from the putative epitopes predicted by the online tools, individually for both MHC classes. Finally, the predicted peptide sequences for both MHC classes were searched for a common region, which was selected as the common immunogenic peptide. Immunogenic peptides were found for the viral proteins of Ebola virus: the epitopes FLESGAVKY and SSLAKHGEY. These predicted peptides could be promising candidates to be used as targets for vaccine design.
Keywords: epitope, B cell, immunogenicity, Ebola
Procedia PDF Downloads 3148554 An Approach in Design of Large-Scale Hydrogen Plants
Authors: Hamidreza Sahaleh
Abstract:
Because of stringent low-sulfur requirements and heavier crude oil feedstocks, more hydrogen will be consumed in refineries. In particular, if large-scale capacities are the response to increased hydrogen demand, specific design and engineering experience is required, which will be described in this paper with an example. Selected process design requirements will be listed and described with reference to the flowsheet. In addition, a selection of innovative design features, such as process condensate reuse and safe reformer start-up requirements, will be highlighted. Keywords: low sulfur, raw oil, refineries, flowsheet
Procedia PDF Downloads 2968553 Strategies for Public Space Utilization
Authors: Ben Levenger
Abstract:
Social life revolves around a central meeting place or gathering space. It is where the community integrates, where people earn social skills, and ultimately where they become part of the community. Following this premise, public spaces are among the most important spaces that downtowns offer, providing locations for people to be seen and heard and, most importantly, to integrate seamlessly into the downtown as part of the community. To facilitate this, these local spaces must be envisioned and designed to meet the changing needs of a downtown, offering a space and purpose for everyone. This paper dives deep into analyzing, designing, and implementing public space design for small plazas and gathering spaces. These spaces often require a detailed level of study, followed by a broad stroke of design implementation that allows for adaptability. The paper shows how to assess needs, define the types of spaces needed, outline a program for those spaces, detail design elements that meet the needs, assess the new space, and plan for change. It provides participants with the necessary framework for conducting a grass-roots-level assessment of public space and programming, covering short-term and long-term improvements, along with assessment tools, sheets, and visual representation diagrams. Urbanism for its own sake is an exercise in aesthetic beauty; an economic improvement or benefit must be attained to further solidify the purpose of these efforts and justify the infrastructure and construction costs. To ground this work in quantitative terms, we examine case studies of economic impact, measuring the following metrics: rental rates (per square meter), tax revenue generation (sales and property), foot traffic generation, increased property valuations, currency expenditure by tenure, clustered development improvements, and the cost/valuation benefits of increased housing density. The economic impact results are segmented by community size into three tiers: under 10,000 in population, 10,001 to 75,000, and 75,000+ (a simple tier classification is sketched after this abstract). Through this breakdown, participants can gauge the impact in communities similar to those they work in or are responsible for. Finally, a detailed analysis of specific urbanism enhancements, such as plazas, on-street dining, and pedestrian malls, will be discussed, with metrics documenting the economic impact of each enhancement to aid in prioritizing improvements for each community. All materials, documents, and information will be available to participants via Google Drive; they are welcome to download the data and use it for their own purposes. Keywords: downtown, economic development, planning, strategic
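As a minimal illustration of the tier classification mentioned above, the sketch below groups case-study communities by population. The town names and populations are invented, and the handling of the exact 10,000 boundary is an assumption.

```python
# Hypothetical helper for the three population tiers described above.
# Placement of a community at exactly 10,000 is an assumption.
def community_tier(population: int) -> str:
    """Classify a community into one of the study's population tiers."""
    if population < 10_000:
        return "Tier 1 (under 10,000)"
    if population <= 75_000:
        return "Tier 2 (10,001-75,000)"
    return "Tier 3 (75,000+)"

# Invented case-study communities, grouped before comparing metrics
# such as rental rates or tax revenue generation.
for town, population in [("Alba", 8_200), ("Brigg", 42_000), ("Crest", 310_000)]:
    print(f"{town}: {community_tier(population)}")
```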
Procedia PDF Downloads 818552 The Review and Contribution of Taiwan Government Policies on Environmental Impact Assessment to Water Recycling
Authors: Feng-Ming Fan, Xiu-Hui Wen, Po-Feng Chen, Yi-Ching Tu
Abstract:
Owing to inherent natural conditions and man-made damage, water scarcity in Taiwan is a pressing issue that must be faced immediately. Regulations and laws on water resource protection and recycling are gradually being completed, but a specific method for verifying the effectiveness of water recycling is still lacking. This research focused on industrial parks already certified through EIA in order to establish a professional auditing system and thereby contribute to the sustainable use of water resources. Taiwan's government policy on Environmental Impact Assessment was established in 1994, and some development projects were required to meet specified water recycling ratios for effective water resource usage. Water use in these parks spans a wide range of activities, as companies of many kinds are stationed there. To effectively monitor how industrial parks implement the water and wastewater recycling ratios committed to in their EIAs, we invited experts and scholars in this field to discuss with the relevant agencies and formulate the policy and audit plan. Meetings were also convened to define standardized water balance diagrams and recycling parameters. We selected nine industrial parks that had been required to meet specific water recycling ratios at the EIA examination stage and, according to their water usage quantities, audited 340 factories in these parks through on-site and document examinations. The results were fruitful: the average water usage per unit area across the examined industrial parks is 31,000 tons/hectare/year, only half the average for Taiwanese industry. It is obvious that industrial parks with EIA commitments can decrease water resource consumption effectively. Taiwan's EIA policy took the follow-up tracking function into consideration from the beginning. The results of this research verify the importance of implementing water recycling commitments under EIA to save water resources. Inducing development units to honor their EIA commitments, and thereby balancing environmental protection and economic development, is one of the important values of EIA. Keywords: Taiwan government policies of environmental impact assessment, water recycling ratio of EIA commitment, water resources sustainable usage, water recycling
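The audit arithmetic described above reduces to two simple quantities per park; the Python sketch below computes them for invented figures. The assumption that the recycling ratio is recycled volume divided by total use, and all numbers shown, are illustrative, not audit data.

```python
# Invented audit figures: (park, total use t/yr, recycled t/yr, area ha).
# Assumes the recycling ratio is recycled volume / total volume used.
parks = [
    ("Park A", 3_100_000, 2_170_000, 100),
    ("Park B", 1_800_000, 990_000, 50),
]

for name, total, recycled, area in parks:
    ratio = recycled / total      # EIA-committed water recycling ratio
    intensity = total / area      # water usage per unit area, t/ha/yr
    print(f"{name}: recycling ratio {ratio:.0%}, {intensity:,.0f} tons/hectare/year")
```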
Procedia PDF Downloads 2268551 Determination of Temperature Dependent Characteristic Material Properties of Commercial Thermoelectric Modules
Authors: Ahmet Koyuncu, Abdullah Berkan Erdogmus, Orkun Dogu, Sinan Uygur
Abstract:
Thermoelectric modules are integrated with electronic components to keep their temperature at specific values in electronic cooling applications. They can be used at different ambient temperatures. The cold-side temperatures of thermoelectric modules depend on their hot-side temperatures, operating currents, and heat loads. Performance curves of thermoelectric modules are given at no more than two different hot-surface temperatures in product catalogs. Characteristic properties are required to select appropriate thermoelectric modules in the thermal design phase of projects. Generally, manufacturers do not provide the characteristic material property values of thermoelectric modules to customers, for reasons of confidentiality. Common commercial software such as ANSYS ICEPAK and FloEFD includes thermoelectric modules in its libraries, so these tools can easily be used to predict the effect of thermoelectric usage in thermal design. Some software requires only the performance values at different temperatures; others, like ICEPAK, require three temperature-dependent equations for the material properties: the Seebeck coefficient (α), electrical resistivity (β), and thermal conductivity (γ). Since the number and variety of thermoelectric modules in this software are limited, definitions of the characteristic material properties of other thermoelectric modules may be required. In this manuscript, a method for deriving characteristic material properties from the datasheet of a thermoelectric module is presented. Material characteristics were estimated from two different performance curves, both experimentally and numerically. Numerical calculations were accomplished in ICEPAK using a thermoelectric module that exists in the ICEPAK library, and a new experimental setup was established to perform the experimental study. Because the numerical and experimental results agree, the proposed equations can be considered validated. This approximation can be suggested for analyses that include different types or brands of TEC modules. Keywords: electrical resistivity, material characteristics, thermal conductivity, thermoelectric coolers, Seebeck coefficient
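A minimal sketch of the derivation step follows, assuming the three properties are represented as second-order polynomials in temperature fitted to points read off the catalog performance curves. The property values, the temperatures, and the polynomial order are hypothetical illustrations, not the manuscript's fitted equations.

```python
import numpy as np

# Hypothetical points extracted from datasheet performance curves at three
# hot-side temperatures; the quadratic form is an assumed functional form.
T = np.array([300.0, 325.0, 350.0])          # hot-side temperatures, K

alpha = np.array([200e-6, 210e-6, 218e-6])   # Seebeck coefficient, V/K
beta = np.array([1.00e-5, 1.10e-5, 1.25e-5]) # electrical resistivity, ohm*m
gamma = np.array([1.50, 1.45, 1.42])         # thermal conductivity, W/(m*K)

for name, prop in [("alpha", alpha), ("beta", beta), ("gamma", gamma)]:
    c2, c1, c0 = np.polyfit(T, prop, deg=2)  # prop(T) = c2*T^2 + c1*T + c0
    print(f"{name}(T) = {c2:.3e}*T^2 + {c1:.3e}*T + {c0:.3e}")
```

With three data points the quadratic fit is exact; with more points np.polyfit returns the least-squares coefficients, which is one plausible way to produce the temperature-dependent equations such software expects.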
Procedia PDF Downloads 1798550 Bayesian Parameter Inference for Continuous Time Markov Chains with Intractable Likelihood
Authors: Randa Alharbi, Vladislav Vyshemirsky
Abstract:
Systems biology is an important field of science that focuses on studying the behaviour of biological systems. Modelling is required to produce a detailed description of the elements of a biological system, their functions, and their interactions. A well-designed model requires selecting a suitable mechanism that can capture the main features of the system, defining its essential components, and representing an appropriate law for the interactions between those components. Complex biological systems exhibit stochastic behaviour; thus, probabilistic models are suitable for describing and analysing them. The Continuous-Time Markov Chain (CTMC) is one such probabilistic model: it describes the system as a set of discrete states with continuous-time transitions between them. The system is then characterised by a set of probability distributions that describe the transition from one state to another at a given time. The evolution of these probabilities through time is governed by the chemical master equation, which is analytically intractable but can be simulated. Uncertain parameters of such a model can be inferred using methods of Bayesian inference. Yet inference in such a complex system is challenging, as it requires evaluating the likelihood, which is intractable in most cases. Different statistical methods allow simulating from the model despite the intractability of the likelihood. Approximate Bayesian computation is a common approach that relies on simulation of the model to approximate the intractable likelihood. Particle Markov chain Monte Carlo (PMCMC) is another approach, based on using sequential Monte Carlo to estimate the intractable likelihood. However, both methods are computationally expensive. In this paper we discuss the efficiency and possible practical issues of each method, taking computational time into account. We demonstrate likelihood-free inference by analysing a model of the Repressilator using both methods, with a detailed investigation quantifying the difference between them in terms of efficiency and computational cost. Keywords: approximate Bayesian computation (ABC), continuous-time Markov chains, sequential Monte Carlo, particle Markov chain Monte Carlo (PMCMC)
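As a concrete illustration of likelihood-free inference for a CTMC, the sketch below combines an exact Gillespie simulator with ABC rejection. For brevity it uses a simple birth-death process rather than the Repressilator, and the prior ranges, summary statistic, and tolerance are illustrative choices, not the paper's settings.

```python
import numpy as np

rng = np.random.default_rng(0)

def gillespie_birth_death(k_birth, k_death, x0, t_max):
    """Exact stochastic simulation (Gillespie SSA) of the CTMC
    0 -> X at rate k_birth; X -> 0 at rate k_death * x."""
    t, x = 0.0, x0
    times, states = [t], [x]
    while t < t_max:
        birth, death = k_birth, k_death * x
        total = birth + death
        if total == 0.0:
            break
        t += rng.exponential(1.0 / total)          # waiting time to next event
        x += 1 if rng.random() < birth / total else -1
        times.append(t)
        states.append(x)
    return np.array(times), np.array(states)

def summarise(times, states, t_grid):
    """State of the chain at fixed observation times (summary statistic)."""
    return states[np.searchsorted(times, t_grid, side="right") - 1]

# 'Observed' data simulated with known parameters that ABC should recover.
true_params = (2.0, 0.1)
t_grid = np.linspace(1.0, 50.0, 10)
obs = summarise(*gillespie_birth_death(*true_params, x0=0, t_max=50.0), t_grid)

# ABC rejection: draw from the prior, simulate, accept the draw if the
# simulated summaries are within tolerance eps of the observed ones.
eps, accepted = 8.0, []
while len(accepted) < 100:
    k_b, k_d = rng.uniform(0.0, 5.0), rng.uniform(0.01, 0.5)
    sim = summarise(*gillespie_birth_death(k_b, k_d, x0=0, t_max=50.0), t_grid)
    if np.abs(sim - obs).mean() < eps:
        accepted.append((k_b, k_d))

print("ABC posterior mean:", np.mean(accepted, axis=0), "truth:", true_params)
```

PMCMC would replace the rejection step with a particle-filter estimate of the likelihood inside a Metropolis-Hastings chain; the simulator above would be reused unchanged, which is one reason the two methods are natural to compare on the same model.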
Procedia PDF Downloads 2028549 War Heritage: Different Perceptions of the Dominant Discourse among Visitors to the “Adem Jashari” Memorial Complex in Prekaz
Authors: Zana Llonçari Osmani, Nita Llonçari
Abstract:
In Kosovo, public rhetoric and popular sentiment position the War of 1998-99 (the war) as central to the formation of contemporary Kosovo's national identity. This period was marked by the massive forced displacement of Kosovo Albanians, the destruction of entire settlements, the loss of family members, and the profound emotional trauma experienced by civilians and, in particular, by those who actively participated in the war as members of the Kosovo Liberation Army (KLA). Amidst these profound experiences, the Prekaz Massacre (the Massacre) is widely regarded as the defining event that preceded the final struggles of 1999 and the long-awaited attainment of independence. This study aims to explore how different visitors perceive the dominant discourse at the Memorial, a site dedicated to commemorating the Prekaz Massacre, and to identify the factors that influence their perceptions. The research employs a comprehensive mixed-method approach, combining online surveys, critical discourse analysis of visitor impressions, and content analysis of media representations. The findings highlight the significant role played by original material remains in shaping visitor perceptions of the Memorial, in comparison to the curated symbols and figurative representations interspersed throughout the landscape. While the design elements and physical layout of the memorial undeniably hold significance in conveying the memoryscape, there are notable shortcomings in how they enhance the overall visitor experience. Visitors are still influenced primarily by the tangible remnants of the war, suggesting that there is room for improvement in how design elements contribute to the memorial's narrative and the collective memory of the Prekaz Massacre. Keywords: critical discourse analysis, memorialisation, national discourse, public rhetoric, war tourism
Procedia PDF Downloads 85