Search results for: integrated design process
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 26595

18345 Comparative Isotherms Studies on Adsorptive Removal of Methyl Orange from Wastewater by Watermelon Rinds and Neem-Tree Leaves

Authors: Sadiq Sani, Muhammad B. Ibrahim

Abstract:

Watermelon rinds powder (WRP) and neem-tree leaves powder (NLP) were used as adsorbents for equilibrium adsorption isotherms studies for detoxification of methyl orange dye (MO) from simulated wastewater. The applicability of the process to various isotherm models was tested. All isotherms from the experimental data showed excellent linear reliability (R2: 0.9487-0.9992) but adsorptions onto WRP were more reliable (R2: 0.9724-0.9992) than onto NLP (R2: 0.9487-0.9989) except for Temkin’s Isotherm where reliability was better onto NLP (R2: 0.9937) than onto WRP (R2: 0.9935). Dubinin-Radushkevich’s monolayer adsorption capacities for both WRP and NLP (qD: 20.72 mg/g, 23.09 mg/g) were better than Langmuir’s (qm: 18.62 mg/g, 21.23 mg/g) with both capacities higher for adsorption onto NLP (qD: 23.09 mg/g; qm: 21.23 mg/g) than onto WRP (qD: 20.72 mg/g; qm: 18.62 mg/g). While values for Langmuir’s separation factor (RL) for both adsorbents suggested unfavourable adsorption processes (RL: -0.0461, -0.0250), Freundlich constant (nF) indicated favourable process onto both WRP (nF: 3.78) and NLP (nF: 5.47). Adsorption onto NLP had higher Dubinin-Radushkevich’s mean free energy of adsorption (E: 0.13 kJ/mol) than WRP (E: 0.08 kJ/mol) and Temkin’s heat of adsorption (bT) was better onto NLP (bT: -0.54 kJ/mol) than onto WRP (bT: -0.95 kJ/mol) all of which suggested physical adsorption.
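The linearised Langmuir fit and separation-factor calculation used in such isotherm studies can be sketched in a few lines; the equilibrium data, initial concentration, and fitted constants below are illustrative assumptions, not values from this study.

```python
# Hypothetical equilibrium data for illustration (Ce in mg/L, qe in mg/g);
# these are NOT values from the study.
Ce = [5.0, 10.0, 20.0, 40.0, 80.0]
qe = [4.2, 7.5, 12.1, 16.0, 18.3]

def linear_fit(x, y):
    """Ordinary least-squares slope and intercept."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
             / sum((xi - mx) ** 2 for xi in x))
    return slope, my - slope * mx

# Linearised Langmuir isotherm: Ce/qe = Ce/qm + 1/(qm*KL)
slope, intercept = linear_fit(Ce, [c / q for c, q in zip(Ce, qe)])
qm = 1.0 / slope        # monolayer capacity, mg/g
KL = slope / intercept  # Langmuir constant, L/mg

# Separation factor RL = 1/(1 + KL*C0); 0 < RL < 1 indicates a
# favourable adsorption process.
C0 = 100.0              # assumed initial dye concentration, mg/L
RL = 1.0 / (1.0 + KL * C0)
```

The same least-squares helper can be reused for the Freundlich, Temkin and Dubinin-Radushkevich linearisations by transforming the variables accordingly.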

Keywords: adsorption isotherms, methyl orange, neem leaves, watermelon rinds

Procedia PDF Downloads 273
18344 The Convolution Recurrent Network of Using Residual LSTM to Process the Output of the Downsampling for Monaural Speech Enhancement

Authors: Shibo Wei, Ting Jiang

Abstract:

Convolutional-recurrent neural networks (CRN) have recently achieved much success in the speech enhancement field. The common processing method is to use convolution layers to compress the feature space through multiple downsampling steps and then model the compressed features with an LSTM layer. Finally, the enhanced speech is obtained by deconvolution operations that integrate the global information of the speech sequence. However, the feature-space compression may cause a loss of information, so we propose to model the downsampling result of each step with a residual LSTM layer, then join it with the output of the corresponding deconvolution layer and feed them to the next deconvolution layer. In this way, we aim to integrate the global information of the speech sequence better. The experimental results show that the network model we introduce (RES-CRN) achieves better performance than both the original CRN without residual connections and a variant that simply stacks additional LSTM layers, in terms of scale-invariant signal-to-noise ratio (SI-SNR), speech quality (PESQ), and intelligibility (STOI).
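The SI-SNR metric used for evaluation can be computed as follows; this is a generic sketch of the standard scale-invariant definition, not the authors' evaluation code.

```python
import math

def si_snr(est, ref):
    """Scale-invariant SNR in dB between an estimated and a reference signal.

    The estimate is projected onto the reference; the residual counts
    as noise, so rescaling the estimate does not change the score.
    """
    dot = sum(e * r for e, r in zip(est, ref))
    ref_energy = sum(r * r for r in ref)
    scale = dot / ref_energy
    target = [scale * r for r in ref]          # projection onto reference
    noise = [e - t for e, t in zip(est, target)]
    t_energy = sum(t * t for t in target)
    n_energy = sum(n * n for n in noise)
    return 10.0 * math.log10(t_energy / n_energy)
```

For example, an estimate that differs from the reference only in one slightly perturbed sample scores around 35 dB, and multiplying the estimate by any constant leaves the score unchanged.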

Keywords: convolutional-recurrent neural networks, speech enhancement, residual LSTM, SI-SNR

Procedia PDF Downloads 200
18343 Variation of Warp and Binder Yarn Tension across the 3D Weaving Process and its Impact on Tow Tensile Strength

Authors: Reuben Newell, Edward Archer, Alistair McIlhagger, Calvin Ralph

Abstract:

Modern industry has developed a need for innovative 3D composite materials due to their attractive material properties. Composite materials are composed of a fibre reinforcement encased in a polymer matrix. The fibre reinforcement consists of warp, weft and binder yarns or tows woven together into a preform. The mechanical performance of a composite material is largely controlled by the properties of the preform. As a result, the bulk of recent textile research has been focused on the design of high-strength preform architectures, while studies looking at optimisation of the weaving process have largely been neglected. It has been reported that yarns experience varying levels of damage during weaving, resulting in filament breakage and ultimately compromised composite mechanical performance. The weaving parameters involved in causing this yarn damage are not fully understood. Recent studies indicate that poor yarn tension control may be an influencing factor: as tension is increased, the yarn-to-yarn and yarn-to-weaving-equipment interactions are heightened, maximising damage. The correlation between yarn tension variation and weaving damage severity has never been adequately researched or quantified. A novel study is needed which assesses the influence of tension variation on the mechanical properties of woven yarns. This study has sought to quantify the variation of yarn tension throughout weaving and to link the impact of tension to weaving damage. Multiple yarns were randomly selected, and their tension was measured across the creel and shedding stages of weaving using a hand-held tension meter. Sections of the same yarns were subsequently cut from the loom and tensile tested. A comparison was made between the tensile strength of pristine and tensioned yarns to determine the induced weaving damage.
Yarns from bobbins at the rear of the creel were under the least amount of tension (0.5-2.0N) compared to yarns positioned at the front of the creel (1.5-3.5N). This increase in tension has been linked to the sharp turn in the yarn path between bobbins at the front of the creel and creel I-board. Creel yarns under the lower tension suffered a 3% loss of tensile strength, compared to 7% for the greater tensioned yarns. During shedding, the tension on the yarns was higher than in the creel. The upper shed yarns were exposed to a decreased tension (3.0-4.5N) compared to the lower shed yarns (4.0-5.5N). Shed yarns under the lower tension suffered a 10% loss of tensile strength, compared to 14% for the greater tensioned yarns. Interestingly, the most severely damaged yarn was exposed to both the largest creel and shedding tensions. This study confirms for the first time that yarns under a greater level of tension suffer an increased amount of weaving damage. Significant variation of yarn tension has been identified across the creel and shedding stages of weaving. This leads to a variance of mechanical properties across the woven preform and ultimately the final composite part. The outcome from this study highlights the need for optimised yarn tension control during preform manufacture to minimize yarn-induced weaving damage.

Keywords: optimisation of preform manufacture, tensile testing of damaged tows, variation of yarn weaving tension, weaving damage

Procedia PDF Downloads 236
18342 Another Beautiful Sounds: Building the Memory of Sound of Peddling in Beijing with Digital Technology

Authors: Dan Wang, Qing Ma, Xiaodan Wang, Tianjiao Qi

Abstract:

The sound of peddling in Beijing, also called “yo-heave-ho” or the “cry of one's wares”, is a unique folk culture usually found in the Beijing hutong. For the civilians of Beijing, the sound of peddling is part of their childhood, and for those who love the traditional culture of Beijing, it is an old song singing of the local conditions and customs of the ancient city. The British poet Osbert Stewart, for example, out of great appreciation, once described the sound of peddling he had heard in Beijing as a street orchestra performance in the article "Beijing's sound and color". This research aims to collect and integrate the voice/photo resources and historical materials concerning the sound of peddling in Beijing by digital technology, in order to protect this intangible cultural heritage and pass on the city's memory. With this goal in mind, the first stage is to collect and record all the materials and resources based on historical document study and interviews with civilians and performers. The next stage is to set up a metadata scheme (referring to domestic and international standards such as the "Audio Data Processing Standards in the National Library", DC, VRA, and CDWA) to describe, process and organize the sound of peddling into a database. To present this traditional culture fully, web design and GIS technology are utilized to establish a website, and offline exhibitions and events are planned in which people can simulate and learn the sound of peddling using VR/AR technology. All resources are open to the public, and civilians can share the digital memory through both the offline experiential activities and online interaction. With all these attempts, a multimedia narrative platform has been established to record the sound of peddling in old Beijing multi-dimensionally, with text, images, audio, video and so on.

Keywords: sound of peddling, GIS, metadata scheme, VR/AR technology

Procedia PDF Downloads 304
18341 Further Study of Mechanism of Contrasting Charge Transport Properties for Phenyl and Thienyl Substituent Organic Semiconductors

Authors: Yanan Zhu

Abstract:

Building on our previous work on the mechanism behind the mobility difference of phenyl- and thienyl-substituted semiconductors, we have made further explorations towards designing high-performance organic thin-film transistors. Substituent groups play a significant role in material properties and device performance. On the theoretical side, simulation of material properties and crystal packing can supply scientific guidance for materials synthesis in experiments. Here, we have used computational methods to design a new material substituted with furan groups, which has the potential to be used in organic thin-film transistors and organic single-crystal transistors. Its calculated reorganization energy is much lower than that of 2,6-diphenyl anthracene (DPAnt), which exhibits a large mobility of more than 30 cm²V⁻¹s⁻¹. Moreover, the other important parameter, the charge transfer integral, is larger than that of DPAnt, which suggests that the furan-substituted material may show even better charge transport. On the whole, the mechanism investigation based on phenyl and thienyl substituents assisted in designing a novel furan-substituted material, which is predicted to outperform in organic field-effect transistors.
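In the hopping picture that underlies such screening, the reorganization energy and transfer integral enter the standard Marcus rate expression; the sketch below uses assumed, illustrative parameter values, not the quantities computed in this work.

```python
import math

HBAR = 1.054571817e-34   # reduced Planck constant, J*s
KB = 1.380649e-23        # Boltzmann constant, J/K
EV = 1.602176634e-19     # 1 eV in J

def marcus_rate(t_eV, lam_eV, T=300.0):
    """Marcus charge-hopping rate (s^-1) for transfer integral t and
    reorganization energy lambda, both given in eV."""
    t, lam = t_eV * EV, lam_eV * EV
    prefactor = (2.0 * math.pi / HBAR) * t * t
    gauss = 1.0 / math.sqrt(4.0 * math.pi * lam * KB * T)
    return prefactor * gauss * math.exp(-lam / (4.0 * KB * T))
```

The rate grows quadratically with the transfer integral and falls with the reorganization energy, which is why a lower reorganization energy and a larger transfer integral than DPAnt point towards improved charge transport.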

Keywords: theoretical calculation, mechanism, mobility, organic transistors

Procedia PDF Downloads 137
18340 Thermo-Oxidative Degradation of Esterified Starch (with Lauric Acid) -Plastic Composite Assembled with Pro-Oxidants and Elastomers

Authors: R. M. S. Sachini Amararathne

Abstract:

This research strives to develop a thermo-degradable starch-plastic compound/masterbatch for industrial packaging applications. A native corn starch, modified by an esterification reaction with lauric acid, is melt blended with an unsaturated elastomer (styrene-butadiene-rubber/styrene-butadiene-styrene). A trace amount of metal salt is added in the internal mixer to study the effect of pro-oxidants in a thermo-oxidative environment. The granulated polymer composite, which consists of 80-86% polyolefin (LLDPE/LDPE/PP) as the pivotal agent, is then extruded with processing aids, antioxidants and some other additives in a co-rotating twin-screw extruder. The pelletized composite is subjected to compression molding, injection molding or blown-film extrusion to produce the specimens for testing. The degradation process is explicated by analyzing the results of Fourier transform infrared spectroscopy (FTIR) measurements and thermo-oxidative aging studies (placing the dumb-bell specimens in an air oven at 70 °C for four weeks of exposure), supported by tensile and impact strength test reports. Furthermore, samples were placed outdoors in manifold conditions to inspect the degradation process. This industrial process is implemented to reduce the volume of fossil-based garbage by achieving biodegradability and compostability in the natural cycle. The research thus leads to the manufacture of a degradable plastic packaging compound which is now available in the Sri Lankan market.

Keywords: blown film extrusion, compression moulding, polyolefin, pro-oxidant, styrene-butadiene-rubber, styrene-butadiene-styrene, thermo oxidative aging, unsaturated elastomer

Procedia PDF Downloads 95
18339 Characterization and PCR Detection of Selected Strains of Psychrotrophic Bacteria Isolated From Raw Milk

Authors: Kidane Workelul, Li Xu, Xiaoyang Pang, Jiaping Lv

Abstract:

Dairy products are exceptionally good media for the growth of microorganisms because of their high nutritional content. Milk can become contaminated in several ways throughout the milking process, depending on how the raw milk is transported and stored, as well as how long it is kept before being processed. Psychrotrophic bacteria are among those that can deteriorate the quality of milk, mainly through their heat-resistant protease and lipase enzymes. For this research, 8 selected strains of psychrotrophic bacteria (Enterococcus hirae, Pseudomonas fluorescens, Pseudomonas azotoformans, Pseudomonas putida, Exiguobacterium indicum, Pseudomonas paralactice, Acinetobacter indicum, Serratia liquefaciens) were chosen and characterized following the research methodology protocol. The 8 selected strains were cultured, plated and incubated, and their genomic DNA was extracted and amplified. The purposes of the study were to confirm their psychrotrophic properties, test for lipase hydrolysis, determine their optimal incubation temperature, design a primer targeting the lipA gene in the conserved region of the reference strain P. fluorescens, optimize primer specificity and sensitivity, and perform PCR detection of lipase-positive strains using the designed primers. Based on the findings, all 8 strains isolated from stored raw milk are psychrotrophic bacteria; 6 of the strains are positive for lipase hydrolysis; their optimal temperature is 20 to 30 °C; and the designed primer is highly specific, amplifying only the lipase-positive strains and none of the others. The results are promising and could help in detecting psychrotrophic bacteria producing heat-resistant enzymes (lipase) at an early stage, before the milk is processed, which would save production losses for the dairy industry.
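A crude in-silico primer-specificity check of the kind described, matching a designed primer (and its reverse complement) against candidate sequences, can be sketched as below; the primer and sequence fragments are invented placeholders, not the actual lipA primer or gene sequences.

```python
def primer_hits(primer, sequences):
    """Return names of sequences that contain the primer, or its
    reverse complement, as an exact substring.

    This ignores mismatches and melting temperature; it only
    illustrates the idea of screening a primer for specificity.
    """
    comp = {"A": "T", "T": "A", "G": "C", "C": "G"}
    rc = "".join(comp[b] for b in reversed(primer))
    return [name for name, seq in sorted(sequences.items())
            if primer in seq or rc in seq]

# Toy fragments standing in for lipA regions (hypothetical)
seqs = {
    "lipase_positive": "GGATCCATGAACAAGACTTGGCTT",
    "lipase_negative": "GGATCCTTTTTTTTTTTTTTTTTT",
}
hits = primer_hits("ATGAACAAGACT", seqs)
```

A real specificity check would additionally allow mismatches and consider annealing temperature, but the all-or-nothing match already captures why a primer designed against a conserved region amplifies only the targeted strains.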

Keywords: dairy industry, heat-resistant, lipA, milk, primer and psychrotrophic

Procedia PDF Downloads 64
18338 Communication Barriers in Disaster Risk Management

Authors: Pooja Pandey

Abstract:

Communication plays an integral part in the management of any disaster, whether natural or human-induced; both require effective and strategic delivery of information. The way information is conveyed carries the most weight when dealing with a disaster. Hence, integrating communication strategies into disaster risk management (DRM) is extensively acknowledged; however, this integration and planning are missing from practical handbooks. Researchers continuously exploring integrated DRM have established that substantial gaps exist between research and implementation of the strategies (gaps between science and policy). For this reason, this paper reviews the communication barriers that obstruct effective management of disasters. Communication between first responders (government agencies, police, medical services) and the public (people directly affected by the disaster) is most critical and lacks proper delivery during a disaster. These challenges can only be resolved if the foundation of the problem is properly dealt with, which means resolving the issues within the organizations. Through this study, it was found that it is necessary to bridge the communication gap between the organizations themselves, as most of the hindrances occur during the mitigation, preparedness, response and recovery phases of a disaster. The study concludes with a review of the communication barriers within and across the organizational, technological, and social levels that impact effective DRM. In the end, some suggestions are made to strengthen the knowledge base for future improvement in communication between responders and their organizations.

Keywords: communication, organization, barriers, first responders, disaster risk management

Procedia PDF Downloads 300
18337 Automated Ultrasound Carotid Artery Image Segmentation Using Curvelet Threshold Decomposition

Authors: Latha Subbiah, Dhanalakshmi Samiappan

Abstract:

In this paper, we propose denoising Common Carotid Artery (CCA) B-mode ultrasound images by a decomposition approach to curvelet thresholding, with automatic segmentation of the intima-media thickness and adventitia boundary. Through decomposition, the local geometry of the image and its gradient directions are well preserved. The components are combined into a single vector-valued function, which removes noise patches. A double threshold is applied to inherently remove speckle noise from the image. The denoised image is segmented by an active contour without specifying seed points; combined with level-set theory, this provides sub-regions with continuous boundaries. The deformable contours match the shapes and motion of objects in the images. A curve or surface under constraints is evolved from the image with the goal that it is pulled onto the necessary features of the image. Region-based and boundary-based information are integrated to achieve the contour. The method handles the multiplicative speckle noise well in both objective and subjective quality measurements and thus leads to better-segmented results. The proposed denoising method gives better performance metrics compared with other state-of-the-art denoising algorithms.
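The double-threshold step can be illustrated on a list of transform coefficients. This is a generic sketch under one plausible reading of a double threshold (keep strong coefficients, shrink the mid-band, zero the rest), not the authors' exact rule, and it operates on a flat coefficient list rather than a true curvelet decomposition.

```python
import math

def double_threshold(coeffs, t_low, t_high):
    """Keep coefficients with magnitude above t_high, soft-shrink those
    between t_low and t_high, and zero those below t_low."""
    out = []
    for c in coeffs:
        if abs(c) >= t_high:
            out.append(c)                                  # strong: keep
        elif abs(c) >= t_low:
            out.append(math.copysign(abs(c) - t_low, c))   # mid: shrink
        else:
            out.append(0.0)                                # weak: discard
    return out
```

Because speckle is multiplicative, such thresholding is typically applied after a transform (or log-transform) that separates signal energy from noise into different coefficient magnitudes.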

Keywords: curvelet, decomposition, levelset, ultrasound

Procedia PDF Downloads 340
18336 Airborne Pollutants and Lung Surfactant: Biophysical Impacts of Surface Oxidation Reactions

Authors: Sahana Selladurai, Christine DeWolf

Abstract:

Lung surfactant comprises a lipid-protein film that coats the alveolar surface and serves to prevent alveolar collapse upon repeated breathing cycles. Exposure of lung surfactant to high concentrations of airborne pollutants, for example, tropospheric ozone in smog, can chemically modify the lipid and protein components. These chemical changes can impact the film functionality by decreasing the film’s collapse pressure (minimum surface tension attainable), altering its mechanical and flow properties and modifying lipid reservoir formation essential for re-spreading of the film during the inhalation process. In this study, we use Langmuir monolayers spread at the air-water interface as model membranes, where the compression and expansion of the film mimic the breathing cycle. The impact of ozone exposure on model lung surfactant films is measured using a Langmuir film balance, Brewster angle microscopy and a pendant drop tensiometer as a function of film and sub-phase composition. The oxidized films are analyzed using mass spectrometry, where lipid and protein oxidation products are observed. Oxidation is shown to reduce surface activity, alter line tension (and film morphology) and in some cases visibly reduce the viscoelastic properties of the film when compared to controls. These reductions in functionality of the films are highly dependent on film and sub-phase composition; for example, the effect of oxidation is more pronounced when using a physiologically relevant buffer as opposed to water as the sub-phase. These findings can lead to a better understanding of the impact of continuous exposure to high levels of ozone on the mechanical process of breathing, as well as of the roles of certain lung surfactant components in this process.

Keywords: lung surfactant, oxidation, ozone, viscoelasticity

Procedia PDF Downloads 311
18335 Examination of Woody Taxa in Urban Parks in the Context of Climate Change: Resat Oyal Kulturpark and Hudavendigar Urban Park Samples

Authors: Murat Zencirkıran, Elvan Ender

Abstract:

Climate change, which has become effective on a global scale, is accompanied by an increase in negative conditions for human, plant and animal life. These negative conditions (drought, warming, etc.) are felt more rapidly in urban life and affect the sustainability of green areas, which are of great importance for life comfort. In this context, the choice of woody taxa used in the design of urban green spaces gains importance once more. Within the scope of this study, two of the four urban parks located in the city center of Bursa province were selected and their woody taxa evaluated. These parks were identified as the oldest and the newest urban parks in Bursa, so as to emphasize the differences that may arise over time. It was determined that 54 woody taxa occur in Resat Oyal Kulturpark and 76 woody taxa in Hudavendigar Urban Park. These taxa have been evaluated in terms of water consumption and ecological tolerance, taking climate change into account, and suggestions have been developed against possible problems.

Keywords: ecological hardiness, urban park, water consumption, woody plants

Procedia PDF Downloads 297
18334 Analysis of Patient No-Shows According to Health Conditions

Authors: Sangbok Lee

Abstract:

There has been much effort on process improvement for outpatient clinics to provide quality and acute care to patients. One of these efforts is no-show analysis and prediction. This work analyzes patient no-shows along with patient health conditions. The health conditions refer to clinical symptoms that each patient has, out of the following: hyperlipidemia, diabetes, metastatic solid tumor, dementia, chronic obstructive pulmonary disease, hypertension, coronary artery disease, myocardial infarction, congestive heart failure, atrial fibrillation, stroke, drug dependence/abuse, schizophrenia, major depression, and pain. A dataset from a regional hospital is used to find the relationship between the number of symptoms and no-show probabilities. Additional analysis reveals how each symptom, or combination of symptoms, affects no-shows. In the above analyses, cross-classification of patients by age and gender is carried out. The findings from the analysis will be used to take extra care of patients with particular health conditions, who will be urged to attend by being informed about their health conditions and possible consequences more clearly. Moreover, this work will be used in preparing institutional guidelines for patient reminder systems.
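The core tabulation, no-show rate as a function of symptom count, can be sketched with synthetic records; the data below are invented for illustration, not drawn from the hospital dataset.

```python
from collections import defaultdict

# Hypothetical visit records: (number of symptoms, patient showed up)
records = [(0, True), (0, True), (1, True), (1, False),
           (2, False), (2, True), (3, False), (3, False)]

def no_show_rate_by_symptoms(records):
    """Map each symptom count to its empirical no-show probability."""
    counts = defaultdict(lambda: [0, 0])   # n_symptoms -> [no_shows, total]
    for n_sym, showed in records:
        counts[n_sym][1] += 1
        if not showed:
            counts[n_sym][0] += 1
    return {k: ns / tot for k, (ns, tot) in sorted(counts.items())}
```

The same tabulation can be run within each age-gender cell to reproduce the cross-classification described, or replaced by a logistic regression when a smooth estimate of the probability is needed.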

Keywords: healthcare system, no-show analysis, process improvement, statistical data analysis

Procedia PDF Downloads 233
18333 Impact of Legs Geometry on the Efficiency of Thermoelectric Devices

Authors: Angel Fabian Mijangos, Jaime Alvarez Quintana

Abstract:

Key concepts like waste heat recycling and waste heat recovery are basic ideas in thermoelectricity for designing new solid-state sources of energy for a stable supply of electricity and environmental protection. According to several theoretical predictions, at the device level the geometry and configuration of the thermoelectric legs are crucial to the thermoelectric performance of the modules. Thus, in this work, we have studied the effect of leg geometry on the thermoelectric figure of merit ZT of the device. First, asymmetrical legs are proposed in order to reduce the overall thermal conductance of the device and thereby increase the temperature gradient across the legs, as well as to harness the Thomson effect, which is generally neglected in conventional symmetrical thermoelectric legs. We have developed a novel design of a thermoelectric module having asymmetrical legs and, for the first time, validated its thermoelectric performance experimentally by realizing a proof-of-concept device, which shows almost twofold the thermoelectric figure of merit of a conventional one. Moreover, we have also varied the length of the thermoelectric legs in order to analyze its effect on the thermoelectric performance of the device. Along with this, we have studied the impact of contact resistance in these systems. Experimental results show that device architecture can improve the thermoelectric performance of the device up to twofold.
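For reference, the figure of merit and the per-leg thermal conductance that drive the argument above can be written out directly; the material values in the example are typical textbook numbers for a Bi2Te3-like material, not measurements from this device.

```python
def figure_of_merit(seebeck, sigma, kappa, T):
    """ZT = S^2 * sigma * T / kappa
    (S in V/K, sigma in S/m, kappa in W/(m*K), T in K)."""
    return seebeck ** 2 * sigma * T / kappa

def leg_conductance(kappa, area, length):
    """Thermal conductance K = kappa * A / L of a single leg, in W/K.

    An asymmetrical (tapered) leg can be approximated by a smaller
    effective area, lowering K and steepening the temperature
    gradient across the leg, as proposed in the abstract.
    """
    return kappa * area / length

# Illustrative, assumed values: S = 200 uV/K, sigma = 1e5 S/m,
# kappa = 1.5 W/(m*K), T = 300 K  ->  ZT of about 0.8
zt = figure_of_merit(200e-6, 1e5, 1.5, 300.0)
```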

Keywords: asymmetrical legs, heat recovery, heat recycling, thermoelectric module, Thomson effect

Procedia PDF Downloads 241
18332 Structural Invertibility and Optimal Sensor Node Placement for Error and Input Reconstruction in Dynamic Systems

Authors: Maik Kschischo, Dominik Kahl, Philipp Wendland, Andreas Weber

Abstract:

Understanding and modelling of real-world complex dynamic systems in biology, engineering and other fields is often made difficult by incomplete knowledge about the interactions between systems states and by unknown disturbances to the system. In fact, most real-world dynamic networks are open systems receiving unknown inputs from their environment. To understand a system and to estimate the state dynamics, these inputs need to be reconstructed from output measurements. Reconstructing the input of a dynamic system from its measured outputs is an ill-posed problem if only a limited number of states is directly measurable. A first requirement for solving this problem is the invertibility of the input-output map. In our work, we exploit the fact that invertibility of a dynamic system is a structural property, which depends only on the network topology. Therefore, it is possible to check for invertibility using a structural invertibility algorithm which counts the number of node disjoint paths linking inputs and outputs. The algorithm is efficient enough, even for large networks up to a million nodes. To understand structural features influencing the invertibility of a complex dynamic network, we analyze synthetic and real networks using the structural invertibility algorithm. We find that invertibility largely depends on the degree distribution and that dense random networks are easier to invert than sparse inhomogeneous networks. We show that real networks are often very difficult to invert unless the sensor nodes are carefully chosen. To overcome this problem, we present a sensor node placement algorithm to achieve invertibility with a minimum set of measured states. This greedy algorithm is very fast and also guaranteed to find an optimal sensor node-set if it exists. Our results provide a practical approach to experimental design for open, dynamic systems. 
Since invertibility is a necessary condition for unknown input observers and data assimilation filters to work, it can be used as a preprocessing step to check, whether these input reconstruction algorithms can be successful. If not, we can suggest additional measurements providing sufficient information for input reconstruction. Invertibility is also important for systems design and model building. Dynamic models are always incomplete, and synthetic systems act in an environment, where they receive inputs or even attack signals from their exterior. Being able to monitor these inputs is an important design requirement, which can be achieved by our algorithms for invertibility analysis and sensor node placement.
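The node-disjoint path count at the heart of the structural invertibility test can be computed by max-flow on a node-split graph, where every node is given unit capacity. This is a generic sketch of that standard reduction, not the authors' implementation.

```python
from collections import deque, defaultdict

def max_disjoint_paths(edges, inputs, outputs):
    """Count node-disjoint directed paths linking input nodes to output
    nodes.  Each node v is split into (v,'in') -> (v,'out') with unit
    capacity, so no node can lie on two paths; max-flow then equals the
    maximum number of disjoint paths (Menger's theorem)."""
    cap = defaultdict(int)
    adj = defaultdict(set)

    def add(u, v, c):
        cap[(u, v)] += c
        adj[u].add(v)
        adj[v].add(u)          # residual edge, capacity 0 initially

    nodes = set(inputs) | set(outputs)
    for u, v in edges:
        nodes |= {u, v}
    for v in nodes:
        add((v, 'in'), (v, 'out'), 1)
    for u, v in edges:
        add((u, 'out'), (v, 'in'), 1)
    S, T = 'S', 'T'
    for i in inputs:
        add(S, (i, 'in'), 1)   # each input can start one path
    for o in outputs:
        add((o, 'out'), T, 1)  # each output can end one path

    flow = 0
    while True:
        parent = {S: None}     # BFS for an augmenting path
        q = deque([S])
        while q and T not in parent:
            u = q.popleft()
            for v in adj[u]:
                if v not in parent and cap[(u, v)] > 0:
                    parent[v] = u
                    q.append(v)
        if T not in parent:
            return flow
        v = T                  # push one unit along the found path
        while parent[v] is not None:
            u = parent[v]
            cap[(u, v)] -= 1
            cap[(v, u)] += 1
            v = u
        flow += 1
```

If the returned count equals the number of inputs, the input-output map is structurally invertible in the sense used above; a greedy sensor placement can call this routine repeatedly while adding candidate output nodes.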

Keywords: data-driven dynamic systems, inversion of dynamic systems, observability, experimental design, sensor node placement

Procedia PDF Downloads 150
18331 An Infinite Mixture Model for Modelling Stutter Ratio in Forensic Data Analysis

Authors: M. A. C. S. Sampath Fernando, James M. Curran, Renate Meyer

Abstract:

Forensic DNA analysis has received much attention over the last three decades due to its incredible usefulness in human identification. The statistical interpretation of DNA evidence is recognised as one of the most mature fields in forensic science. Peak heights in an electropherogram (EPG) are approximately proportional to the amount of template DNA in the original sample being tested. A stutter is a minor peak in an EPG that is not an allele of a potential contributor and is considered an artefact presumed to arise from miscopying or slippage during the PCR. Stutter peaks are mostly analysed in terms of the stutter ratio, which is calculated relative to the corresponding parent allele height. Analysis of mixture profiles has always been problematic in evidence interpretation, especially in the presence of PCR artefacts like stutters. Unlike binary and semi-continuous models, continuous models assign a probability (as a continuous weight) to each possible genotype combination and significantly enhance the use of continuous peak height information, resulting in more efficient and reliable interpretations. Therefore, a sound methodology to distinguish between stutters and real alleles is essential for the accuracy of the interpretation. Sensibly, any such method has to be able to focus on modelling stutter peaks. Bayesian nonparametric methods provide increased flexibility in applied statistical modelling. Mixture models are frequently employed as fundamental data analysis tools for clustering and classification and assume unidentified heterogeneous sources for the data. In model-based clustering, each unknown source is reflected by a cluster, and the clusters are modelled using parametric models. Specifying the number of components in finite mixture models, however, is practically difficult even though the calculations are relatively simple.
Infinite mixture models, in contrast, do not require the user to specify the number of components. Instead, a Dirichlet process, which is an infinite-dimensional generalization of the Dirichlet distribution, is used to deal with the problem of a number of components. Chinese restaurant process (CRP), Stick-breaking process and Pólya urn scheme are frequently used as Dirichlet priors in Bayesian mixture models. In this study, we illustrate an infinite mixture of simple linear regression models for modelling stutter ratio and introduce some modifications to overcome weaknesses associated with CRP.
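A draw from the Chinese restaurant process, of the kind used as a Dirichlet-process prior here, can be simulated in a few lines; this generic sampler illustrates the prior only and is not the authors' model code.

```python
import random

def chinese_restaurant_process(n, alpha, rng=None):
    """Sample a random partition of n customers via the CRP with
    concentration parameter alpha; returns the list of table sizes.

    Customer i joins an occupied table with probability proportional
    to its size, or opens a new table with probability alpha/(i+alpha),
    so the number of clusters need not be fixed in advance.
    """
    rng = rng or random.Random(0)   # seeded for reproducibility
    tables = []
    for i in range(n):              # i customers already seated
        r = rng.random() * (i + alpha)
        acc = 0.0
        for t, size in enumerate(tables):
            acc += size
            if r < acc:
                tables[t] += 1
                break
        else:
            tables.append(1)        # open a new table
    return tables
```

In an infinite mixture of regressions, each "table" would carry its own regression parameters for the stutter-ratio model, with the number of occupied tables inferred from the data rather than specified by the user.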

Keywords: Chinese restaurant process, Dirichlet prior, infinite mixture model, PCR stutter

Procedia PDF Downloads 330
18330 Study of Early Diagnosis of Oral Cancer by Non-invasive Saliva-On-Chip Device: A Microfluidic Approach

Authors: Ragini Verma, J. Ponmozhi

Abstract:

The oral cavity is home to a wide variety of microorganisms that lead to various diseases and even oral cancer. Despite advancements in diagnosis and detection at the initial phase, the situation has not improved much. Saliva-on-a-chip is an innovative point-of-care platform for the early diagnosis of oral cancer and other oral diseases in live and dead cells using a microfluidic device. Some major challenges, like real-time imaging of oral cancer microbes, achieving high throughput, and obtaining high spatiotemporal resolution, have faced the scientific community. Integrated microfluidics and microscopy provide powerful approaches to studying the dynamics of oral pathology, microbe interaction, and the oral microenvironment. Here we have developed a saliva-on-chip device to monitor the effect of salivary microbes on oral cancer. Adhesion of the cancer-associated bacteria F. nucleatum subsp. nucleatum and Prevotella intermedia in the device was observed. We also observed a significant reduction in the oral cancer growth rate when microbial mortality and morbidity were induced. These results show that this approach has the potential to transform the study of oral cancer and its early diagnosis.

Keywords: microfluidic device, oral cancer microbes, early diagnosis, saliva-on-chip

Procedia PDF Downloads 101
18329 The Biosphere as a Supercomputer Directing and Controlling Evolutionary Processes

Authors: Igor A. Krichtafovitch

Abstract:

Evolutionary processes are not linear. Long periods of quiet, slow development give way to rather rapid emergences of new species and even phyla. During the Cambrian explosion, 22 new phyla were added to the previously existing 3 phyla. Contrary to common belief, natural selection, or survival of the fittest, cannot account for the dominant evolutionary vector, which is the steady and accelerating advent of more complex and more intelligent living organisms. Neither Darwinism nor alternative concepts, including panspermia and intelligent design, propose a satisfactory solution for these phenomena. The proposed hypothesis offers a logical and plausible explanation of evolutionary processes in general. It is based on two postulates: a) the biosphere is a single living organism, all parts of which are interconnected, and b) the biosphere acts as a giant biological supercomputer, storing and processing information in digital and analog forms. Such a supercomputer surpasses all human-made computers by many orders of magnitude. Living organisms are the product of the intelligent creative action of the biosphere supercomputer. Biological evolution is driven by the growing amount of information stored in living organisms and the increasing complexity of the biosphere as a single organism. The main evolutionary vector is not survival of the fittest but an accelerated growth of the computational complexity of living organisms. The following postulates summarize the proposed hypothesis: biological evolution, as a natural origin and development of life, is a reality. Evolution is a coordinated and controlled process. One of evolution's main development vectors is the growing computational complexity of living organisms and the biosphere's intelligence. The intelligent matter that conducts and controls global evolution is a gigantic bio-computer combining all living organisms on Earth. Information acts like software stored in and controlled by the biosphere.
Random mutations trigger this software, as stipulated by Darwinian evolutionary theories, and it is further stimulated by the growing demand for the biosphere's global memory storage and computational complexity. A greater memory volume requires a greater number of more intellectually advanced organisms for storing and handling it. More intricate organisms require a greater computational complexity of the biosphere in order to keep control over the living world. This is an endless recursive endeavor with an accelerating evolutionary dynamic. New species emerge when two conditions are met: a) crucial environmental changes occur and/or the global memory storage volume reaches its limit, and b) the biosphere's computational complexity reaches a critical mass capable of producing more advanced creatures. The hypothesis presented here is a naturalistic concept of the creation and evolution of life. It logically resolves many puzzling problems of the current state of evolutionary theory: speciation, as a result of GM purposeful design; the vector of evolutionary development, as a need for growing global intelligence; punctuated equilibrium, occurring when the two conditions a) and b) above are met; the Cambrian explosion; and mass extinctions, occurring when more intelligent species should replace outdated creatures.

Keywords: supercomputer, biological evolution, Darwinism, speciation

Procedia PDF Downloads 164
18328 Automated Vehicle Traffic Control Tower: A Solution to Support the Next Level Automation

Authors: Xiaoyun Zhao, Rami Darwish, Anna Pernestål

Abstract:

Automated vehicles (AVs) have the potential to enhance road capacity, improve road safety, and increase traffic efficiency. Research and development on AVs has been going on for many years. However, when complicated traffic rules interact with real situations, AVs fail to make decisions in contradictory situations and cannot maintain control in all conditions due to highly dynamic driving scenarios. This limits the use of AVs and restricts the full potential benefits that they can bring. Furthermore, regulation, infrastructure development, and public acceptance cannot keep pace with technology breakthroughs. Facing these challenges, this paper proposes the automated vehicle traffic control tower (AVTCT) as a safe, efficient, and integrated solution for AV control. It introduces the AVTCT concept for control, management, decision-making, communication, and interaction with various aspects of transportation. Prototype demonstrations and simulations show that the AVTCT has the potential to overcome the control challenges of AVs and can facilitate AVs reaching their full potential. Possible functionalities, benefits, and challenges of the AVTCT are discussed, setting the foundation for its conceptual model, simulation, and real-world application.

Keywords: automated vehicle, connectivity and automation, intelligent transport system, traffic control, traffic safety

Procedia PDF Downloads 138
18327 Covariate-Adjusted Response-Adaptive Designs for Semi-Parametric Survival Responses

Authors: Ayon Mukherjee

Abstract:

Covariate-adjusted response-adaptive (CARA) designs use the available responses to skew the treatment allocation in a clinical trial towards the treatment found at an interim stage to be best for a given patient's covariate profile. Extensive research has been done on various aspects of CARA designs with the patient responses assumed to follow a parametric model. However, the range of application for such designs is limited in real-life clinical trials, where responses rarely fit a particular parametric form. On the other hand, the parametric assumption yields robust estimates of the covariate-adjusted treatment effects. To balance these two requirements, designs are developed that are free from distributional assumptions about the survival responses, relying only on the assumption of proportional hazards for the two treatment arms. The proposed designs are developed by deriving two types of optimal allocation designs, and also by using a distribution function to link the past allocation, covariate, and response histories to the present allocation. The optimal designs are based on biased coin procedures with a bias towards the better treatment arm: the doubly-adaptive biased coin design (DBCD) and the efficient randomized adaptive design (ERADE). The treatment allocation proportions for these designs converge to the expected target values, which are functions of the Cox regression coefficients estimated sequentially. These expected target values are derived from constrained optimization problems and are updated as information accrues with the sequential arrival of patients. The design based on the link function is derived using the distribution function of a probit model whose parameters are adjusted based on the covariate profile of the incoming patient.
To apply such designs, the treatment allocation probabilities are sequentially modified based on the treatment allocation history, the response history, previous patients' covariates, and the covariates of the incoming patient. Given this information, an expression is obtained for the conditional probability of allocating a patient to a treatment arm. Simulation studies show that the ERADE is preferable to the DBCD when the main aim is to minimize the variance of the observed allocation proportion and to maximize the power of the Wald test for a treatment difference. However, the former procedure, being discrete, tends to converge more slowly towards the expected target allocation proportion. The design based on the link function achieves the highest skewness of patient allocation to the better treatment arm and is thus ethically the best design. Other comparative merits of the proposed designs are highlighted, and their preferred areas of application are discussed. It is concluded that the proposed CARA designs can be considered suitable alternatives to the traditional balanced randomization designs in survival trials in terms of the power of the Wald test, provided that response data are available during the recruitment phase of the trial to enable adaptation of the designs. Moreover, the proposed designs enable more patients to be treated with the better treatment during the trial, making the designs more ethically attractive to patients. An existing clinical trial has been redesigned using these methods.
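As a hedged illustration of a biased coin procedure of the kind described above, the following sketch implements the Hu-Zhang doubly-adaptive biased coin allocation rule for two arms. The parameter gamma and the boundary handling are illustrative assumptions; in the paper's designs the target itself is a function of sequentially estimated Cox regression coefficients.

```python
def dbcd_probability(n_a, n_total, target, gamma=2.0):
    """Hu-Zhang doubly-adaptive biased coin: probability that the next
    patient is allocated to arm A, given that n_a of n_total patients
    have been assigned to A so far and the current target proportion."""
    if n_total == 0:
        return target                    # no history yet: use the target
    observed = n_a / n_total
    if observed <= 0.0 or observed >= 1.0:
        return 1.0 - observed            # force the empty arm at the boundary
    # weight each arm by how far the observed proportion lags its target
    w_a = target * (target / observed) ** gamma
    w_b = (1.0 - target) * ((1.0 - target) / (1.0 - observed)) ** gamma
    return w_a / (w_a + w_b)
```

When the observed proportion on arm A falls below the target, the rule biases the next allocation towards A; a larger gamma forces the observed proportions back to the target more aggressively.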

Keywords: censored response, Cox regression, efficiency, ethics, optimal allocation, power, variability

Procedia PDF Downloads 165
18326 Influence of Machining Process on Surface Integrity of Plasma Coating

Authors: T. Zlámal, J. Petrů, M. Pagáč, P. Krajkovič

Abstract:

For the required function of components with a thermal spray coating, additional machining of the coated surface is necessary. The paper deals with assessing the surface integrity of Metco 2042, a plasma-sprayed coating, after machining. The selected plasma-sprayed coating serves as an abradable sealing coating in a jet engine; therefore, the spray and its surface must meet high quality and functional requirements. Plasma-sprayed coatings are characterized by a lamellar structure, which requires a special approach to their machining. The experimental part therefore involves the set-up of special cutting tools and the cutting parameters under which the applied coating was machined. To assess the suitability of the machining parameters, selected parameters of surface integrity were measured and evaluated during the experiment. To determine the size of surface irregularities and the effect of the selected machining technology on the sprayed coating surface, the surface roughness parameters Ra and Rz were measured. Furthermore, the surface hardness of the sprayed coating was measured by the HR 15 Y method before and after machining to determine the surface strengthening. Changes in strengthening were detected after machining. A significant impact of the chosen cutting parameters on the surface roughness after machining was not demonstrated.

Keywords: machining, plasma sprayed coating, surface integrity, strengthening

Procedia PDF Downloads 266
18325 Investigation of Delamination Process in Adhesively Bonded Hardwood Elements under Changing Environmental Conditions

Authors: M. M. Hassani, S. Ammann, F. K. Wittel, P. Niemz, H. J. Herrmann

Abstract:

Application of engineered wood, especially in the form of glued-laminated timber, has increased significantly. Recent progress in plywood made of high-strength and high-stiffness hardwoods, like European beech, gives designers more freedom through increased dimensional stability and load-bearing capacity. However, the strong hygric dependence of basically all mechanical properties renders many innovative ideas futile. The tendency of hardwood towards higher moisture sorption and swelling coefficients leads to significant residual stresses in glued-laminated configurations, cross-laminated patterns in particular. These stress fields cause the initiation and evolution of cracks in the bond-lines, resulting in interfacial de-bonding, loss of structural integrity, and reduction of load-carrying capacity. Consequently, delamination of glued-laminated timber made of hardwood elements can be considered the dominant failure mechanism in such composite elements. In addition, long-term creep and mechano-sorption under changing environmental conditions lead to loss of stiffness and can amplify delamination growth over the lifetime of a structure, even after decades. In this study we investigate the delamination process of adhesively bonded hardwood (European beech) elements subjected to changing climatic conditions. To gain further insight into the long-term performance of adhesively bonded elements during the design phase of new products, the development and verification of an authentic moisture-dependent constitutive model for various species is of great significance. Since a comprehensive moisture-dependent rheological model comprising all possibly emerging deformation mechanisms has been missing up to now, a 3D orthotropic elasto-plastic, visco-elastic, mechano-sorptive material model for wood, with all material constants defined as a function of moisture content, was developed.
Apart from the solid wood adherends, the adhesive layer also plays a crucial role in the generation and distribution of the interfacial stresses. The adhesive can be treated as a continuum layer constructed from finite elements, represented as a homogeneous and isotropic material. To obtain a realistic assessment of the mechanical performance of the adhesive layer and a detailed look at the interfacial stress distributions, a generic constitutive model including all potentially activated deformation modes, namely elastic, plastic, and visco-elastic creep, was developed. We focused our studies on the three most common adhesive systems for structural timber engineering: one-component polyurethane adhesive (PUR), melamine-urea-formaldehyde (MUF), and phenol-resorcinol-formaldehyde (PRF). The corresponding numerical integration approaches, with additive decomposition of the total strain, are implemented within the ABAQUS FEM environment by means of the user subroutine UMAT. To predict the true stress state, we perform a history-dependent sequential moisture-stress analysis using the developed material models for both the wood substrate and the adhesive layer. Prediction of the delamination process is founded on the fracture mechanical properties of the adhesive bond-line, measured under different levels of moisture content, and on the application of cohesive interface elements. Finally, we compare the numerical predictions with the experimental observations of de-bonding in glued-laminated samples under changing environmental conditions.
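As a one-dimensional, hedged illustration of the visco-elastic creep contribution in the additive strain decomposition mentioned above, a single Kelvin-Voigt element under constant stress can be time-stepped with explicit Euler as follows. The material constants E and eta and the function name are illustrative assumptions; the paper's models are 3D, orthotropic, and moisture-dependent.

```python
def kelvin_voigt_creep(stress, E, eta, dt, steps):
    """Explicit-Euler creep of a 1D Kelvin-Voigt element under constant
    stress: eta * d(eps)/dt + E * eps = stress. Returns the strain history."""
    eps = 0.0
    history = []
    for _ in range(steps):
        rate = (stress - E * eps) / eta   # strain rate from the dashpot
        eps += rate * dt                  # forward-Euler update
        history.append(eps)
    return history
```

The strain creeps monotonically towards the equilibrium value stress/E; in a full UMAT this increment would be one term added to the elastic, plastic, and mechano-sorptive strain increments.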

Keywords: engineered wood, adhesive, material model, FEM analysis, fracture mechanics, delamination

Procedia PDF Downloads 436
18324 Operational Excellence Performance in Pharmaceutical Quality Control Labs: An Empirical Investigation of the Effectiveness and Efficiency Relation

Authors: Stephan Koehler, Thomas Friedli

Abstract:

Performance measurement has evolved over time from a unidimensional, short-term, efficiency-focused approach into a balanced multidimensional approach. Today, integrated performance measurement frameworks are often used to avoid local optimization and to encourage continuous improvement of an organization. In the literature, the multidimensional character of performance measurement is often described by competitive priorities. At the same time, at the highest level of abstraction, an effectiveness dimension and an efficiency dimension of performance measurement can be distinguished. This paper aims at a better understanding of the composition of effectiveness and efficiency and their relation in pharmaceutical quality control labs. The research comprises a lab-specific operationalization of effectiveness and efficiency and examines how the two dimensions are interlinked. The basis for the analysis is a database of the University of St. Gallen covering a diverse set of 40 pharmaceutical quality control labs. The research provides empirical evidence that labs with high effectiveness also exhibit high efficiency. Lab effectiveness explains 29.5% of the variance in lab efficiency. In addition, labs with an above-median operational excellence performance have statistically significantly higher lab effectiveness and lab efficiency than the below-median performing labs.
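The statement that lab effectiveness explains 29.5% of the variance in lab efficiency refers to the coefficient of determination of a regression. As a generic sketch (using synthetic numbers, not the study's data or model), R-squared for a simple linear regression can be computed from the closed-form OLS fit:

```python
def r_squared(x, y):
    """Coefficient of determination R^2 of a simple OLS regression of y on x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    b = sxy / sxx                         # OLS slope
    a = my - b * mx                       # OLS intercept
    ss_res = sum((yi - (a + b * xi)) ** 2 for xi, yi in zip(x, y))
    ss_tot = sum((yi - my) ** 2 for yi in y)
    return 1.0 - ss_res / ss_tot          # share of variance explained
```

An R-squared of 0.295 would mean, as reported in the study, that 29.5% of the variance in the dependent variable is accounted for by the regressor.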

Keywords: empirical study, operational excellence, performance measurement, pharmaceutical quality control lab

Procedia PDF Downloads 161
18323 Adaptation Measures as a Response to Climate Change Impacts and Associated Financial Implications for Construction Businesses by the Application of a Mixed Methods Approach

Authors: Luisa Kynast

Abstract:

It is obvious that buildings and infrastructure are highly impacted by climate change (CC). Both the design and the materials of buildings need to be resilient to weather events in order to shelter humans, animals, and goods. Just as buildings and infrastructure are exposed to weather events, the construction process itself is generally carried out outdoors, without protection from extreme temperatures, heavy rain, or storms. The production process is restricted by technical limitations on processing materials with machines and by the physical limitations of human beings ("outdoor workers"). In the future, due to CC, average weather patterns are expected to change, and extreme weather events are expected to occur more frequently and more intensely, therefore having a greater impact on production processes and on construction businesses themselves. This research aims to examine this impact by analyzing the association between responses to CC and the financial performance of businesses within the construction industry. After embedding the above-depicted field of research into resource dependency theory, a literature review was conducted to expound the state of research concerning a contingent relation between climate change adaptation measures (CCAM) and corporate financial performance for construction businesses. The examined studies show that this field is rarely investigated, especially for construction businesses. Therefore, reports of the Carbon Disclosure Project (CDP) were analyzed by applying content analysis using the software tool MAXQDA. 58 construction companies, located worldwide, could be examined. To proceed even more systematically, a coding scheme analogous to findings in the literature was adopted. From the qualitative analysis, the data were quantified, and a regression analysis incorporating corporate financial data was conducted.
The results stress adaptation measures as a response to CC as a crucial means of handling climate change impacts (CCI) by mitigating risks and exploiting opportunities. In the CDP reports, the majority of answers stated increasing costs/expenses as a result of implemented measures; a link to sales/revenue was rarely drawn, though where it was, CCAM were connected to increasing sales/revenues. This presumption is supported by the results of the regression analysis, in which a positive short-run effect of implemented CCAM on construction businesses' financial performance was ascertained. These findings refer to appropriate responses in terms of the number of CCAM implemented. Still, businesses show a reluctant attitude towards implementing CCAM, which was confirmed by findings in the literature as well as in the CDP reports. Businesses mainly associate CCAM with costs and expenses rather than with an effect on their corporate financial performance. Companies mostly underrate the effect of CCI, overrate the costs and expenditures for implementing CCAM, and completely neglect the pay-off. This research shall therefore create a basis for bringing CC to the (financial) attention of corporate decision-makers, especially within the construction industry.

Keywords: climate change adaptation measures, construction businesses, financial implication, resource dependency theory

Procedia PDF Downloads 143
18322 Bioremediation of PAHs-Contaminated Soil Using Land Treatment Processes

Authors: Somaye Eskandary

Abstract:

Polycyclic aromatic hydrocarbons (PAHs) are present in crude oil and its derivatives, contaminate soil, and increase carcinogenic and mutagenic contamination, which is a concern for researchers. Land farming is a method that removes pollutants from the soil by means of native microorganisms. This technology appears to be cost-effective and environmentally friendly and to leave less residual waste to be disposed of. This study aimed to remove polycyclic aromatic hydrocarbons from oil-contaminated soil using the land farming method. In addition to examining the concentrations of polycyclic aromatic hydrocarbons by GC-FID, characteristics such as soil microbial respiration and the concentrations of the enzymes dehydrogenase, peroxidase, urease, and acid and alkaline phosphatase were also measured. The results showed that after the land farming process the concentrations of some polycyclic aromatic hydrocarbons dropped to 50 percent of their initial values. The enzyme concentrations decreased along with the concentrations of hydrocarbons and the microbial respiration. These results emphasize land farming as a process for removing polycyclic aromatic hydrocarbons from soil by indigenous microorganisms.

Keywords: soil contamination, gas chromatography, native microorganisms, soil enzymes, microbial respiration, carcinogen

Procedia PDF Downloads 385
18321 Improved Hash Value Based Stream Cipher Using Delayed Feedback with Carry Shift Register

Authors: K. K. Soundra Pandian, Bhupendra Gupta

Abstract:

In the modern era, as application data are massive and complex, they need to be secured against adversary attacks. In this context, a non-recursive key-based integrated Spritz stream cipher with a circulant hash function using a delayed feedback with carry shift register (d-FCSR) is proposed in this paper. The novelty of the proposed stream cipher algorithm is the generation of an improved keystream using the d-FCSR. The proposed algorithm is coded in Verilog HDL to produce a dynamic binary keystream and implemented on the commercially available FPGA device Virtex-5 xc5vlx110t-2ff1136. The implementation of the stream cipher using the d-FCSR on the FPGA device operates at a maximum frequency of 60.62 MHz. It achieves a data throughput of 492 Mbps and improves on existing techniques in terms of efficiency (throughput/area). The paper also briefly presents a cryptanalysis of the proposed circulant hash value based Spritz stream cipher using the d-FCSR against adversary attacks on a hardware platform, for hardware-based cryptography applications.
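The abstract does not specify the d-FCSR construction; as a rough software illustration of the underlying mechanism, a plain (undelayed) Fibonacci-style feedback-with-carry shift register can be sketched in a few lines. The function name, taps, and initial state below are illustrative assumptions, not taken from the paper; a d-FCSR additionally inserts a delay element into the feedback path, and the real design is hardware (Verilog).

```python
def fcsr_keystream(taps, state, carry, nbits):
    """Minimal Fibonacci-style feedback-with-carry shift register (FCSR).
    taps and state are bit lists of equal length (oldest bit first);
    carry is the integer memory. Returns nbits keystream bits."""
    state = list(state)
    out = []
    for _ in range(nbits):
        out.append(state[0])                              # emit the oldest bit
        sigma = carry + sum(t & s for t, s in zip(taps, state))
        fb = sigma % 2                                    # feedback bit
        carry = sigma // 2                                # integer carry kept as memory
        state = state[1:] + [fb]                          # shift the register
    return out
```

Unlike a plain LFSR, the integer carry makes the update nonlinear over GF(2), which is the property that makes FCSR-family generators attractive for keystream generation.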

Keywords: cryptography, circulant function, field programmable gated array, hash value, spritz stream cipher

Procedia PDF Downloads 250
18320 The Potential of Kepulauan Seribu as Marine-Based Eco-Geotourism Site: The Study of Carbonate Platform as Geotourism Object in Kepulauan Seribu, Jakarta

Authors: Barry Majeed, Eka Febriana, Seto Julianto

Abstract:

Kepulauan Seribu National Park is a marine preservation region in Indonesia. It is located at 5°23'-5°40' S, 106°25'-106°37' E, north of the city of Jakarta. Covering an area of 107,489 ha, Kepulauan Seribu has many tourism spots such as cluster islands, fringing reefs, and more. Kepulauan Seribu is also nominated as a Strategic Tourism Region in Indonesia (KSPN). These islands therefore have much potential beyond their preservation function as a national park, hence the development of sustainable geotourism. The aim of this study is to enhance the development of eco-geotourism in Kepulauan Seribu. The study concerns three main aspects of eco-geotourism: tourism, form, and process. The tourism aspect covers attractions, accommodations, tours, activities, interpretation, and planning and management in Kepulauan Seribu. The form aspect focuses on the carbonate platform situated between two islands, primarily on carbonate reefs such as head coral, branchy coral, and platy coral, which created the carbonate sequence of Kepulauan Seribu. The process aspect primarily discusses the process by which carbonate formed in the carbonate factory and later became Kepulauan Seribu. A study of the regional geology of Kepulauan Seribu has been conducted and suggests that the Kepulauan Seribu lithologies are mainly Quaternary limestone. In this study, primary data were taken from observations of the Quaternary carbonate platform between two islands, covering Hati Island, Macan Island, Bulat Island, Ubi Island, and Kelapa Island. From these observations, the best routes for tourists have been laid out from island to island. Qualitative methods, such as in-depth interviews with local people selected by purposive sampling, have also been applied. Finally, this study also provides education about the geological site, the carbonate sequence, in Kepulauan Seribu for the local community, so that it can support the development of sustainable eco-geotourism in Kepulauan Seribu.

Keywords: carbonate factory, carbonate platform, geotourism, Kepulauan Seribu

Procedia PDF Downloads 186
18319 Mechanical and Physical Properties of Aluminum Composite Reinforced with Carbon Nano Tube Dispersion via Ultrasonic and Ball Mill Attrition after Sever Plastic Deformation

Authors: Hassan Zare, Mohammad Jahedi, Mohammad Reza Toroghinejad, Mahmoud Meratian, Marko Knezevic

Abstract:

In this study, carbon nanotube (CNT) reinforced Al matrix nanocomposites were fabricated by equal channel angular pressing (ECAP). The ECAP process is one of the most important methods for powder densification due to the presence of shear strain. Samples with varying numbers of passes (one, two, four, and eight) in route C were prepared at room temperature. Few studies have been done on metal matrix nanocomposites reinforced with carbon nanotubes; reactions at the interface between the matrix and the carbon nanotubes reduce the efficiency of the nanocomposite. In this paper, we examined the mechanical and physical properties of the aluminum-CNT composite manufactured by ECAP as the composite is deformed. The non-agglomerated CNTs, at 2%, were distributed homogeneously in the aluminum matrix during consolidation. The ECAP process was performed for 8 passes on both the monolithic sample and the composite with distributed CNTs.

Keywords: powder metallurgy, ball mill attrition, ultrasonic, consolidation

Procedia PDF Downloads 495
18318 Real-Time Mine Safety System with the Internet of Things

Authors: Şakir Bingöl, Bayram İslamoğlu, Ebubekir Furkan Tepeli, Fatih Mehmet Karakule, Fatih Küçük, Merve Sena Arpacık, Mustafa Taha Kabar, Muhammet Metin Molak, Osman Emre Turan, Ömer Faruk Yesir, Sıla İnanır

Abstract:

This study introduces an IoT-based real-time safety system for mining, addressing global safety challenges. The wearable device, seamlessly integrated into miners' jackets, employs LoRa technology for communication and offers real-time monitoring of vital health and environmental data. Unique features include an LCD panel for immediate information display and sound-based location tracking for emergency response. The methodology involves sensor integration, data transmission, and ethical testing. Validation confirms the system's effectiveness in diverse mining scenarios. The study calls for ongoing research to adapt the system to different mining contexts, emphasizing its potential to significantly enhance safety standards in the industry.

Keywords: mining safety, internet of things, wearable technology, LoRa, RFID tracking, real-time safety system, safety alerts, safety measures

Procedia PDF Downloads 63
18317 Blueprinting of a Normalized Supply Chain Processes: Results in Implementing Normalized Software Systems

Authors: Bassam Istanbouli

Abstract:

With technology evolving every day and with the increase in global competition, industries are always under pressure to be the best. They need to provide good quality products at competitive prices, when and how the customer wants them. In order to achieve this level of service, products and their respective supply chain processes need to be flexible and evolvable; otherwise changes will be extremely expensive, slow, and accompanied by many combinatorial effects. Those combinatorial effects impact the whole organizational structure from a management, financial, documentation, and logistics perspective, and especially from the perspective of the Enterprise Resource Planning (ERP) information system. By applying the normalized systems concept/theory to segments of the supply chain, we expect minimal effects, especially at the time of launching an organization's global software project. The purpose of this paper is to point out that if an organization wants to develop software from scratch or implement existing ERP software for its business needs, and if its business processes are normalized and modular, then this will most probably yield a normalized and modular software system that can be easily modified when the business evolves. Another important goal of this paper is to increase awareness regarding the design of the business processes in a software implementation project. If the blueprints created are normalized, then the software developers and configurators will use those modular blueprints to map them into modular software. This paper only prepares the ground for further studies; the above concept will be supported by going through the steps of developing, configuring, and/or implementing a software system for an organization using two methods: the Software Development Lifecycle method (SDLC) and the Accelerated SAP implementation method (ASAP).
Both methods start with the customer requirements, then the blueprinting of the business processes, and finally the mapping of those processes into a software system. Since those requirements and processes are the starting point of the implementation process, normalizing them will result in normalized software.

Keywords: blueprint, ERP, modular, normalized

Procedia PDF Downloads 139
18316 A Comprehensive Evaluation of the Bus Rapid Transit Project from Gazipur to Airport at Dhaka Focusing on Environmental Impacts

Authors: Swapna Begum, Higano Yoshiro

Abstract:

Dhaka is the capital city of Bangladesh. It is considered one of the most traffic-congested cities in the world, and its population is growing day by day. The land use pattern and improved socio-economic conditions are increasing motor vehicle ownership in the city. Rapid unplanned urbanization and poor transportation planning have deteriorated the transport environment of the city. In addition, the huge travel demand, together with non-motorized traffic on the streets, accounts for enormous traffic congestion in the city. The land transport sector in Dhaka depends mainly on road transport, comprising both motorized and non-motorized modes of travel. This improper modal mix and the un-integrated system have resulted in huge traffic congestion in the city. Moreover, the city has no well-organized public transport system or any mass transit system to cope with the ever-increasing demand. Traffic congestion causes serious air pollution and adversely impacts the economy by deteriorating accessibility, level of service, safety, comfort, and operational efficiency. There is therefore an imperative need to introduce a well-organized, properly scheduled mass transit system such as Bus Rapid Transit (BRT), minimizing the existing problems.

Keywords: air pollution, BRT, mass transit, traffic congestion

Procedia PDF Downloads 405