Search results for: Jeffrey J. Coleman
73 Effect of Homogeneous and Heterogeneous Chemical Reactions on Peristaltic Flow of a Jeffrey Fluid in an Asymmetric Channel
Authors: G. Ravi Kiran, G. Radhakrishnamacharya
Abstract:
In this paper, the dispersion of a solute in the peristaltic flow of a Jeffrey fluid in the presence of both homogeneous and heterogeneous chemical reactions is discussed. The average effective dispersion coefficient has been found using Taylor's limiting condition under the long wavelength approximation. It is observed that the average dispersion coefficient increases with the amplitude ratio, which implies that dispersion is greater in the presence of peristalsis. The average effective dispersion coefficient increases with the Jeffrey parameter in the cases of both homogeneous and combined homogeneous and heterogeneous chemical reactions. Further, dispersion decreases with the phase difference, the homogeneous reaction rate parameter, and the heterogeneous reaction rate parameter.
Keywords: peristalsis, dispersion, chemical reaction, Jeffrey fluid, asymmetric channel
Procedia PDF Downloads 587
72 Peristaltic Transport of a Jeffrey Fluid with Double-Diffusive Convection in Nanofluids in the Presence of Inclined Magnetic Field
Authors: Safia Akram
Abstract:
In this article, the effects of peristaltic transport with double-diffusive convection in nanofluids through an asymmetric channel with different waveforms are presented. Mathematical models for the two-dimensional, two-directional flow of a Jeffrey fluid along with double-diffusive convection in nanofluids are given. Exact solutions are obtained for the nanoparticle fraction field, concentration field, temperature field, stream functions, pressure gradient, and pressure rise in terms of axial and transverse coordinates under the restrictions of long wavelength and low Reynolds number. With the help of computational and graphical results, the effects of Brownian motion, thermophoresis, Dufour, Soret, and Grashof numbers (thermal, concentration, nanoparticle) on peristaltic flow patterns with double-diffusive convection are discussed.
Keywords: nanofluid particles, peristaltic flow, Jeffrey fluid, magnetic field, asymmetric channel, different waveforms
Procedia PDF Downloads 381
71 Brown-Spot Needle Blight: An Emerging Threat Causing Loblolly Pine Needle Defoliation in Alabama, USA
Authors: Debit Datta, Jeffrey J. Coleman, Scott A. Enebak, Lori G. Eckhardt
Abstract:
Loblolly pine (Pinus taeda) is a leading productive timber species in the southeastern USA. Over the past three years, an emerging threat has manifested as successive needle defoliation followed by stunted growth and tree mortality in loblolly pine plantations. Given its economic significance, it has become a rising concern among landowners, forest managers, and forest health state cooperators. However, the symptoms of the disease were sometimes confused with root disease(s) and recurrently attributed to invasive Phytophthora species due to the similar nature of the disease and its devastation. Therefore, this study investigated the potential causal agent of the disease and characterized the fungi associated with loblolly pine needle defoliation in the southeastern USA. In addition, 70 trees were selected at seven long-term monitoring plots at Chatom, Alabama, to monitor and record annual disease incidence and severity. Based on colony morphology and ITS-rDNA sequence data, a total of 28 species of fungi representing 17 families were recovered from diseased loblolly pine needles. The native brown-spot pathogen, Lecanosticta acicola, was the species most frequently recovered from unhealthy loblolly pine needles, in combination with some other common needle cast and rust pathogen(s). Identification was confirmed using morphological similarity and amplification of the translation elongation factor 1-alpha gene region. Tagged trees were consistently found chlorotic and defoliated from 2019 to 2020. The current emergence of the brown-spot pathogen causing loblolly pine mortality necessitates investigating the role of changing climatic conditions, which might be associated with increased pathogen pressure on loblolly pines in the southeastern USA.
Keywords: brown-spot needle blight, loblolly pine, needle defoliation, plantation forestry
Procedia PDF Downloads 152
70 The Non-Linear Analysis of Brain Response to Visual Stimuli
Authors: H. Namazi, H. T. N. Kuan
Abstract:
Brain activity can be measured by acquiring and analyzing EEG signals from an individual. In fact, the human brain's response to external and internal stimuli is mapped in its EEG signals. Over the years, methods such as the Fourier transform, wavelet transform, and empirical mode decomposition have been used to analyze EEG signals in order to find the effect of stimuli, especially external stimuli. However, each of these methods has weaknesses when applied to EEG signals. For instance, the Fourier transform and wavelet transform are linear signal analysis methods, which makes them poorly suited to EEG signals, which are nonlinear. In this research, we analyze the brain response to visual stimuli by extracting information in the form of various measures from EEG signals, using software developed by our research group. The measures used are Jeffrey's measure, fractal dimension, and the Hurst exponent. The results of these analyses are useful not only for a fundamental understanding of the brain's response to visual stimuli but also provide very good recommendations for clinical purposes.
Keywords: visual stimuli, brain response, EEG signal, fractal dimension, hurst exponent, Jeffrey's measure
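One of the nonlinear measures named above, the fractal dimension, can be made concrete with a short sketch. The following is a minimal implementation of the Higuchi fractal dimension estimator (a standard choice for EEG work; the abstract does not say which estimator the authors' software uses, so this is an illustrative assumption), applied to a smooth signal and to white noise:

```python
import math
import random

def higuchi_fd(x, kmax=8):
    """Estimate the Higuchi fractal dimension of a 1-D signal.

    For each lag k, compute the average normalised curve length L(k) over
    the k possible start offsets; the dimension is the slope of
    log L(k) versus log(1/k).
    """
    n = len(x)
    logk, logl = [], []
    for k in range(1, kmax + 1):
        lengths = []
        for m in range(k):  # start offset 0..k-1
            n_max = (n - 1 - m) // k
            if n_max < 1:
                continue
            dist = sum(abs(x[m + i * k] - x[m + (i - 1) * k])
                       for i in range(1, n_max + 1))
            # normalised curve length for this offset
            lengths.append(dist * (n - 1) / (n_max * k) / k)
        logk.append(math.log(1.0 / k))
        logl.append(math.log(sum(lengths) / len(lengths)))
    # least-squares slope of log L(k) against log(1/k) gives the dimension
    mk = sum(logk) / len(logk)
    ml = sum(logl) / len(logl)
    num = sum((a - mk) * (b - ml) for a, b in zip(logk, logl))
    den = sum((a - mk) ** 2 for a in logk)
    return num / den

random.seed(0)
line = [float(i) for i in range(1000)]             # smooth signal: FD near 1
noise = [random.gauss(0, 1) for _ in range(1000)]  # white noise: FD near 2
print(round(higuchi_fd(line), 2), round(higuchi_fd(noise), 2))
```

A smooth curve yields a dimension close to 1 and uncorrelated noise close to 2, which is the contrast such measures exploit when comparing EEG epochs recorded with and without a stimulus.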
Procedia PDF Downloads 561
69 The Analysis of Brain Response to Auditory Stimuli through EEG Signals’ Non-Linear Analysis
Authors: H. Namazi, H. T. N. Kuan
Abstract:
Brain activity can be measured by acquiring and analyzing EEG signals from an individual. In fact, the human brain's response to external and internal stimuli is mapped in its EEG signals. Over the years, methods such as the Fourier transform, wavelet transform, and empirical mode decomposition have been used to analyze EEG signals in order to find the effect of stimuli, especially external stimuli. However, each of these methods has weaknesses when applied to EEG signals. For instance, the Fourier transform and wavelet transform are linear signal analysis methods, which makes them poorly suited to EEG signals, which are nonlinear. In this research, we analyze the brain response to auditory stimuli by extracting information in the form of various measures from EEG signals, using software developed by our research group. The measures used are Jeffrey's measure, fractal dimension, and the Hurst exponent. The results of these analyses are useful not only for a fundamental understanding of the brain's response to auditory stimuli but also provide very good recommendations for clinical purposes.
Keywords: auditory stimuli, brain response, EEG signal, fractal dimension, hurst exponent, Jeffrey's measure
Procedia PDF Downloads 534
68 An Emergence of Pinus taeda Needle Defoliation and Tree Mortality in Alabama, USA
Authors: Debit Datta, Jeffrey J. Coleman, Scott A. Enebak, Lori G. Eckhardt
Abstract:
Pinus taeda, commonly known as loblolly pine, is a crucial timber species native to the southeastern USA. An emerging problem has been encountered over the past few years, known as loblolly pine needle defoliation (LPND), which is threatening the ecological health of southeastern forests and the economic vitality of the region's timber industry. Currently, more than 1000 hectares of loblolly plantations in Alabama are affected with similar symptoms, creating concern among southeastern landowners and forest managers. However, it is still uncertain whether LPND results from one fungal pathogen or the combination of several. Therefore, the objectives of the study were to identify and characterize the fungi associated with LPND in the southeastern USA and to document the damage being done to loblolly pine as a result of repeated defoliation. Identification of fungi was confirmed using classical morphological methods (microscopic examination of the infected needles), conventional and species-specific priming (SSPP) PCR, and ITS sequencing. To date, 17 species of fungi, either cultured from pine needles or forming fruiting bodies on pine needles, have been identified based on morphology and genetic sequence data. Among them, the brown-spot pathogen Lecanosticta acicola has been frequently recovered from pine needles in both spring and summer. Moreover, ophiostomatoid fungi such as Leptographium procerum and L. terebrantis, which are associated with pine decline, have also been recovered from root samples of the infected stands. Trees have been increasingly and repeatedly chlorotic and defoliated from 2019 to 2020. Based on morphological observations and molecular data, the emerging loblolly pine needle defoliation is due in larger part to the brown-spot pathogen L. acicola, followed by the pine decline pathogens L. procerum and L. terebrantis. Root pathogens are suspected to emerge later, and their cumulative effects contribute to the widespread mortality of the trees. It is likely that longer, wetter springs and warmer temperatures are favorable to disease development and may be important in the disease ecology of LPND. Therefore, the outbreak of the disease is expected to expand over a larger geographical area under changing climatic conditions.
Keywords: brown-spot fungi, emerging disease, defoliation, loblolly pine
Procedia PDF Downloads 139
67 FRATSAN: A New Software for Fractal Analysis of Signals
Authors: Hamidreza Namazi
Abstract:
Fractal analysis assesses the fractal characteristics of data. It consists of several methods for assigning fractal characteristics to a dataset, which may be a theoretical dataset or a pattern or signal extracted from phenomena including natural geometric objects, sound, market fluctuations, heart rates, digital images, molecular motion, and networks. Fractal analysis is now widely used in all areas of science. An important limitation of fractal analysis is that arriving at an empirically determined fractal dimension does not necessarily prove that a pattern is fractal; rather, other essential characteristics have to be considered. For this purpose, a Visual C++ based software package called FRATSAN (FRActal Time Series ANalyser) was developed, which extracts information from signals through three measures: fractal dimension, Jeffrey's measure, and the Hurst exponent. After computing these measures, the software plots a graph for each measure. Besides computing the three measures, the software can classify whether or not a signal is fractal. The software uses a dynamic method of analysis for all the measures: a sliding window is selected with a length equal to 10% of the total number of data entries, and this window is moved one data entry at a time to obtain all the measures. This makes the computation very sensitive to slight changes in the data, thereby giving the user an acute analysis. In order to test the performance of the software, a set of EEG signals was given as input and the results were computed and plotted. The software is useful not only for fundamental fractal analysis of signals but also for other purposes. For instance, by analyzing the Hurst exponent plot of an EEG signal from a patient with epilepsy, the onset of a seizure can be predicted by noticing sudden changes in the plot.
Keywords: EEG signals, fractal analysis, fractal dimension, hurst exponent, Jeffrey's measure
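The sliding-window scheme described above can be sketched in a few lines: a window of 10% of the series is advanced one sample at a time, and a measure is recomputed per window. FRATSAN itself is not public, so the sketch below uses a simplified rescaled-range (R/S) Hurst estimator as the per-window measure; this estimator choice is an assumption for illustration only.

```python
import math
import random

def rs_hurst(window):
    """Simplified rescaled-range (R/S) Hurst estimate for one window:
    H ~ log(R/S) / log(n), where R is the range of cumulative deviations
    from the mean and S is the standard deviation."""
    n = len(window)
    mean = sum(window) / n
    devs, cum = [], 0.0
    for v in window:
        cum += v - mean
        devs.append(cum)
    r = max(devs) - min(devs)                               # range of cumulative deviations
    s = math.sqrt(sum((v - mean) ** 2 for v in window) / n)  # standard deviation
    if r == 0 or s == 0:
        return 0.5
    return math.log(r / s) / math.log(n)

def sliding_hurst(signal, frac=0.10):
    """FRATSAN-style pass: window of frac * len(signal), moved one entry at a time."""
    w = max(8, int(len(signal) * frac))
    return [rs_hurst(signal[i:i + w]) for i in range(len(signal) - w + 1)]

random.seed(1)
noise = [random.gauss(0, 1) for _ in range(400)]
hs = sliding_hurst(noise)
print(len(hs), round(sum(hs) / len(hs), 2))
```

For uncorrelated noise the per-window estimates hover around 0.5; a sudden, sustained shift in the resulting Hurst trace is the kind of change the abstract suggests could flag seizure onset.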
Procedia PDF Downloads 467
66 Free Will and Compatibilism in Decision Theory: A Solution to Newcomb’s Paradox
Authors: Sally Heyeon Hwang
Abstract:
Within decision theory, there are normative principles that dictate how one should act, in addition to empirical theories of actual behavior. As a normative guide to one's actual behavior, evidential or causal decision-theoretic equations allow one to identify outcomes with maximal utility values. The choice that each person makes, however, will of course differ according to varying assignments of weight and probability values. Regarding these different choices, it remains a subject of considerable philosophical controversy whether individual subjects have the capacity to exercise free will with respect to the assignment of probabilities, or whether instead the assignment is in some way constrained. A version of this question is given a precise form in Richard Jeffrey's assumption that free will is necessary for Newcomb's paradox to count as a decision problem. This paper argues, against Jeffrey, that decision theory does not require the assumption of libertarian freedom. One of the hallmarks of decision-making is its application across a wide variety of contexts; the implications of a background assumption of free will are similarly varied. One constant across the contexts of decision is that there are always at least two levels of choice for a given agent, depending on the degree of prior constraint. Within the context of Newcomb's problem, when the predictor is attempting to guess the choice the agent will make, he or she is analyzing the determined aspects of the agent, such as past characteristics, experiences, and knowledge. On the other hand, as David Lewis' backtracking argument concerning the relationship between past and present events brings to light, there are similarly varied ways in which the past can actually be dependent on the present. One implication of this argument is that even in deterministic settings, an agent can have more free will than it may seem. This paper thus argues against the view that a stable background assumption of free will or determinism in decision theory is necessary, arguing instead for a compatibilist decision theory yielding a novel treatment of Newcomb's problem.
Keywords: decision theory, compatibilism, free will, Newcomb’s problem
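The decision-theoretic core of Newcomb's problem can be made concrete with a short expected-utility computation. The payoffs ($1,000,000 and $1,000) and the 0.99 predictor accuracy below are the standard illustrative values from the literature, not figures taken from this paper:

```python
# Newcomb's problem: the opaque box holds $1,000,000 iff the predictor
# foresaw one-boxing; the transparent box always holds $1,000.
ACC = 0.99            # assumed predictor accuracy (illustrative)
M, K = 1_000_000, 1_000

# Evidential decision theory: condition the box contents on the act itself.
edt_one_box = ACC * M                # predictor most likely foresaw one-boxing
edt_two_box = (1 - ACC) * M + K      # predictor most likely foresaw two-boxing

# Causal decision theory: the contents are fixed before the choice, so for
# any fixed credence p that the million is present, two-boxing dominates by K.
p = 0.5  # arbitrary fixed credence in the million being present
cdt_one_box = p * M
cdt_two_box = p * M + K

print(edt_one_box > edt_two_box)   # EDT recommends one-boxing
print(cdt_two_box > cdt_one_box)   # CDT recommends two-boxing
```

The two frameworks issue opposite recommendations from the same payoff table, which is exactly the divergence that makes the problem a test case for whether the agent's choice may "backtrack" into the predictor's earlier state.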
Procedia PDF Downloads 321
65 Upon Further Reflection: More on the History, Tripartite Role, and Challenges of the Professoriate
Authors: Jeffrey R. Mueller
Abstract:
This paper expands on the role of the professor by detailing the origins of the profession, adding some of the unique contributions of North American universities, as well as some best practice recommendations, to the unique tripartite role of the professor. It describes current challenges to the profession, including the ever-controversial student rating of professors. It continues with the significance of empowerment to the role of the professor. It concludes with a predictive prescription for the future of the professoriate and the role of the university-level educational administrator toward that end.
Keywords: professoriate history, tripartite role, challenges, empowerment, shared governance, administratization
Procedia PDF Downloads 401
64 Literary Words of Foreign Origin as Social Markers in Jeffrey Archer's Novels Speech Portrayals
Authors: Tatiana Ivushkina
Abstract:
The paper is aimed at studying the use of literary words of foreign origin in modern fiction from a sociolinguistic point of view, which presupposes establishing a correlation between this category of words in a speech portrayal or narrative and the social status of the speaker, verifying that it bears social implications and serves as a social marker or index of socially privileged identity in the British literature of the 21st century. To this end, literary words of foreign origin were selected in context (60 contexts) and subjected to careful examination. The study is carried out on two novels by Jeffrey Archer – Not a Penny More, Not a Penny Less and A Prisoner of Birth – who, being a graduate of Oxford, represents the socially privileged classes himself and gives a wide depiction of characters with different social backgrounds and statuses. The analysis of the novels enabled us to categorize the selected words into four relevant groups. The first, represented by terms (commodity, debenture, recuperation, syringe, luminescence, umpire, etc.), serves to unambiguously indicate the education, occupation, or field of knowledge in which a character is involved, or the situation of communication. The second group is formed of words used in conjunction with their Germanic counterparts (perspiration – sweat, padre – priest, convivial – friendly) to contrast the social positions of the characters: literary words serve as social indices of upper-class speakers, whereas their synonyms of Germanic origin characterize middle- or lower-class speech portrayals. The third class comprises socially marked words (verbs, nouns, and adjectives), or U-words (a term first coined by Alan Ross and Nancy Mitford), a status acquired in the course of social history (elegant, excellent, sophistication, authoritative, preposterous, etc.). The fourth includes words used in a humorous or ironic meaning to convey the narrator's attitude to the characters or the situation itself (ministrations, histrionic, etc.). Words of this group are perceived as 'alien', stylistically distant, as they create an incongruity between style and subject matter. The social implication of the selected words is enhanced by the French words and phrases that often accompany them.
Keywords: British literature of the XXI century, literary words of foreign origin, social context, social meaning
Procedia PDF Downloads 134
63 A Cohort and Empirical Based Multivariate Mortality Model
Authors: Jeffrey Tzu-Hao Tsai, Yi-Shan Wong
Abstract:
This article proposes a cohort-age-period (CAP) model to characterize multi-population mortality processes using cohort, age, and period variables. Distinct from factor-based Lee-Carter-type decomposition mortality models, this approach is empirically based and includes the age, period, and cohort variables in the equation system. The model not only provides fruitful intuition for explaining multivariate mortality change rates but also performs better in forecasting future patterns. Using US and UK mortality data and performing ten-year out-of-sample tests, our approach shows smaller mean square errors in both countries compared to the models in the literature.
Keywords: longevity risk, stochastic mortality model, multivariate mortality rate, risk management
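The structure of an age-period-cohort regression can be sketched as follows. The authors' exact CAP specification is not given in the abstract, so this is a generic illustration: log death rates on a synthetic age-by-period grid are regressed on age, period, and cohort (period minus age) dummy effects. The well-known APC collinearity makes individual effects non-unique, but the minimum-norm least-squares solution still yields identifiable fitted rates:

```python
import numpy as np

rng = np.random.default_rng(0)
ages, periods = np.arange(60, 90), np.arange(1990, 2010)
A, T = np.meshgrid(ages, periods, indexing="ij")
C = T - A  # cohort = birth year

# Synthetic log death rates: Gompertz-like age slope, period decline,
# small cohort effect, plus noise (all coefficients are made up).
log_m = -9.0 + 0.09 * A - 0.012 * (T - 1990) + 0.001 * (C - 1920)
log_m = log_m + rng.normal(0, 0.02, log_m.shape)

def dummies(v):
    """One-hot columns for each distinct level of v."""
    levels = np.unique(v)
    return (v.ravel()[:, None] == levels[None, :]).astype(float)

# Design matrix: intercept + age, period and cohort dummy effects.
X = np.hstack([np.ones((log_m.size, 1)), dummies(A), dummies(T), dummies(C)])
beta, *_ = np.linalg.lstsq(X, log_m.ravel(), rcond=None)  # min-norm solution
fitted = X @ beta
rmse = np.sqrt(np.mean((fitted - log_m.ravel()) ** 2))
print(round(float(rmse), 3))
```

Fitting within-sample recovers the rates down to the noise level; the paper's contribution lies in how such an equation system forecasts out-of-sample across two populations, which this sketch does not attempt.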
Procedia PDF Downloads 53
62 Actor Training in Social Work Education: A Pilot Study of Theatre Workshops to Enhance Clinical Empathy
Authors: Amanda Coleman, Estefanía Gonzalez
Abstract:
Empathy is considered an essential skill for engaging with social work clients. Drawing from developments in medical education, researchers will conduct and evaluate a three-part pilot theatre workshop with master's-level social work students (n ≈ 30) to evaluate the workshop's ability to enhance empathy among participants. Outcomes will be measured using semi-structured post-intervention interviews with a subset of participants (n ≈ 10), as well as post-intervention written reflections and pre- and post-intervention quantitative evaluation of empathy using King and Holosko's 2011 Empathy Scale for Social Workers. The content of the workshop will differ from the traditional role plays common in social work education in that it will draw from role theory and research on creative empathy to emphasize role reversal with clients. Workshops will be held in February and March of 2017, with preliminary findings available by April.
Keywords: education, empathy, social work, theatre
Procedia PDF Downloads 271
61 Spherical Harmonic Based Monostatic Anisotropic Point Scatterer Model for RADAR Applications
Authors: Eric Huang, Coleman DeLude, Justin Romberg, Saibal Mukhopadhyay, Madhavan Swaminathan
Abstract:
High performance computing (HPC) based emulators can be used to model the scattering from multiple stationary and moving targets for RADAR applications. These emulators rely on the RADAR Cross Section (RCS) of the targets being available in complex scenarios. Representing the RCS using tables generated from electromagnetic (EM) simulations is oftentimes cumbersome, leading to large storage requirements. This paper proposes a spherical harmonic based anisotropic scatterer model to represent the RCS of complex targets. The problem of finding the locations and reflection profiles of all scatterers can be formulated as a linear least squares problem with a special sparsity constraint. This paper solves the problem using a modified Orthogonal Matching Pursuit algorithm. The results show that the spherical harmonic based scatterer model can effectively represent the RCS data of complex targets.
Keywords: RADAR, RCS, high performance computing, point scatterer model
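The sparse-recovery step described above can be illustrated with plain Orthogonal Matching Pursuit. The authors' modification and their spherical-harmonic dictionary are not reproduced here; the sketch below uses a random Gaussian dictionary and shows OMP greedily recovering the support of a sparse coefficient vector from linear measurements:

```python
import numpy as np

def omp(A, y, k):
    """Plain Orthogonal Matching Pursuit: greedily select k columns of A,
    refitting the coefficients on the chosen support at each step."""
    residual = y.astype(float).copy()
    support = []
    x = np.zeros(A.shape[1])
    for _ in range(k):
        j = int(np.argmax(np.abs(A.T @ residual)))  # most correlated column
        if j not in support:
            support.append(j)
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
    x[support] = coef
    return x

rng = np.random.default_rng(0)
A = rng.normal(size=(64, 256))
A /= np.linalg.norm(A, axis=0)  # unit-norm dictionary columns
x_true = np.zeros(256)
x_true[[5, 80, 200]] = [2.0, -1.5, 1.0]  # 3 "scatterers" with reflection weights
y = A @ x_true                           # noiseless measurements
x_hat = omp(A, y, k=3)
print(np.flatnonzero(np.abs(x_hat) > 0.5))
```

In the paper's setting, the columns of the dictionary would be spherical-harmonic basis responses indexed by candidate scatterer location, and the nonzero coefficients would encode each scatterer's reflection profile.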
Procedia PDF Downloads 191
60 Linear Dynamic Stability Analysis of a Continuous Rotor-Disk-Blades System
Authors: F. Rahimi Dehgolan, S. E. Khadem, S. Bab, M. Najafee
Abstract:
Nowadays, the use of rotating systems such as shafts and disks in industrial machines has increased steadily. Dynamic stability is one of the most important factors in designing rotating systems. In this study, the linear frequencies and stability of a coupled continuous flexible rotor-disk-blades system are studied. The Euler-Bernoulli beam theory is utilized to model the blades and shaft. The equations of motion are extracted using the extended Hamilton principle and have been simplified using the Coleman and complex transformation methods. The natural frequencies of the linear part of the system are extracted, and the effects of various system parameters on the natural frequencies and decay rates (stability condition) are clarified. It can be seen that the centrifugal stiffening effect applied to the blades is the most important parameter for the stability of the considered rotating system. This result highlights the importance of considering this stiffening effect in the blade equations.
Keywords: rotating shaft, flexible blades, centrifugal stiffness, stability
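The Euler-Bernoulli modelling step can be illustrated with the textbook closed form for a simply supported uniform beam, whose natural frequencies are omega_n = (n*pi/L)^2 * sqrt(E*I/(rho*A)). The coupled rotor-disk-blades frequencies in the paper require the full Hamilton/Coleman treatment, so this sketch (with assumed steel-shaft dimensions) only shows the non-rotating baseline:

```python
import math

def beam_natural_freqs_hz(E, I, rho, A, L, n_modes=3):
    """Natural frequencies (Hz) of a simply supported Euler-Bernoulli beam:
    omega_n = (n*pi/L)**2 * sqrt(E*I / (rho*A)), converted to Hz."""
    return [(n * math.pi / L) ** 2 * math.sqrt(E * I / (rho * A)) / (2 * math.pi)
            for n in range(1, n_modes + 1)]

# Illustrative steel shaft: 1 m span, 20 mm diameter (assumed values).
d = 0.02
A = math.pi * d ** 2 / 4      # cross-sectional area
I = math.pi * d ** 4 / 64     # second moment of area of a circular section
freqs = beam_natural_freqs_hz(E=210e9, I=I, rho=7850, A=A, L=1.0)
print([round(f, 1) for f in freqs])
```

The modes scale as n^2 (the second and third frequencies are 4 and 9 times the first); rotation then shifts these values, with the centrifugal stiffening of the blades raising the effective bending stiffness, which is the stabilizing effect the abstract highlights.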
Procedia PDF Downloads 265
59 Decision-Making, Student Empathy, and Cold War Historical Events: A Case Study of Abstract Thinking through Content-Centered Learning
Authors: Jeffrey M. Byford
Abstract:
The conceptualized theory of decision making on historical events often does not conform to uniform beliefs among students. When presented with the opportunity, many students have differing opinions and rationales regarding historical events and outcomes. The intent of this paper was to provide students with the economic, social, and political dilemmas associated with the autonomy of East Berlin. Students ranked seven possible actions from most to least acceptable. In addition, students were required to provide both positive and negative factors for each decision and relative ranking. Results from this activity suggested that while most students chose a financial action towards West Berlin, some students had trouble justifying their actions.
Keywords: content-centered learning, cold war, Berlin, decision-making
Procedia PDF Downloads 455
58 Classification of Health Risk Factors to Predict the Risk of Falling in Older Adults
Authors: L. Lindsay, S. A. Coleman, D. Kerr, B. J. Taylor, A. Moorhead
Abstract:
Cognitive decline and frailty are apparent in older adults, leading to an increased risk of falling. Currently, health care professionals have to make professional judgements regarding such risks, and hence make difficult decisions regarding the future welfare of the ageing population. This study uses health data from The Irish Longitudinal Study on Ageing (TILDA), focusing on adults over the age of 50 years, in order to analyse health risk factors and predict the likelihood of falls. This prediction is based on the use of machine learning algorithms, whereby health risk factors are used as inputs to predict the likelihood of falling. Initial results show that health risk factors such as long-term health issues contribute to the number of falls. The identification of such health risk factors has the potential to inform health and social care professionals, older people, and their family members, in order to mitigate daily living risks.
Keywords: classification, falls, health risk factors, machine learning, older adults
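The risk-factors-as-inputs idea can be sketched with a small classifier. The features below (age, number of chronic conditions, grip strength) are plausible fall-risk covariates chosen for illustration, not actual TILDA fields, and the abstract does not name the algorithms used, so this sketch fits a plain logistic regression by gradient descent on synthetic data:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500
# Synthetic health-risk features (illustrative, not TILDA variables):
# age (years), number of chronic conditions, grip strength (kg).
X = np.column_stack([
    rng.uniform(50, 90, n),
    rng.poisson(1.5, n),
    rng.normal(30, 8, n),
])
# Generative model: older age and more conditions raise fall risk,
# stronger grip lowers it (coefficients are made up).
logit = -7.8 + 0.12 * X[:, 0] + 0.6 * X[:, 1] - 0.05 * X[:, 2]
y = (rng.uniform(size=n) < 1 / (1 + np.exp(-logit))).astype(float)

# Standardise features, add intercept, fit logistic regression by
# batch gradient descent on the log-loss.
Z = (X - X.mean(0)) / X.std(0)
Z = np.column_stack([np.ones(n), Z])
w = np.zeros(Z.shape[1])
for _ in range(2000):
    p = 1 / (1 + np.exp(-Z @ w))
    w -= 0.1 * Z.T @ (p - y) / n

acc = float(np.mean((1 / (1 + np.exp(-Z @ w)) > 0.5) == (y == 1)))
print(round(acc, 2))
```

With real cohort data the same pipeline (features in, fall/no-fall label out) would be evaluated on held-out participants rather than on the training set, and the fitted coefficients would indicate which risk factors drive the prediction.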
Procedia PDF Downloads 148
57 'Violence Is Bad, but It's Just a Game': The Glorification of Violence from Roman Antiquity to Popular Culture
Authors: M. C. Steyn
Abstract:
Violence and entertainment are not mutually exclusive subjects in the ancient Roman world; in reality, they are closely knit together. That world was permeated by repeated and continuous episodes of violence in its many manifestations, both sanctioned and spontaneous, most of which were considered some form of entertainment, from plays and writings through to the gladiatorial arena. In the 21st century, this socio-psychological dynamic is manifested through the stage provided by the screen and what we watch in terms of TV, movies, and games. This glorification of violence in the modern world is not out of place, as seen in contemporary post-apocalyptic/dystopian literature, film, and computer games, where the act of violence, frowned upon by social norms and values, becomes sanctioned by the (un)real nature of the game: 'I am not a violent person, violence is bad, this is just a game'. This paper examines how violence is framed in the ancient world and subsequently how it is received by popular culture to represent a world in which the maintenance of stability can only be achieved through officially sanctioned violence, whether sanctioned by the state or by the gaming community. The argument examines both ancient and modern critics of violence, such as Seneca, Coleman, and Foucault, and is framed by Baudrillard's commentary on the post-modern conceptualization of reality.
Keywords: entertainment, violence, gladiatorial games, gaming
Procedia PDF Downloads 490
56 Supersymmetry versus Compositeness: 2-Higgs Doublet Models Tell the Story
Authors: S. De Curtis, L. Delle Rose, S. Moretti, K. Yagyu
Abstract:
Supersymmetry and compositeness are the two prevalent paradigms providing both a solution to the hierarchy problem and motivation for a light Higgs boson state. An open door towards the solution is found in the context of 2-Higgs Doublet Models (2HDMs), which are necessary in supersymmetry and natural within compositeness in order to enable Electro-Weak Symmetry Breaking. In scenarios of compositeness, the two isospin doublets arise as pseudo Nambu-Goldstone bosons from the breaking of SO(6). By calculating the Higgs potential at the one-loop level through the Coleman-Weinberg mechanism from the explicit breaking of the global symmetry induced by the partial compositeness of fermions and gauge bosons, we derive the phenomenological properties of the Higgs states and highlight the main signatures of this Composite 2-Higgs Doublet Model at the Large Hadron Collider. These include modifications to the SM-like Higgs couplings as well as production and decay channels of the heavier Higgs bosons. We contrast the properties of this composite scenario with the well-known ones established in supersymmetry, with the MSSM being the most notorious example. We show how 2HDM spectra of masses and couplings accessible at the Large Hadron Collider may allow one to distinguish between the two paradigms.
Keywords: beyond the standard model, composite Higgs, supersymmetry, Two-Higgs Doublet Model
Procedia PDF Downloads 126
55 Leveraging Li-Fi to Enhance Security and Performance of Medical Devices
Authors: Trevor Kroeger, Hayden Williams, Edward Holzinger, David Coleman, Brian Haberman
Abstract:
The network connectivity of medical devices is increasing at a rapid rate. Many medical devices, such as vital sign monitors, share information via wireless or wired connections. However, these connectivity options suffer from a variety of well-known limitations. Wireless connectivity, especially in the unlicensed radio frequency bands, can be disrupted. Such disruption could be due to benign causes, such as a crowded spectrum, or to malicious intent. While wired connections are less susceptible to interference, they inhibit the mobility of the medical devices, which could be critical in a variety of scenarios. This work explores the application of Light Fidelity (Li-Fi) communication to enhance the security, performance, and mobility of medical devices in connected healthcare scenarios. A simple bridge for connected devices serves as an avenue to connect traditional medical devices to the Li-Fi network. This bridge was utilized to conduct bandwidth tests on a small Li-Fi network installed in a mock ICU setting with a backend enterprise network similar to that of a hospital. Mobile and stationary tests were conducted to replicate various situations that might occur within a hospital setting. Results show that in-room Li-Fi connectivity provides reasonable bandwidth and latency in a hospital-like setting.
Keywords: hospital, light fidelity, Li-Fi, medical devices, security
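As a rough illustration of the kind of bandwidth test the bridge enables, the sketch below times a bulk TCP transfer. It runs over the loopback interface for self-containment; in the authors' testbed the socket would instead traverse the Li-Fi link, and the payload size and addresses here are assumptions:

```python
import socket
import threading
import time

def serve(srv_sock, nbytes):
    """Accept one connection and drain nbytes from it."""
    conn, _ = srv_sock.accept()
    with conn:
        remaining = nbytes
        while remaining > 0:
            chunk = conn.recv(65536)
            if not chunk:
                break
            remaining -= len(chunk)

PAYLOAD = 8 * 1024 * 1024  # 8 MiB test transfer (assumed size)

srv = socket.socket()
srv.bind(("127.0.0.1", 0))  # ephemeral port on loopback
srv.listen(1)
port = srv.getsockname()[1]
t = threading.Thread(target=serve, args=(srv, PAYLOAD))
t.start()

cli = socket.create_connection(("127.0.0.1", port))
buf = bytes(65536)
start = time.perf_counter()
sent = 0
while sent < PAYLOAD:
    cli.sendall(buf)
    sent += len(buf)
cli.close()
t.join()           # wait until the receiver has drained everything
srv.close()
elapsed = time.perf_counter() - start
print(f"{PAYLOAD / elapsed / 1e6:.0f} MB/s over loopback")
```

Running the same measurement while the client host moves around the room is essentially the mobile test the abstract describes, with throughput and latency recorded per position.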
Procedia PDF Downloads 102
54 Computational Analysis of Potential Inhibitors Selected Based on Structural Similarity for the Src SH2 Domain
Authors: W. P. Hu, J. V. Kumar, Jeffrey J. P. Tsai
Abstract:
The inhibition of SH2 domain regulated protein-protein interactions is an attractive target for developing an effective chemotherapeutic approach in the treatment of disease. Molecular simulation is a useful tool for developing new drugs and for studying molecular recognition. In this study, we searched for potential drug compounds for the inhibition of the SH2 domain by performing a structural similarity search in the PubChem Compound Database. A total of 37 compounds were screened from the database, and the LibDock docking program was then used to evaluate their inhibitory effect. The best three compounds (AP22408, CID 71463546, and CID 9917321) were chosen for MD simulations after the LibDock docking. Our results show that the compound CID 9917321 produces a more stable protein-ligand complex than the other two currently known inhibitors of the Src SH2 domain. The compound CID 9917321 may therefore be useful for the inhibition of the SH2 domain, based on these computational results. Subsequent experiments are needed to verify the effect of compound CID 9917321 on the SH2 domain in future studies.
Keywords: nonpeptide inhibitor, Src SH2 domain, LibDock, molecular dynamics simulation
Procedia PDF Downloads 269
53 Fault-Detection and Self-Stabilization Protocol for Wireless Sensor Networks
Authors: Ather Saeed, Arif Khan, Jeffrey Gosper
Abstract:
Sensor devices are prone to errors and sudden node failures, which are difficult to detect in a timely manner when deployed in real-time, hazardous, large-scale harsh environments and in medical emergencies. The loss of data can therefore be life-threatening when the sensed phenomenon is not disseminated due to sudden node failure, battery depletion, or temporary malfunctioning. We introduce a set of partial differential equations for localizing faults, similar to Green's and Maxwell's equations used in electrostatics and electromagnetism, together with a node organization and clustering scheme for self-stabilizing sensor networks. Green's theorem is applied to regions where the curve is closed and continuously differentiable to ensure network connectivity. Experimental results show that the proposed GTFD (Green's Theorem fault-detection and self-stabilization) protocol not only detects faulty nodes but also accurately generates network stability graphs where urgent intervention is required for dynamically self-stabilizing the network.
Keywords: Green’s Theorem, self-stabilization, fault-localization, RSSI, WSN, clustering
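The fault-localization task itself can be made concrete with a much simpler baseline than the paper's Green's-theorem formulation (which is not reproduced here): flag any node whose reading deviates sharply from the median of its neighbours' readings, on the assumption that the sensed field varies smoothly across nearby nodes. All grid positions, field values, and thresholds below are invented for illustration:

```python
import math
import random
import statistics

random.seed(0)
# Nodes on a 5x5 grid sensing a smooth field (e.g. temperature);
# one node is faulty and reports an implausible stuck value.
nodes = [(x, y) for x in range(5) for y in range(5)]
field = {p: 20 + 0.5 * p[0] + 0.3 * p[1] + random.gauss(0, 0.1) for p in nodes}
field[(2, 2)] = 55.0  # injected fault

def neighbours(p, radius=1.5):
    """Nodes within communication radius of p (excluding p itself)."""
    return [q for q in nodes if q != p and math.dist(p, q) <= radius]

def faulty(p, threshold=3.0):
    """Flag p if its reading deviates from its neighbours' median;
    the median is robust to a single faulty neighbour."""
    med = statistics.median(field[q] for q in neighbours(p))
    return abs(field[p] - med) > threshold

flagged = [p for p in nodes if faulty(p)]
print(flagged)
```

A protocol like GTFD goes further by integrating such local checks over closed regions and triggering re-clustering around the flagged nodes, which is the self-stabilization step this sketch omits.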
Procedia PDF Downloads 75
52 Photocatalytic Packed-Bed Flow Reactor for Continuous Room-Temperature Hydrogen Release from Liquid Organic Carriers
Authors: Malek Y. S. Ibrahim, Jeffrey A. Bennett, Milad Abolhasani
Abstract:
Despite the potential of hydrogen (H2) storage in liquid organic carriers to achieve carbon neutrality, the energy required for H2 release and the cost of catalyst recycling have hindered its large-scale adoption. In response, a photo flow reactor packed with a rhodium (Rh)/titania (TiO2) photocatalyst was reported for the continuous and selective acceptorless dehydrogenation of 1,2,3,4-tetrahydroquinoline to H2 gas and quinoline under visible light irradiation at room temperature. The tradeoff between the reactor pressure drop and its photocatalytic surface area was resolved by selective in-situ photodeposition of Rh, after packing, on the outer surface of the TiO2 microparticles available to the photon flux, thereby reducing the optimal Rh loading by 10 times compared to a batch reactor while facilitating catalyst reuse and regeneration. An example of using quinoline as a hydrogen acceptor to lower the energy of the hydrogen production step was demonstrated via the water-gas shift reaction.Keywords: hydrogen storage, flow chemistry, photocatalysis, solar hydrogen
Procedia PDF Downloads 9851 Experimental Investigation of Proton Exchange Membrane Fuel Cells Operated with Nano Fiber and Nano Fiber/Nano Particle
Authors: Kevser Dincer, Basma Waisi, M. Ozan Ozdemir, Ugur Pasaogullari, Jeffrey McCutcheon
Abstract:
Nanofibers are defined as fibers with diameters less than 100 nanometers. They can be produced by interfacial polymerization, electrospinning, and electrostatic spinning. In this study, the behaviours of activated carbon nanofiber (ACNF), carbon nanofiber (CNF), polyacrylonitrile/carbon nanotube (PAN/CNT), and polyvinyl alcohol/nano-silver (PVA/Ag) materials in PEM fuel cells are investigated experimentally. These materials were used as the gas diffusion layer (GDL) in PEM fuel cells. When the performances of these cells are compared in a 5 × 5 cm² cell, PVA/Ag exhibits the best performance of all. In this work, the electrical conductivities of the nanofiber and nanofiber/nanoparticle materials were studied to understand their effect on PEM fuel cell performance. According to the experimental results, the fuel cell with the PVA/Ag nanofiber showed the maximum electrical conductivity, while the conductivities of CNF, ACNF, and PAN/CNT are lower. Accordingly, the resistance of the cell with PVA/Ag is lower than that of the cells with PAN/CNT, ACNF, and CNF.Keywords: proton exchange membrane fuel cells, electrospinning, carbon nanofiber, activated carbon nanofiber, PVA fiber, PAN fiber, carbon nanotube, nanoparticle nanocomposites
Procedia PDF Downloads 39150 Effect of Depth on Texture Features of Ultrasound Images
Authors: M. A. Alqahtani, D. P. Coleman, N. D. Pugh, L. D. M. Nokes
Abstract:
In diagnostic ultrasound, the echographic B-scan texture is an important area of investigation since it can be analyzed to characterize the histological state of internal tissues. An important factor requiring consideration when evaluating ultrasonic tissue texture is depth. The attenuation of ultrasound with depth, the size of the region of interest, the gain, and the dynamic range are important variables to consider, as they can influence the analysis of texture features. These sources of variability have to be considered carefully when evaluating image texture, as different settings might influence the resultant image. The aim of this study is to investigate the effect of depth on texture features in vivo using a 3D ultrasound probe. The medial head of the left gastrocnemius muscle of 10 healthy subjects was scanned. Two regions, A and B, were defined at different depths within the gastrocnemius muscle boundary. The size of both ROIs was 280 × 20 pixels, and the distance between regions A and B was kept constant at 5 mm. Texture parameters including gray level, variance, skewness, kurtosis, co-occurrence matrix, run length matrix, gradient, autoregressive (AR) model, and wavelet transform features were extracted from the images. The paired t-test was used to test the depth effect for normally distributed data, and the Wilcoxon-Mann-Whitney test was used for non-normally distributed data. The gray level, variance, and run length matrix were significantly lowered when the depth increased, while the other texture parameters showed similar values at both depths: all texture parameters showed no significant difference between depths A and B (p > 0.05) except for gray level, variance, and run length matrix (p < 0.05). This indicates that gray level, variance, and run length matrix are depth dependent.Keywords: ultrasound image, texture parameters, computational biology, biomedical engineering
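As a hedged illustration of the kind of first-order texture features compared between the two depths (gray level, variance, skewness, kurtosis), the following sketch uses invented pixel values; the ROI data, and the assumption that the deeper region B is darker due to attenuation, are hypothetical:

```python
import statistics

def first_order_features(roi):
    """Mean gray level, variance, skewness, and excess kurtosis of a pixel list."""
    mean = statistics.fmean(roi)
    variance = statistics.pvariance(roi, mean)
    sd = variance ** 0.5 or 1.0            # guard against a constant ROI
    n = len(roi)
    skewness = sum(((p - mean) / sd) ** 3 for p in roi) / n
    kurtosis = sum(((p - mean) / sd) ** 4 for p in roi) / n - 3.0
    return {"gray_level": mean, "variance": variance,
            "skewness": skewness, "kurtosis": kurtosis}

# hypothetical pixel samples: region B lies deeper, so attenuation lowers
# its gray levels (and, in this toy data, its variance)
roi_a = [120, 128, 134, 126, 131, 125, 129, 133]  # shallow region A
roi_b = [88, 95, 99, 92, 97, 90, 94, 98]          # deeper region B

features_a = first_order_features(roi_a)
features_b = first_order_features(roi_b)
```

With such toy data, the deeper region shows the lower mean gray level and variance, mirroring the depth dependence the study reports for those two features.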
Procedia PDF Downloads 29549 Urban Design and Social Capital in Spontaneous Settlements
Abstract:
Rapid urbanization has made spontaneous settlements one of the dominant social subjects of the 21st century. It is now recognized that these territories cannot easily be eradicated and are a way of life for many populations of emerging countries. Since the late 1990s, there has been an urgent concern with finding planning and efficient urban design strategies for poverty reduction, spatial integration, and social inclusion of low-income communities. The article aims to identify, understand, and evaluate the processes of social inclusion through the urban transformation that has been undertaken in Moravia and how they affected the community’s social capital. To achieve this objective, we start by analysing the PPMIM’s planning discourse, in which the concept of sustainability prevails, then identify, through analysis of the project carried out, the urban design strategies implemented and their impact on the perception and experience of the community, and, finally, how these affected social capital. The article relies on concepts such as urban design, social capital, local development, and sustainability. At the urban design level, it starts from the current principles of “making places”, from new urbanism concepts, and from the practices on the ground carried out by a new generation of architects and planners who take a primarily ethical approach in order to create more opportunities and greater social impact in these territories. At the level of social capital and development theory, it relies on authors such as Coleman, Putnam, Kliksberg, and Amartya Sen.
Finally, it aims to address a general discussion about the positive and negative implications of slum upgrading programmes, along with some necessary recommendations so that urban design and social capital can really be translated into real resources for the self-sustainable development of low-income communities and their future generations.Keywords: local and sustainable development, social capital, spontaneous settlements, urban design
Procedia PDF Downloads 49148 Comparing the Detection of Autism Spectrum Disorder within Males and Females Using Machine Learning Techniques
Authors: Joseph Wolff, Jeffrey Eilbott
Abstract:
Autism Spectrum Disorders (ASD) are a spectrum of social disorders characterized by deficits in social communication, verbal ability, and interaction that can vary in severity. In recent years, researchers have used magnetic resonance imaging (MRI) to help detect how neural patterns in individuals with ASD differ from those of neurotypical (NT) controls for classification purposes. This study analyzed the classification of ASD within males and females using functional MRI data. Functional connectivity (FC) correlations among brain regions were used as feature inputs for machine learning algorithms. Analysis was performed on 558 cases from the Autism Brain Imaging Data Exchange (ABIDE) I dataset. When trained specifically on females, the algorithm underperformed in classifying the ASD subset of our testing population. Although the sample size was smaller in the female group, the manual matching of the male and female training groups helps explain the algorithm’s bias, suggesting sex-related alterations in the functional brain networks of individuals with ASD compared to typically developing peers. These results highlight the importance of taking sex into account when considering how generalizations of findings on males with ASD apply to females.Keywords: autism spectrum disorder, machine learning, neuroimaging, sex differences
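The feature-extraction step described above, vectorizing functional connectivity correlations for a classifier, can be sketched as follows. The synthetic time series, group sizes, coupling values, and the nearest-centroid stand-in for the study’s actual classifiers are all invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

def fc_features(timeseries):
    """Upper triangle of the region-by-region correlation (FC) matrix, flattened."""
    corr = np.corrcoef(timeseries)          # shape: (regions, regions)
    upper = np.triu_indices_from(corr, k=1)
    return corr[upper]

def make_subject(coupling):
    """Synthetic subject: 10 regions x 50 timepoints, with tunable coupling
    between regions 0 and 1 standing in for a group difference (invented)."""
    ts = rng.standard_normal((10, 50))
    ts[1] = coupling * ts[0] + (1 - coupling) * ts[1]
    return fc_features(ts)

X = np.array([make_subject(0.8) for _ in range(10)]    # "NT-like" subjects
             + [make_subject(0.1) for _ in range(10)]) # "ASD-like" subjects
y = np.array([0] * 10 + [1] * 10)

# nearest-centroid classifier: a minimal stand-in for the study's models
centroids = np.array([X[y == c].mean(axis=0) for c in (0, 1)])
distances = ((X[:, None, :] - centroids) ** 2).sum(axis=2)
predictions = np.argmin(distances, axis=1)
accuracy = (predictions == y).mean()        # training accuracy on toy data
```

Each subject is reduced to the 45 unique pairwise correlations of 10 regions; real FC pipelines work the same way, just with many more regions and a held-out test split.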
Procedia PDF Downloads 20947 Automatic Adjustment of Thresholds via Closed-Loop Feedback Mechanism for Solder Paste Inspection
Authors: Chia-Chen Wei, Pack Hsieh, Jeffrey Chen
Abstract:
Surface Mount Technology (SMT) is widely used in electronic assembly, in which electronic components are mounted onto the surface of a printed circuit board (PCB). Defects in the SMT process are mainly related to the quality of solder paste printing, and they lead to considerable manufacturing costs in the electronics assembly industry. Therefore, the solder paste inspection (SPI) machine, which controls and monitors the amount of solder paste printed, has become an important part of the production process. To date, SPI thresholds have been set using statistical analysis and expert experience to determine appropriate settings. Because the production data are not normally distributed and there are various variations in the production processes, defects related to solder paste printing still occur. To solve this problem, this paper proposes an online machine learning algorithm, called the automatic threshold adjustment (ATA) algorithm, together with a closed-loop architecture in the SMT process to determine the best threshold settings. Simulation experiments show that the proposed threshold settings improve accuracy from 99.85% to 100%.Keywords: big data analytics, Industry 4.0, SPI threshold setting, surface mount technology
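A minimal sketch of how a closed-loop threshold update might work (an invented illustration, not the paper’s actual ATA algorithm; the step size, bounds, and feedback labels are hypothetical): downstream verification reports whether each SPI call was a false alarm or a missed defect, and the solder volume threshold is nudged accordingly.

```python
# Hedged sketch of a closed-loop threshold update. NOT the paper's ATA
# algorithm: step size, bounds, and feedback labels are invented.
def adjust_threshold(threshold, feedback, step=0.5, lo=60.0, hi=95.0):
    """feedback: 'false_alarm' (threshold too strict), 'escape' (too loose),
    or anything else (no change). Returns the clamped, updated threshold."""
    if feedback == "false_alarm":
        threshold -= step      # loosen: good paste was flagged as defective
    elif feedback == "escape":
        threshold += step      # tighten: defective paste passed inspection
    return min(hi, max(lo, threshold))

threshold = 80.0               # % of nominal solder paste volume (invented)
for fb in ["false_alarm", "false_alarm", "escape", "ok", "false_alarm"]:
    threshold = adjust_threshold(threshold, fb)
```

The closed loop replaces a fixed, expert-chosen threshold with one that tracks the (non-normal, drifting) production data, which is the core idea behind ATA.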
Procedia PDF Downloads 11646 Cosmic Muon Tomography at the Wylfa Reactor Site Using an Anti-Neutrino Detector
Authors: Ronald Collins, Jonathon Coleman, Joel Dasari, George Holt, Carl Metelko, Matthew Murdoch, Alexander Morgan, Yan-Jie Schnellbach, Robert Mills, Gareth Edwards, Alexander Roberts
Abstract:
The VIDARR prototype anti-neutrino detector was deployed at the Wylfa Magnox Power Plant between 2014 and 2016. It comprises extruded plastic scintillating bars measuring 4 cm × 1 cm × 152 cm and utilises wavelength-shifting fibres (WLS) and multi-pixel photon counters (MPPCs) to detect and quantify radiation. During deployment, it took cosmic muon data in accidental coincidence with the anti-neutrino measurements, with the power plant site buildings obscuring the muon sky. Cosmic muons have a significantly higher probability of being attenuated and/or absorbed by denser objects, so one-sided cosmic muon tomography was utilised to image the reactor site buildings. In order to achieve clear building outlines, a control data set was taken at the University of Liverpool from 2016 to 2018, in which the cosmic muon flux was minimally occluded by dense objects. By taking the ratio of these two data sets and using GEANT4 simulations, it is possible to perform a one-sided cosmic muon tomography analysis. This analysis can be used to discern specific buildings, building heights, and features at the Wylfa reactor site, including the reactor core and its shielding, using ∼3 hours’ worth of cosmic-ray detector live time. This result demonstrates the feasibility of using cosmic muon analysis to determine a segmented detector’s location with respect to surrounding buildings, assisted by aerial photography or satellite imagery.Keywords: anti-neutrino, GEANT4, muon, tomography, occlusion
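The ratio analysis described above can be sketched in miniature as follows; all per-bin counts are invented, and the toy bins tracks in a single zenith angle, whereas the real analysis works in two angles with GEANT4 corrections:

```python
# Hedged toy sketch of one-sided muon tomography via a site/control ratio.
# Counts per zenith-angle bin are invented for illustration.
site_counts    = {0: 950, 10: 900, 20: 310, 30: 280, 40: 700}   # reactor site run
control_counts = {0: 1000, 10: 980, 20: 940, 30: 880, 40: 760}  # open-sky control

def transmission_map(site, control):
    """Ratio of site to control counts per angular bin (~1.0 = unobstructed)."""
    return {angle: site[angle] / control[angle] for angle in control}

def occluded_bins(ratios, cutoff=0.5):
    """Bins where the flux ratio drops below the cutoff: dense material in view."""
    return sorted(angle for angle, r in ratios.items() if r < cutoff)

ratios = transmission_map(site_counts, control_counts)
dense = occluded_bins(ratios)   # angular bins behind dense structures
```

In this toy, the suppressed bins mark the lines of sight blocked by a dense object, which is how building outlines and the reactor shielding emerge in the full two-dimensional analysis.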
Procedia PDF Downloads 18645 Design and Evaluation of Oven Type Furnace Using Earth Materials for Roasting Foods
Authors: Jeffrey Cacho, Sherwin Reyes
Abstract:
The research targeted enhancing energy utilization and reducing waste in roasting processes, particularly in Camarines Norte, where Bounty Agro Ventures Incorporated dominates through brands such as Chooks-to-Go, Uling Roaster, and Reyal, and competitors like Andok’s and Baliwag Lechon Manok also share the market. A staggering 90% of these businesses use traditional glass-type roasting furnaces fueled by wood charcoal, leading to significant energy loss and inefficiency due to suboptimal heat conservation; only 10% employ electric ovens. Many available furnaces, typically constructed from industrial materials through welding and other metal-joining techniques, are not energy-efficient, and cost-prohibitive commercial options compel some micro-enterprises to fabricate their own furnaces. The study proposed developing an eco-friendly, cost-effective roasting furnace with excellent heat retention. The distinct design aimed to reduce cooks’ heat exposure and overall fuel consumption. The furnace features an angle-bar frame, a combustion chute for fuel burning, a heat-retaining clay-walled chamber, and a top cover, all contributing to improved energy savings and user safety.Keywords: biomass roasting furnace, heat storage, combustion chute, start-up roasting business
Procedia PDF Downloads 5344 An Investigation of the Barriers to E-Business Implementation in Small and Medium-Sized Enterprises
Authors: Jeffrey Chang, Barun Dasgupta
Abstract:
E-business technologies, whereby business transactions are conducted remotely using the Internet, present unique opportunities and challenges for business. E-business technologies are applicable to a wide range of organizations, and small and medium-sized enterprises (SMEs) are no exception. There is an established body of literature about e-business covering definitions, concepts, benefits, and challenges. In general, however, the research focus has been on larger organizations, not SMEs. In an attempt to redress this balance, this paper looks at e-business technologies specifically from a small business perspective. It seeks to identify the possible barriers that SMEs might face when considering adoption of the e-business concept and practice as part of their business process change initiatives and implementation. To facilitate analysis of these barriers, a conceptual framework has been developed which outlines the key conceptual and practical challenges of e-business implementation in SMEs. It was developed following a literature survey comprising three categories: characteristics of SMEs, issues of IS/IT use in SMEs, and general e-business adoption and implementation issues. The framework is then empirically assessed against seven SMEs that have yet to implement e-business or whose e-business efforts have been unsatisfactory. Conclusions from the case studies can be used to verify the framework and set parameters for further, larger-scale empirical investigation.Keywords: business process change, disruptive technologies, electronic business (e-Business), electronic commerce (e-Commerce), ICT adoption, small and medium sized enterprises (SMEs)
Procedia PDF Downloads 538