Search results for: explosion proof
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 501

141 Combustion Characteristics of Ionized Fuels for Battery System Safety

Authors: Hyeuk Ju Ko, Eui Ju Lee

Abstract:

Many electronic devices today are powered by rechargeable batteries such as lithium-ion cells, but these batteries occasionally undergo thermal runaway and cause fire, explosion, and other hazards. If a battery fire occurs in an electronic device in a vehicle or aircraft cabin, it is important to quickly extinguish the fire and cool the batteries to minimize safety risks. Many researchers have attempted to minimize these risks, but studies on successful extinguishment are limited, because most rechargeable batteries operate in an ionic state during the charge and discharge of electricity, and the reaction of the electrolyte differs greatly from normal combustion. Here, we focused on the effect of ions on reaction stability and pollutant emissions during the combustion process. Understanding ionized fuel combustion is also important for highly efficient and environmentally friendly combustion technologies, which tend to be operated at extreme conditions and hence suffer unintended flame instabilities such as extinction and oscillation. The use of electromagnetic energy and non-equilibrium plasma is one way to address these problems, but applications have been limited by the lack of understanding of excited-ion effects in the combustion process. Therefore, understanding the role of ions during combustion is promising for energy safety, including battery safety. In this study, the effects of an ionized fuel on flame stability and pollutant emissions were experimentally investigated in hydrocarbon jet diffusion flames. The burner used in this experiment consisted of a 7.5 mm diameter fuel tube, and the gaseous fuels were ionized with an ionizer (SUNJE, SPN-11). Methane (99.9% purity) and propane (commercial grade) were used as fuels, and open ambient air was used as the oxidizer.
When the performance of the ionizer was evaluated, the ion densities of both propane and methane increased linearly with volume flow rate, with propane showing a slightly higher ion density than methane. The results show that the overall flame stability and shape, such as flame length, exhibit no significant difference even at higher ion concentrations. However, fuel ionization affects pollutant emissions such as NOx and soot. NOx and CO emissions measured in the post-flame region decreased with increasing fuel ionization, especially at high fuel velocity, i.e., high ion density. TGA analysis and soot morphology by TEM indicate that fuel ionization causes the soot to mature.

Keywords: battery fires, ionization, jet flames, stability, NOx and soot

Procedia PDF Downloads 182
140 Role of Microplastics on Reducing Heavy Metal Pollution from Wastewater

Authors: Derin Ureten

Abstract:

Plastic pollution does not disappear; it gets smaller and smaller through photolysis (caused mainly by the sun's radiation), thermal oxidation, thermal degradation, and biodegradation, which is the action of organisms digesting larger plastics. All plastic pollutants have exceedingly harmful effects on the environment. With the COVID-19 pandemic, the number of plastic products such as masks and gloves flowing into the environment has increased more than ever. However, microplastics are not the only pollutants in water; among the most tenacious and toxic pollutants are heavy metals. Heavy metal solutions are capable of causing a variety of health problems in organisms, such as cancer, organ damage, nervous system damage, and even death. The aim of this research is to show that microplastics can be used in wastewater treatment systems by demonstrating that they can adsorb heavy metals in solution. The experiment will include two media: one containing microplastics in heavy-metal-contaminated water, and one containing only the heavy metal solution. After being sieved, the absorbance of both media will be measured with a spectrometer. Iron(III) chloride (FeCl3) will be used as the heavy metal solution, since the solution becomes darker as the concentration of this substance increases. The experiment will be supported by pure Nile Red powder in order to observe any visible differences under the microscope; Nile Red is a dye that binds to hydrophobic materials such as plastics and lipids. If adsorption can be demonstrated by the final absorbance readings of the solutions and by the visuals provided by the Nile Red powder, the experiment will be repeated at different temperature levels in order to determine the most effective temperature for removing heavy metals from water.
New wastewater treatment systems could then be developed, with the help of microplastics, for water contaminated with heavy metals.
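The spectrometric comparison described above rests on the Beer-Lambert law, A = εlc: a drop in FeCl3 absorbance after microplastic treatment indicates metal removed from solution. A minimal sketch follows; the molar absorptivity, path length, and absorbance readings are hypothetical placeholders, not values from this study.

```python
# Beer-Lambert law: A = epsilon * l * c, so c = A / (epsilon * l).
# All numeric values below are illustrative only; epsilon for FeCl3
# solutions depends on the wavelength used.

def concentration_from_absorbance(absorbance, epsilon, path_length_cm=1.0):
    """Estimate molar concentration (mol/L) from a measured absorbance."""
    return absorbance / (epsilon * path_length_cm)

def removal_percent(a_control, a_treated):
    """Relative drop in absorbance between control and treated samples."""
    return 100.0 * (a_control - a_treated) / a_control

if __name__ == "__main__":
    epsilon = 900.0                      # L mol^-1 cm^-1, hypothetical
    a_control, a_treated = 0.80, 0.60    # hypothetical readings
    print(round(removal_percent(a_control, a_treated), 1))  # 25.0
```

A lower absorbance in the microplastic-treated medium would translate, via this proportionality, into a lower remaining metal concentration.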

Keywords: microplastics, heavy metal, pollution, adsorption, wastewater treatment

Procedia PDF Downloads 83
139 AI Peer Review Challenge: Standard Model of Physics vs 4D GEM EOS

Authors: David A. Harness

Abstract:

The natural evolution of automated theorem proving (ATP) cognitive systems is to meet AI peer review standards. The ATP process of axiom selection from Mizar to prove a conjecture would be further refined, as in all human and machine learning, by solving the real-world problem of the proposed AI peer review challenge: determine which conjecture forms the higher-confidence-level constructive proof between the Standard Model of Physics SU(n) lattice gauge group operation and the present non-standard 4D GEM EOS SU(n) lattice gauge group spatially extended operation, in which the photon and electron are the first two trace angular momentum invariants of a gravitoelectromagnetic (GEM) energy-momentum density tensor wavetrain integration spin-stress pressure-volume equation of state (EOS), initiated via 32 lines of Mathematica code. The resulting gravitoelectromagnetic spectrum ranges from compressive through rarefactive of the central cosmological-constant vacuum energy density, in units of pascals. Said self-adjoint group operation operates exclusively on the stress-energy-momentum tensor of the Einstein field equations, introducing quantization directly at the 4D spacetime level, essentially reformulating the Yang-Mills quantization of the vacuum by virtual superpositioned-particle compounded lattice gauge groups into a single hyper-complex multi-valued GEM U(1) × SU(1,3) lattice gauge group Planck spacetime mesh quantization of the vacuum. Thus the Mizar corpus already contains all of the axioms required for relevant DeepMath premise selection and unambiguous formal natural language parsing in context deep learning.

Keywords: automated theorem proving, constructive quantum field theory, information theory, neural networks

Procedia PDF Downloads 176
138 Research Trends in Using Virtual Reality for the Analysis and Treatment of Lower-Limb Musculoskeletal Injury of Athletes: A Literature Review

Authors: Hannah K. M. Tang, Muhammad Ateeq, Mark J. Lake, Badr Abdullah, Frederic A. Bezombes

Abstract:

There is little research applying virtual reality (VR) to the treatment of musculoskeletal injury in athletes, despite the prevalence of such injuries and their implications for physical and psychological health. Nevertheless, developments in wireless VR headsets better facilitate dynamic movement in VR environments (VREs), and more research is expected in this emerging field. This systematic review identified publications that used VR interventions for the analysis or treatment of lower-limb musculoskeletal injury of athletes. It established a search protocol and, through narrative discussion, identified existing trends. Database searches encompassed four term sets: 1) VR systems; 2) musculoskeletal injuries; 3) sporting population; 4) movement outcome analysis. Overall, a total of 126 publications were identified through database searching, and twelve were included in the final analysis and discussion. Many of the studies were pilot and proof-of-concept work. Seven of the twelve publications were observational studies; however, these may provide preliminary data from which clinical trials will branch. Where specified, the focus of the literature was very narrow, with very similar population demographics and injuries. The trends in the literature emphasised the role of VR and attentional focus, the strategic manipulation of movement outcomes, and the transfer of skill to the real world. Causal inferences may have been undermined by flaws, as most studies were limited by the practicality of conducting a two-factor clinical VR-based study. In conclusion, by assessing the exploratory studies, and combining this with the use of numerous developments, techniques, and tools, a novel application could be established to utilise VR with dynamic movement for the effective treatment of specific musculoskeletal injuries of athletes.

Keywords: athletes, lower-limb musculoskeletal injury, rehabilitation, return-to-sport, virtual reality

Procedia PDF Downloads 227
137 Comparison of Spiking Neuron Models in Terms of Biological Neuron Behaviours

Authors: Fikret Yalcinkaya, Hamza Unsal

Abstract:

To understand how neurons work, experimental studies in neural science must be combined with numerical simulations of neuron models in a computer environment. In this regard, the simplicity and applicability of spiking neuron model functions have been of great interest in computational neuroscience in recent years. Spiking neuron models can be classified by the neuronal behaviours they exhibit, such as spiking and bursting; these classifications are important for researchers working in theoretical neuroscience. In this paper, three spiking neuron models based on systems of first-order differential equations are discussed and compared: Izhikevich, Adaptive Exponential Integrate-and-Fire (AEIF), and Hindmarsh-Rose (HR). First, the physical meaning of each model and its differential equations are presented and simulated in the Matlab environment. Then, by selecting appropriate parameters, the models were examined visually in Matlab, with the aim of demonstrating which model can reproduce well-known biological neuron behaviours such as tonic spiking, tonic bursting, mixed-mode firing, spike frequency adaptation, resonator, and integrator behaviour. As a result, the Izhikevich model was shown to reproduce regular spiking, chattering (continuous bursting), intrinsically bursting, thalamo-cortical, low-threshold spiking, and resonator behaviours. The Adaptive Exponential Integrate-and-Fire model was able to produce firing patterns such as regular spiking, adaptive spiking, initial bursting, regular bursting, delayed spiking, delayed regular bursting, transient spiking, and irregular spiking. The Hindmarsh-Rose model showed three different dynamic neuron behaviours: spiking, bursting, and chaotic firing.
From these results, the Izhikevich cell model may be preferred for its ability to reflect the true behaviour of the nerve cell, to produce different types of spikes, and for its suitability for use in larger-scale brain models. The most important reason for choosing the Adaptive Exponential Integrate-and-Fire model is that it can create rich firing patterns with fewer parameters. The chaotic behaviour of the Hindmarsh-Rose neuron model, like that of some other chaotic systems, may find use in scientific and engineering applications such as physics, secure communication, and signal processing.
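For illustration, the Izhikevich model discussed above can be integrated with a simple forward-Euler scheme. The sketch below is written in Python rather than the authors' Matlab, and uses Izhikevich's published regular-spiking parameter set (a=0.02, b=0.2, c=-65, d=8) with an assumed constant input current, not parameters taken from this paper.

```python
def izhikevich(a=0.02, b=0.2, c=-65.0, d=8.0, I=10.0, T=1000.0, dt=0.25):
    """Forward-Euler simulation of the Izhikevich model; returns spike times (ms).

    v' = 0.04 v^2 + 5 v + 140 - u + I
    u' = a (b v - u)
    with reset v <- c, u <- u + d when v reaches 30 mV.
    """
    v, u = c, b * c          # resting initial conditions
    spikes, t = [], 0.0
    while t < T:
        v += dt * (0.04 * v * v + 5.0 * v + 140.0 - u + I)
        u += dt * a * (b * v - u)
        if v >= 30.0:        # spike detected: record and reset
            spikes.append(t)
            v, u = c, u + d
        t += dt
    return spikes

if __name__ == "__main__":
    spikes = izhikevich()
    print(len(spikes), "spikes in 1 s of tonic spiking")
```

With this parameter set the model fires tonically, and the lengthening interspike intervals at the start of the train show the spike-frequency adaptation behaviour mentioned above.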

Keywords: Izhikevich, adaptive exponential integrate fire, Hindmarsh Rose, biological neuron behaviours, spiking neuron models

Procedia PDF Downloads 177
136 Identification of Body Fluid at the Crime Scene by DNA Methylation Markers for Use in Forensic Science

Authors: Shirin Jalili, Hadi Shirzad, Mahasti Modarresi, Samaneh Nabavi, Somayeh Khanjani

Abstract:

Identifying the source tissue of biological material found at crime scenes can be very informative in a number of cases. Despite their usefulness, current visual, catalytic, enzymatic, and immunologic tests for presumptive and confirmatory tissue identification are applicable only to a subset of samples, may suffer limitations such as low specificity and lack of sensitivity, and are substantially impacted by environmental insults. In addition, their results are operator-dependent. Recently, the possibility of discriminating body fluids using mRNA expression differences between tissues has been described, but the lack of long-term stability of that molecule and the need to normalize samples for each individual are limiting factors. The use of DNA should solve these issues because of its long-term stability and its specificity to each body fluid. Cells in the human body have a unique epigenome, which includes differences in DNA methylation in the promoters of genes. DNA methylation, which occurs at the 5′-position of the cytosine in CpG dinucleotides, has great potential for forensic identification of body fluids, because tissue-specific patterns of DNA methylation have been demonstrated, and DNA is less prone to degradation than proteins or RNA. Previous studies have reported several body fluid-specific DNA methylation markers. The presence or absence of a methyl group on the 5′ carbon of the cytosine pyrimidine ring in CpG dinucleotide regions called 'CpG islands' dictates whether the gene is expressed or silenced in the particular body fluid. Methylation patterns at tissue-specific differentially methylated regions (tDMRs) have been described as stable and specific, making them excellent markers for tissue identification. The results demonstrate that methylation-based tissue identification is more than a proof of concept. The methodology holds promise as another viable forensic DNA analysis tool for the characterization of biological materials.

Keywords: DNA methylation, forensic science, epigenome, tDMRs

Procedia PDF Downloads 423
135 Normal Weight Obesity among Female Students: BMI as a Non-Sufficient Tool for Obesity Assessment

Authors: Krzysztof Plesiewicz, Izabela Plesiewicz, Krzysztof Chiżyński, Marzenna Zielińska

Abstract:

Background: Obesity is an independent risk factor for cardiovascular diseases. Several anthropometric parameters have been proposed to estimate the level of obesity, but there is as yet no agreement on which one is the best predictor of cardiometabolic risk. Scientists have identified metabolically obese normal-weight individuals, who suffer from the same metabolic abnormalities as obese individuals, and have termed this syndrome normal weight obesity (NWO). Aim of the study: The aim of our study was to determine the occurrence of overweight and obesity in a cohort of young adult women, using standard and complementary methods of obesity assessment, and to identify those who are at risk of obesity. A second aim was to test additional methods of obesity assessment and to show that body mass index (BMI) used alone is not a sufficient parameter for obesity assessment. Materials and methods: 384 young women, aged 18-32, were enrolled into the study. Standard anthropometric parameters (waist-to-hip ratio (WTH), waist-to-height ratio (WTHR)) and two other methods of body fat percentage measurement (BFPM) were used in the study: electrical bioimpedance analysis (BIA) and a skinfold measurement test with a digital body fat clipper (SFM). Results: In the study group, 5% and 7% of participants had waist-to-hip and waist-to-height ratio values, respectively, associated with visceral obesity. According to BMI, 14% of participants were overweight or obese. Using the additional methods of body fat assessment, 54% and 43% of participants were classified as obese by the BIA and SFM methods, respectively. Among participants with normal BMI or underweight (not overweight, n=340), there were individuals with BFPM above the upper limit: 49% (n=164) by BIA and 36% (n=125) by SFM. Statistical analysis revealed a strong correlation between the BIA and SFM methods. Conclusion: BMI used alone is not a sufficient parameter for obesity assessment.
A high percentage of young women with normal BMI values appear to be normal weight obese.
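The conclusion that BMI alone misses normal weight obesity can be illustrated with a small classifier. The 30% body-fat cutoff below is a commonly cited threshold for women, assumed here purely for illustration; it is not a value reported by this study.

```python
def bmi(weight_kg, height_m):
    """Body mass index: weight (kg) divided by height (m) squared."""
    return weight_kg / height_m ** 2

def waist_to_height(waist_cm, height_cm):
    """Waist-to-height ratio (WTHR); values above ~0.5 suggest central obesity."""
    return waist_cm / height_cm

def is_normal_weight_obese(weight_kg, height_m, body_fat_pct, bf_cutoff=30.0):
    """NWO: BMI in the normal range (18.5-24.9) but body fat above the cutoff.

    The 30% cutoff is an assumed illustrative threshold for women.
    """
    b = bmi(weight_kg, height_m)
    return 18.5 <= b < 25.0 and body_fat_pct > bf_cutoff

if __name__ == "__main__":
    # A woman with normal BMI (~22) but 33% body fat is flagged as NWO,
    # although a BMI-only screen would call her "normal".
    print(is_normal_weight_obese(60.0, 1.65, 33.0))
```

This is exactly the pattern the study reports: a normal BMI coexisting with an elevated body fat percentage measured by BIA or skinfold methods.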

Keywords: electrical bioimpedance, normal weight obesity, skin-fold measurement test, women

Procedia PDF Downloads 268
134 Liquidity Risk of Banks in Light of a Dominant Share of Foreign Capital in the Polish Banking Sector

Authors: Karolina Patora

Abstract:

This article investigates liquidity risk management by banks, which has gained significant importance since the global financial crisis of 2008. The issue is of particular interest for countries like Poland, in which foreign capital plays a dominant role. Such an ownership structure poses certain risks to the local banking sector, which faces an increased probability of the withdrawal of funding or assets’ transfers abroad in case of a crisis. Both these factors can have a detrimental influence on the liquidity position of foreign-owned banks and hence negatively affect the financial stability of the whole banking sector. The aim of this study is to evaluate the impact of a dominating share of foreign investors in the Polish banking sector on the liquidity position of commercial banks. The study hypothesizes that the ownership structure of the Polish banking sector, in which there are banks predominantly controlled by foreign investors, does not pose a threat to the liquidity position of Polish banks. A supplementary research hypothesis is that the liquidity risk profile of foreign-owned banks differs from that of domestic banks. The sample consists of 14 foreign-owned banks and 5 domestic banks owned by local investors, which together constitute approximately 87% of the banking sector’s assets. The data covers the period of 2004–2014. The results of the regression models show no evidence of significant differences in terms of the dynamics of changes of the liquidity buffers between the foreign-owned and domestic banks, although the signs of the coefficients might suggest that the foreign-owned banks were decreasing the holdings of liquid assets at a slower pace over the examined period, compared to the domestic banks. However, no proof of the statistical significance of these findings has been found. The supplementary research hypothesis that the liquidity risk profile of foreign-controlled banks differs from that of domestic banks was rejected.

Keywords: foreign-owned banks, liquidity position, liquidity risk, financial stability

Procedia PDF Downloads 290
133 Proof of Concept of Video Laryngoscopy Intubation: Potential Utility in the Pre-Hospital Environment by Emergency Medical Technicians

Authors: A. Al Hajeri, M. E. Minton, B. Haskins, F. H. Cummins

Abstract:

Pre-hospital endotracheal intubation is fraught with difficulties; one solution offered has been video laryngoscopy (VL), which permits better visualization of the glottis than the standard method of direct laryngoscopy (DL). This method has resulted in a higher first-attempt success rate and fewer failed intubations. However, VL has mainly been evaluated by experienced providers such as anesthetists, and as such the utility of this device for those who infrequently intubate has not been thoroughly assessed. We sought to evaluate this equipment to determine whether, in the hands of novice providers, it could prove an effective airway management adjunct. DL and two VL methods (C-MAC with distal screen, C-MAC with attached screen) were evaluated through simulated practice on a Laerdal airway management trainer manikin. Twenty Emergency Medical Technicians (basic) were recruited as novice practitioners. This group was used to eliminate bias, as these clinicians had no pre-hospital experience of intubation, although they did have basic airway skills. The following areas were assessed: time taken to intubate, number of attempts required to successfully intubate, and ease of use of the equipment. VL with the attached screen took on average longer for novice clinicians to achieve successful intubation, had a lower success rate, and received a higher difficulty rating than DL. However, VL with the distal screen and DL were comparable in intubation times, success rate, gastric inflation rate, and user-rated difficulty. This study suggests that routine use of VL by inexperienced clinicians would offer no added benefit over DL. Further studies are required to determine whether paramedic-level Emergency Medical Technicians would benefit from this airway adjunct, and to ascertain whether, after initial mastery of VL with a distal screen, lower intubation times and difficulty ratings may be achievable.

Keywords: direct laryngoscopy, endotracheal intubation, pre-hospital, video laryngoscopy

Procedia PDF Downloads 406
132 A Pragmatic Approach of Memes Created in Relation to the COVID-19 Pandemic

Authors: Alexandra-Monica Toma

Abstract:

Internet memes are an element of computer-mediated communication and an important part of online culture that combines text and image in order to generate meaning. The term, coined by Richard Dawkins, refers to more than a mere way to briefly communicate ideas or emotions; it names a complex and intensely perpetuated phenomenon in the virtual environment. This paper approaches memes as a cultural artefact and a virtual trope that mirrors societal concerns and issues, and analyses the pragmatics of their use. Memes have to be analysed in series, usually relating to some image macro, which is proof of the interplay between imitation and creativity in the meme-writing process. We believe that their potential to become viral relates to three key elements: adaptation to context, reference to a successful meme series, and humour (jokes, irony, sarcasm) with various pragmatic functions. The study also uses the concept of multimodality and stresses how a meme's text interacts with its image, discussing three types of relations: symmetry, amplification, and contradiction. Moreover, the paper shows that memes can be employed as speech acts with illocutionary force when the interaction between text and image is enriched through connection to a specific situation. The features mentioned above are analysed in a corpus of memes related to the COVID-19 pandemic. This corpus shows them to be highly adaptable to context, which helps build a feeling of connection and belonging in an otherwise tremendously fragmented world. Some are created from well-known image macros, and their humour results from an intricate dialogue between texts and contexts. Memes created in relation to the COVID-19 pandemic can be considered speech acts and are often used as such, as shown in the paper.
Consequently, this paper tackles the key features of memes, offers a thorough analysis of the memes' sociocultural, linguistic, and situational context, and emphasizes their intertextuality, with special attention to their illocutionary potential.

Keywords: context, memes, multimodality, speech acts

Procedia PDF Downloads 196
131 A Benchmark System for Testing Medium Voltage Direct Current (MVDC-CB) Robustness Utilizing Real Time Digital Simulation and Hardware-In-Loop Theory

Authors: Ali Kadivar, Kaveh Niayesh

Abstract:

The integration of green energy resources is a major focus, and the role of Medium Voltage Direct Current (MVDC) systems is expanding rapidly. However, the protection of MVDC systems against DC faults is a challenge with consequences for reliable and safe grid operation. This challenge reveals the need for MVDC circuit breakers (MVDC CBs), which are still in their infancy; consequently, there is a lack of MVDC CB standards, including thresholds for acceptable power losses and operation speed. To establish a baseline for comparison purposes, a benchmark system for testing future MVDC CBs is vital. The literature typically gives only the timing sequence of each switch, with the emphasis on topology and without in-depth study of the DCCB control algorithm, as circuit breaker control systems are not yet systematic. A digital testing benchmark is designed for proof-of-concept simulation studies using software models; it can validate studies based on real-time digital simulators and Transient Network Analyzer (TNA) models. The proposed experimental setup acquires data from accurate sensors installed on the tested MVDC CB through the general-purpose inputs/outputs (GPIO) of a microcontroller and a PC. Prototype studies on laboratory-based models are achieved using Hardware-in-the-Loop (HIL) equipment connected to real-time digital simulators. The improved circuit breaker control algorithm can reduce the peak fault current and avoid arc reignition, helping the coordination of DCCBs in relay protection. Moreover, several research gaps are identified regarding case studies and evaluation approaches.
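Why breaker operation speed matters for peak fault current can be illustrated with the prospective fault current of a series R-L DC loop, i(t) = (V/R)(1 - exp(-Rt/L)). The sketch below uses illustrative placeholder values for voltage, resistance, and inductance; they are not parameters of the proposed benchmark.

```python
import math

def dc_fault_current(t, v_source=10e3, r=0.5, l=5e-3):
    """Prospective fault current (A) of a series R-L DC loop at time t (s):
    i(t) = (V/R) * (1 - exp(-R*t/L)).  All parameters are illustrative."""
    return (v_source / r) * (1.0 - math.exp(-r * t / l))

if __name__ == "__main__":
    # A breaker that opens at 2 ms interrupts far less current than one
    # that opens at 8 ms, since the current is still rising toward V/R.
    print(round(dc_fault_current(2e-3)), round(dc_fault_current(8e-3)))
```

Because a DC fault current has no natural zero crossing, the breaker must interrupt on this rising exponential, which is why operation speed is a key figure of merit for MVDC CBs.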

Keywords: DC circuit breaker, hardware-in-the-loop, real time digital simulation, testing benchmark

Procedia PDF Downloads 75
130 Evaluation of Simulated Noise Levels through the Analysis of Temperature and Rainfall: A Case Study of Nairobi Central Business District

Authors: Emmanuel Yussuf, John Muthama, John Ng'ang'a

Abstract:

Noise levels have been increasing all over the world in the last decade. Many factors contribute to this increase, which is causing health-related effects in humans. Developing countries are not left out of this picture, as they are still growing and developing. Motor vehicles are increasing on urban roads, infrastructure is expanding due to the rising population, and the number of industries providing goods keeps growing, among many other activities. All these activities lead to high noise levels in cities. This study was conducted in Nairobi's Central Business District (CBD) with the main objective of simulating noise levels in order to understand the noise exposure of people within the urban area in relation to the weather parameters of temperature, rainfall, and wind field. The study used the Neighbourhood Proximity Model and time series analysis, with data obtained from proxies and satellite remote sensing, to establish the noise levels to which people in the Nairobi CBD are exposed. The findings showed an increase in temperature (0.1°C per year) and a decrease in precipitation (40 mm per year), while noise levels in the area are increasing. The study also found that the noise levels to which people in the Nairobi CBD are exposed were roughly between 61 and 63 decibels and have been increasing; this level is high and likely to cause adverse physical and psychological effects on the human body, with air temperature, precipitation, and wind contributing substantially to the propagation of noise. As noise reduction measures, the use of soundproof materials in buildings close to busy roads, enforcement of strict regulations on the main emitting sources, and further research on the subject were recommended.
The data used for this study ranged from the year 2000 to 2015, with rainfall in millimeters (mm), temperature in degrees Celsius (°C), and urban form characteristics in meters (m).
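Per-year trends of the kind quoted above (e.g., 0.1°C per year) are what an ordinary least-squares slope over an annual series yields. A minimal sketch follows; the sample values are made up for illustration and are not the study's data.

```python
def linear_trend(years, values):
    """Ordinary least-squares slope: change in `values` per unit of `years`."""
    n = len(years)
    mean_y = sum(years) / n
    mean_v = sum(values) / n
    num = sum((y - mean_y) * (v - mean_v) for y, v in zip(years, values))
    den = sum((y - mean_y) ** 2 for y in years)
    return num / den

if __name__ == "__main__":
    # Hypothetical annual noise levels (dB): a steady 0.5 dB/year rise.
    years = [2000, 2001, 2002, 2003]
    noise_db = [61.0, 61.5, 62.0, 62.5]
    print(linear_trend(years, noise_db))  # 0.5
```

Applying the same slope calculation to annual temperature and rainfall series gives trends directly comparable to the figures reported in the abstract.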

Keywords: simulation, noise exposure, weather, proxy

Procedia PDF Downloads 374
129 Quantification and Evaluation of Tumors Heterogeneity Utilizing Multimodality Imaging

Authors: Ramin Ghasemi Shayan, Morteza Janebifam

Abstract:

Tumors are frequently inhomogeneous: regional variations in necrosis, metabolic activity, proliferation, and vascularity are observed. There is increasing evidence that solid tumors may contain subpopulations of cells with different genotypes and phenotypes. These distinct populations of cancer cells can interact in complex ways and may differ in sensitivity to drugs. Most tumors show biological heterogeneity, including heterogeneity of genomic subtypes, variations in the expression of growth factors and pro- and anti-angiogenic factors, and variations within the tumoural microenvironment. These can present as differences between the tumors of different individuals. For example, O6-methylguanine-DNA methyltransferase, a DNA repair enzyme, is silenced by methylation of the gene promoter in half of glioblastomas (GBM), contributing to chemosensitivity and improved survival. There has been particular interest in the use of diffusion-weighted imaging (DWI) and dynamic contrast-enhanced MRI (DCE-MRI). DWI sensitizes MRI to water diffusion within the extravascular extracellular space (EES) and is therefore modulated by the size and configuration of the cell population. DCE-MRI uses dynamic acquisition of images during and after the injection of an intravenous contrast agent; the signal changes are then converted to absolute concentrations of contrast, allowing analysis with pharmacokinetic models. PET provides unique biological specificity, allowing dynamic or static imaging of biological molecules labeled with positron-emitting isotopes (e.g., 15O, 18F, 11C). The technique involves a considerable radiation dose, which limits repeated measurements, particularly when used together with computed tomography (CT).
Finally, it is of great interest to measure regional hemoglobin status, which could be combined with DCE-CT vascular physiology measurements to provide significant insights into tumor hypoxia.
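One widely used pharmacokinetic model for DCE-MRI concentration curves of the kind described above is the standard Tofts model, Ct(t) = Ktrans * integral of Cp(tau) * exp(-kep (t - tau)) dtau. The sketch below evaluates it by discrete convolution; the Ktrans, kep, and plasma-input values are made-up placeholders, not values from any study.

```python
import math

def tofts_tissue_concentration(t_axis, cp, ktrans, kep):
    """Standard Tofts model evaluated by discrete convolution:
    Ct(t_i) ~= Ktrans * sum_j Cp(t_j) * exp(-kep * (t_i - t_j)) * dt,
    for a uniformly sampled time axis t_axis and plasma input cp."""
    dt = t_axis[1] - t_axis[0]
    ct = []
    for i, t in enumerate(t_axis):
        acc = sum(cp[j] * math.exp(-kep * (t - t_axis[j])) for j in range(i + 1))
        ct.append(ktrans * acc * dt)
    return ct

if __name__ == "__main__":
    dt = 0.05
    t_axis = [k * dt for k in range(401)]   # 0..20 min, hypothetical
    cp = [1.0] * len(t_axis)                # constant plasma input (placeholder)
    ct = tofts_tissue_concentration(t_axis, cp, ktrans=0.2, kep=0.5)
    print(round(ct[-1], 2))  # approaches the steady state Ktrans/kep * Cp
```

Fitting Ktrans and kep voxel by voxel is one way such models turn contrast kinetics into maps of regional tumor heterogeneity.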

Keywords: heterogeneity, computed tomography, magnetic resonance imaging, PET

Procedia PDF Downloads 144
128 Between the House and the City: An Investigation of the Structure of the Family/Society and the Role of the Public Housing in Tokyo and Berlin

Authors: Abudjana Babiker

Abstract:

The middle of twenty century witnessed an explosion in public housing. After the great depression, some of the capitalists and communist countries have launched policies and programs to produce public housing in the urban areas. Concurrently, modernity was the leading architecture style at the time excessively supported the production, and principally was the instrument for the success of the public housing program due to the modernism manifesto for manufactured architecture as an international style that serves the society and parallelly connect it to the other design industries which allowed for the production of the architecture elements. After the second world war, public housing flourished, especially in communist’s countries. The idea of public housing was conceived as living spaces at the time, while the Workplaces performed as the place for production and labor. Michel Foucault - At the end of the twenty century- the introduction of biopolitics has had highlighted the alteration in the production and labor inter-function. The house does not precisely perform as the sanctuary, from the production, for the family, it opens the house to be -part of the city as- a space for production, not only to produce objects but to reproduce the family as a total part of the production mechanism in the city. While the public housing kept altering from one country to another after the failure of the modernist’s public housing in the late 1970s, the society continued changing parallelly with the socio-economic condition in each political-economical system, and the public housing thus followed. The family structure in the major cities has been dramatically changing, single parenting and the long working hours, for instance, have been escalating the loneliness in the major cities such as London, Berlin, and Tokyo and the public housing for the families is no longer suits the single lifestyle for the individuals. 
This paper investigates the performance of both the single/individual lifestyle and the family/society structure in Tokyo and Berlin in relation to the utilization of public housing under the economic policies and socio-political environment that produced the individual and the collective. The investigation proceeds through a study of the current individual/society dynamic and through case studies examining how the housing is utilized. The major finding is that the individual and the collective revolve around the city; the city acts as a system that magnetizes and blurs the line between production and reproduction lifestyles. Mass public housing for families is shifting toward a combination of neo-liberal and socialist housing.

Keywords: loneliness, production and reproduction, work-live, public housing

Procedia PDF Downloads 184
127 Use of Giant Magneto Resistance Sensors to Detect Micron to Submicron Biologic Objects

Authors: Manon Giraud, Francois-Damien Delapierre, Guenaelle Jasmin-Lebras, Cecile Feraudet-Tarisse, Stephanie Simon, Claude Fermon

Abstract:

Early diagnosis and the detection of harmful substances at low levels is a growing field of high interest. The ideal test should be cheap, easy to use, quick, reliable, specific, and have a very low detection limit. By combining the high specificity of antibody-functionalized magnetic beads used to immuno-capture biological objects with the high sensitivity of GMR-based sensors, it is possible to detect these biological objects one by one, be it a cancerous cell, a bacterium, or a disease biomarker. The simplicity of the detection process makes it usable even by untrained staff. Giant magnetoresistance (GMR) is an effect consisting of a modification of the electrical resistance of certain conductive layer stacks when exposed to a magnetic field. This effect allows the detection of very small magnetic field variations (typically a few tens of nanotesla). Magnetic nanobeads coated with antibodies targeting the analytes are mixed with a biological sample (blood, saliva) and incubated for 45 min. The mixture is then injected into a very simple microfluidic chip and circulates above a GMR sensor that detects changes in the surrounding magnetic field. Individual magnetic particles do not create a field sufficient to be detected; therefore, only biological objects surrounded by several antibody-functionalized magnetic beads (captured by the complementary antigens) are detected as they move above the sensor. A proof of concept has been carried out on NS1 mouse cancerous cells diluted in PBS and bound to 200 nm magnetic particles. Signals were detected in cell-containing samples, while none were recorded for negative controls. A binary response was hence established for this first biological model. The precise quantification of the analytes and their detection in highly diluted solutions is the step now in progress.
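The readout described above, where bead-labeled objects passing over the sensor produce transient field changes of a few tens of nanotesla, reduces computationally to counting brief excursions in a noisy trace. A minimal sketch of such event counting (hypothetical signal shape, threshold, and gap values, not the authors' acquisition chain):

```python
def count_events(signal, threshold, min_gap=1):
    """Count transient peaks above threshold, merging samples closer than min_gap."""
    events, last = 0, -min_gap - 1
    for i, v in enumerate(signal):
        if v > threshold:
            if i - last > min_gap:
                events += 1  # new excursion: far enough from the previous one
            last = i
    return events

# Synthetic trace: flat baseline with three bead-passage pulses
trace = [0.0] * 100
for center in (20, 50, 80):
    for k in range(-2, 3):
        trace[center + k] += 1.0

print(count_events(trace, threshold=0.5, min_gap=5))  # 3 pulses detected
```

In a real signal chain the baseline would carry sensor noise, so the threshold would be set a few standard deviations above the measured baseline rather than fixed in advance.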

Keywords: early diagnosis, giant magnetoresistance, lab-on-a-chip, submicron particle

Procedia PDF Downloads 245
126 Occurrence of Pharmaceutical Compounds in an Urban Lake

Authors: J. D. Villanueva, N. Peyraube, I. Allan, G. D. Salvosa, M. Reid, C. Harman, K. D. Salvosa, J. M. V. Castro, M. V. O. Espaldon, J. B. Sevilla-Nastor, P. Le Coustumer

Abstract:

The main objectives of this research are to (1) assess the occurrence of pharmaceutical compounds and (2) present the environmental challenges posed by the presence of these compounds in surface water. The pharmaceuticals were measured in Napindan Lake, Philippines. This lake is not only a major tributary of the Pasig River (an estuary) and Laguna Lake (freshwater); it also joins these two important surface waters of the National Capital Region. The pharmaceutical compounds Atenolol, Carbamazepine, and two over-the-counter medicines, Cetirizine and Ibuprofen, were measured in Napindan Lake. Atenolol is a beta blocker that helps lower hypertension. Carbamazepine is an anticonvulsant used as a treatment for epilepsy and neuropathic pain. Cetirizine is an antihistamine that can relieve allergies. Ibuprofen is a non-steroidal anti-inflammatory drug normally used to relieve pain. Three different climatological conditions, with their corresponding hydro-physicochemical characteristics, were considered: first, a dry season with simultaneous dredging; second, a transition period from dry to wet season; and third, a continuous wet event. Based on the results of the study, most of these pharmaceuticals can be found in Napindan Lake. This is proof that these pharmaceutical compounds are being released into a natural surface water. Even though the climatological conditions differed, concentrations of these pharmaceuticals could still be detected, implying an incessant supply of these compounds into Napindan Lake. Chronic exposure to these compounds, even at low concentrations, can lead to environmental and health risks. Given this information, and since consistent occurrence of these compounds can be expected, the main challenge at present is how to control their sources. 
Primarily, there is a need to manage the disposal of pharmaceutical compounds; the main question is how. This study presents the challenges and institutional roles involved in managing pharmaceutical disposal in a developing country like the Philippines.

Keywords: atenolol, carbamazepine, cetirizine, ibuprofen, institutional roles, Napindan Lake, pharmaceutical compound disposal management, surface water, urban lake

Procedia PDF Downloads 160
125 Analyzing the Visual Capability of the Siberian Husky Breed of the Common Dog (Canis lupus familiaris) to Detect Terminally-Ill Patients Undergoing Palliative Care

Authors: Maximo Cozzetti

Abstract:

The aim is to evaluate the capability of the 'Siberian Husky' (FCI Standard Nº 270) breed of the common dog (Canis lupus familiaris) to detect terminally ill human patients undergoing palliative care. A total of 49 such patients who fulfill the 'National Scientific and Technical Research Council Ethical Principles for the Behavior of the Scientific and Technical Investigator' policy (mainly affected with stage IV Hodgkin lymphoma or stage IV carcinoma, though various other terminal diseases were present) and 49 controls were enrolled. A total of 13 Siberian Huskies (Canis lupus familiaris, FCI Standard Nº 270) were selected. After a conditioning training regime in which the canines were rewarded for identifying terminally ill patients and excluding the control subjects, a double-blind experiment was conducted in which the canines were presented with a previously unknown patient through an olfactory-proof plexiglass window for 2-minute intervals. The dogs correctly identified 89.80% of the humans as either 'ill' or 'healthy'. It is important to note that both groups of humans were selected considering, and controlling for, confounding and self-identifying factors such as age, ethnicity, clothing, posture, skin color, and alopecia (chemotherapy-induced or otherwise). The olfactory-proofing of the test area rules out the use of the sense of smell to detect distinctive drug or bodily odors that may be associated with terminal diseases. Thus, the Siberian Husky breed of the common dog shows the visual capability to detect and identify terminally ill patients undergoing palliative care regardless of age, posture, and quantity of hair. Though the capability of the breed to detect terminally ill patients was observed consistently during the experiments, the exact process by which the canines identify the test subjects remains unknown, and further research is encouraged.
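The reported 89.80% correct classification over 49 patients and 49 controls can be compared against chance with a simple binomial tail computation. This sketch assumes 98 independent single presentations and a 50% chance level, neither of which is stated explicitly in the abstract:

```python
from math import comb

def binom_tail(n, k, p=0.5):
    """P(X >= k) for X ~ Binomial(n, p): chance of k or more correct calls."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

# 89.80% of 98 presentations is ~88 correct calls; chance level assumed 50%
p_value = binom_tail(98, 88)
print(p_value < 1e-10)  # True: far beyond what guessing would produce
```

If each dog saw each subject, the trials would not be independent, and a mixed-effects analysis would be more appropriate than this naive tail bound.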

Keywords: Canis lupus familiaris, Siberian Husky, visual identification of terminal illness, FCI Standard Nº 270

Procedia PDF Downloads 153
124 Analysis and Design Modeling for Next Generation Network Intrusion Detection and Prevention System

Authors: Nareshkumar Harale, B. B. Meshram

Abstract:

The continued exponential growth of successful cyber intrusions against today's businesses has made it abundantly clear that traditional perimeter security measures are no longer adequate or effective. Network trust architecture has evolved from trust/untrust to zero trust, in which essential security capabilities are deployed in a way that provides policy enforcement and protection for all users, devices, applications, data resources, and the communications traffic between them, regardless of location. Information exchange over the Internet, in spite of the inclusion of advanced security controls, remains prone to innovative and inventive cyberattacks. The TCP/IP protocol stack, the adopted standard for network communication, suffers from inherent design vulnerabilities: its communication and session management protocols, routing protocols, and security protocols are the cause of major attacks. With the explosion of cybersecurity threats such as viruses, worms, rootkits, malware, and denial-of-service attacks, accomplishing efficient and effective intrusion detection and prevention has become crucial and challenging. In this paper, we propose a design and analysis model for a next-generation network intrusion detection and protection system as part of a layered security strategy. The proposed system design provides intrusion detection for a wide range of attacks with a layered architecture and framework. The proposed network intrusion classification framework deals with cyberattacks on standard TCP/IP, routing, and security protocols. It thereby forms the basis for the detection of attack classes, applying signature-based matching for known cyberattacks and data-mining-based machine learning approaches for unknown cyberattacks. Our implemented software can effectively detect attacks even when malicious connections are hidden within normal events. 
Applying an unsupervised learning algorithm to network audit data trails enables the detection of unknown intrusions. Association rule mining algorithms generate new rules from the collected audit trail data, resulting in increased intrusion prevention through integrated firewall systems. Intrusion response mechanisms can be initiated in real time, thereby minimizing the impact of network intrusions. Finally, we show how our approach can be validated and how the analysis results can be used for detection of and protection from new network anomalies.
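The association-rule-mining step, generating new rules from collected audit trail data, can be sketched with a level-wise (Apriori-style) frequent-itemset search. The toy session attributes below are illustrative, not the authors' feature set:

```python
from itertools import combinations

def frequent_itemsets(transactions, min_support):
    """Level-wise Apriori: find itemsets occurring in >= min_support transactions."""
    items = sorted({i for t in transactions for i in t})
    result, k = {}, 1
    candidates = [frozenset([i]) for i in items]
    while candidates:
        counts = {c: sum(c <= t for t in transactions) for c in candidates}
        frequent = {c: n for c, n in counts.items() if n >= min_support}
        result.update(frequent)
        k += 1
        # candidate generation: unions of frequent sets one size larger
        prev = list(frequent)
        candidates = list({a | b for a, b in combinations(prev, 2) if len(a | b) == k})
    return result

# Toy audit-trail events: connection attributes observed per session
sessions = [frozenset(t) for t in (
    {"syn", "port22", "long"}, {"syn", "port22"}, {"syn", "port80"}, {"port22", "long"},
)]
freq = frequent_itemsets(sessions, min_support=2)
print(frozenset({"syn", "port22"}) in freq)  # True: co-occurs in 2 sessions
```

Frequent itemsets found this way would then be turned into rules (antecedent implies consequent, filtered by confidence) that an integrated firewall could enforce.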

Keywords: network intrusion detection, network intrusion prevention, association rule mining, system analysis and design

Procedia PDF Downloads 224
123 Confidence Building Strategies Adopted in an EAP Speaking Course at METU and Their Effectiveness: A Case Study

Authors: Canan Duzan

Abstract:

For most language learners, mastery of the speaking skill is the proof of mastery of the foreign language. At the same time, the speaking skill is considered the most difficult aspect of language learning to develop, for both learners and teachers. Especially in countries like Turkey, where exposure to the target language is minimal and resources and opportunities for language practice are scarce, teaching and learning to speak the language become a real struggle for teachers and learners alike. Data collected from students, instructors, faculty members, and the business sector in needs analysis studies conducted previously at Middle East Technical University (METU) consistently revealed the need to address the problem of lack of confidence in speaking English. Action was taken during the design of the only EAP speaking course offered by the Modern Languages Department, since lack of confidence is considered a serious barrier to effective communication and causes learners to suffer from insecurity, uncertainty, and fear. 'Confidence building' served as the guiding principle in the syllabus design, the nature of the tasks created for the course, and the assessment procedures, to help learners become more confident speakers of English. In order to see the effectiveness of the decisions made during the design phase of the course, and whether students become more confident speakers upon completing it, a case study was carried out with 100 students at METU. A questionnaire including both Likert-scale and open-ended items was administered to students, and the data were analyzed using the SPSS program. Group interviews were also carried out to gain more insight into the effectiveness of the course in terms of building speaking confidence. 
This presentation explores the specific actions taken to develop students' confidence, based on the findings of program evaluation studies, and the extent to which students believe these actions improved their confidence. The design of this course and the strategies adopted for confidence building are highly applicable in other EAP contexts and may yield similar positive results.

Keywords: confidence, EAP, speaking, strategy

Procedia PDF Downloads 396
122 Enzymatic Hydrolysis of Sugar Cane Bagasse Using Recombinant Hemicellulases

Authors: Lorena C. Cintra, Izadora M. De Oliveira, Amanda G. Fernandes, Francieli Colussi, Rosália S. A. Jesuíno, Fabrícia P. Faria, Cirano J. Ulhoa

Abstract:

Xylan is the main component of hemicellulose, and its complete degradation requires the cooperative action of a system of several enzymes, including endo-xylanases (XYN), β-xylosidases (XYL), and α-L-arabinofuranosidases (ABF). The recombinant hemicellulolytic enzymes used in the hydrolysis tests were an endoxylanase (HXYN2), a β-xylosidase (HXYLA), and an α-L-arabinofuranosidase (ABF3). These three enzymes, produced by filamentous fungi, had previously been heterologously expressed in Pichia pastoris. The aim of this work was to evaluate the effect of these recombinant enzymes on the enzymatic hydrolysis of sugarcane bagasse (SCB). The interaction between the three recombinant enzymes during hydrolysis of SCB pre-treated by steam explosion was studied with different concentrations of HXYN2, HXYLA, and ABF3, in ratios defined by a 2³ central composite rotational design (CCRD) including six axial points and six central points, totaling 20 assays. The influence of the factors was assessed by analyzing the main effects and interactions between factors, calculated using Statistica 8.0 (StatSoft Inc., Tulsa, OK, USA); the Pareto chart constructed with this software showed the Student's t values for each recombinant enzyme. The response variable was the concentration of reducing sugars quantified by DNS (mg/mL). The Pareto chart showed that ABF3 exerted the most significant effect during SCB hydrolysis, both at higher concentrations and at the lowest concentration of this enzyme. Analysis of variance (ANOVA, Fisher's method) with the release of reducing sugars (mg/mL) as the response variable confirmed that the ABF3 concentration was significant during SCB hydrolysis, in agreement with the Student's t analysis presented in the Pareto chart. 
The degradation of the xylan backbone by HXYN2 and HXYLA was strongly influenced by ABF3 action. A model describing the performance of the interaction of the three enzymes in releasing reducing sugars was obtained and can be used to further explain the statistical results. The formulation releasing the highest levels of reducing sugars had the following concentrations: HXYN2 at 600 U/g of substrate, HXYLA at 11.5 U/g, and ABF3 at 0.32 U/g. In conclusion, the recombinant enzyme with the most significant effect during SCB hydrolysis was ABF3. It is noteworthy that the xylan present in SCB is arabinoglucuronoxylan; because of this, debranching enzymes are important to allow access of the enzymes that act on the backbone.
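The 2³ CCRD with six axial and six central points, totaling 20 assays, can be reproduced in coded units as follows. The rotatability value α = (2³)^(1/4) ≈ 1.682 is the standard choice and is assumed here; decoding the coded levels to actual U/g concentrations of HXYN2, HXYLA, and ABF3 depends on each factor's chosen range, which the abstract does not give:

```python
from itertools import product

def ccrd(n_factors=3, n_center=6, alpha=1.682):
    """Central composite rotational design: factorial, axial, and center points (coded units)."""
    factorial = [list(p) for p in product((-1.0, 1.0), repeat=n_factors)]  # 2^k corners
    axial = []
    for i in range(n_factors):
        for sign in (-alpha, alpha):                 # star points on each axis
            row = [0.0] * n_factors
            row[i] = sign
            axial.append(row)
    center = [[0.0] * n_factors for _ in range(n_center)]  # replicated center point
    return factorial + axial + center

design = ccrd()
print(len(design))  # 8 factorial + 6 axial + 6 center = 20 assays
```

Each row maps to one assay; fitting a second-order response surface to the measured reducing sugars over this design is what yields the main effects and interactions shown in the Pareto chart.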

Keywords: experimental design, hydrolysis, recombinant enzymes, sugar cane bagasse

Procedia PDF Downloads 226
121 Prediction of Live Birth in a Matched Cohort of Elective Single Embryo Transfers

Authors: Mohsen Bahrami, Banafsheh Nikmehr, Yueqiang Song, Anuradha Koduru, Ayse K. Vuruskan, Hongkun Lu, Tamer M. Yalcinkaya

Abstract:

In recent years, we have witnessed an explosion of studies aimed at using a combination of artificial intelligence (AI) and time-lapse imaging data on embryos to improve IVF outcomes. However, despite promising results, no study has used a matched cohort of transferred embryos that differ only in pregnancy outcome, i.e., embryos from a single clinic that are similar in parameters such as morphokinetic condition, patient age, and overall clinic and lab performance. Here, we used time-lapse data on embryos with known pregnancy outcomes to see whether the rich spatiotemporal information embedded in these data would allow prediction of the pregnancy outcome regardless of such critical parameters. Methodology: We performed a retrospective analysis of time-lapse data from our IVF clinic, which uses the EmbryoScope exclusively for embryo culture to the blastocyst stage, with known clinical outcomes of live birth vs. nonpregnant (embryos with spontaneous abortion outcomes were excluded). We used time-lapse data from 200 elective single-transfer embryos randomly selected from January 2019 to June 2021. Our sample included 100 embryos in each group, with no significant difference in patient age (P=0.9550) or morphokinetic scores (P=0.4032). Data from all patients were combined into a 4th-order tensor, and feature extraction was subsequently carried out by a tensor decomposition methodology. The features were then used in a machine learning classifier to classify the two groups. Major findings: The performance of the model was evaluated using 100 random subsampling cross-validations (train 80%, test 20%). The prediction accuracy, averaged across the 100 permutations, exceeded 80%. We also performed a random grouping analysis, in which labels (live birth, nonpregnant) were randomly assigned to embryos, which yielded 50% accuracy. 
Conclusion: The high accuracy in the main analysis and the chance-level accuracy in the random grouping analysis suggest a consistent spatiotemporal pattern associated with pregnancy outcome, regardless of patient age and embryo morphokinetic condition, and beyond already-known parameters such as early cleavage or early blastulation. Despite the small sample size, this ongoing analysis is the first to show the potential of AI methods in capturing the complex morphokinetic changes embedded in embryo time-lapse data that contribute to successful pregnancy outcomes. Results on a larger sample, with complementary analyses predicting other key outcomes such as embryo euploidy and aneuploidy, will be presented at the meeting.
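The pipeline of stacking all embryos into a 4th-order tensor, extracting features by decomposition, and evaluating with 100 random subsampling splits can be sketched as below. The mode-1 unfolding plus truncated SVD stands in for the unspecified tensor decomposition, and the synthetic data, dimensions, and nearest-centroid classifier are illustrative assumptions, not the authors' method:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in data: (embryos, frames, height, width) 4th-order tensor
X = rng.normal(size=(200, 30, 8, 8))
y = np.array([1] * 100 + [0] * 100)  # live birth vs. nonpregnant labels

# Mode-1 unfolding + truncated SVD as a simple proxy for tensor decomposition
flat = X.reshape(X.shape[0], -1)
flat -= flat.mean(axis=0)
U, s, Vt = np.linalg.svd(flat, full_matrices=False)
features = flat @ Vt[:10].T          # 10 leading components per embryo

def nearest_centroid_cv(F, y, n_splits=100, train_frac=0.8, rng=rng):
    """Random-subsampling cross-validation with a nearest-centroid classifier."""
    accs, n = [], len(y)
    for _ in range(n_splits):
        idx = rng.permutation(n)
        tr, te = idx[: int(train_frac * n)], idx[int(train_frac * n):]
        c1 = F[tr][y[tr] == 1].mean(axis=0)
        c0 = F[tr][y[tr] == 0].mean(axis=0)
        pred = (np.linalg.norm(F[te] - c1, axis=1) <
                np.linalg.norm(F[te] - c0, axis=1)).astype(int)
        accs.append((pred == y[te]).mean())
    return float(np.mean(accs))

print(0.0 <= nearest_centroid_cv(features, y) <= 1.0)  # True
```

On this random stand-in data the accuracy hovers near chance; the paper's point is that real embryo tensors yield features that push it above 80%, while label shuffling collapses it back to 50%.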

Keywords: IVF, embryo, machine learning, time-lapse imaging data

Procedia PDF Downloads 91
120 Understanding the Dynamics of Human-Snake Negative Interactions: A Study of Indigenous Perceptions in Tamil Nadu, Southern India

Authors: Ramesh Chinnasamy, Srishti Semalty, Vishnu S. Nair, Thirumurugan Vedagiri, Mahesh Ganeshan, Gautam Talukdar, Karthy Sivapushanam, Abhijit Das

Abstract:

Snakes form an integral component of ecological systems. The human population explosion and the associated acceleration of habitat destruction and degradation have led to a rapid increase in human-snake encounters. This study aims at understanding the level of awareness, knowledge, and attitude of people towards negative human-snake interactions, and the role of awareness programmes, in the Moyar river valley, Tamil Nadu. The study area is part of the Mudumalai and Sathyamangalam Tiger Reserves, significant wildlife corridors between the Western Ghats and the Eastern Ghats in the Nilgiri Biosphere Reserve. Data were collected using a questionnaire covering 644 respondents spread across 18 villages between 2018 and 2019. The study revealed that 86.5% of respondents had strongly negative perceptions of snakes, propelled by fear, superstition, and the threat of snakebite; these perceptions were common across villages (F=4.48; p < 0.05) and age groups (X² = 1.946; p = 0.962). The cobra (27.8%, n = 294) and the rat snake (21.3%, n = 225) were the most sighted species, and most snake encounters occurred during the monsoon season: July 35.6% (n = 218), June 19.1% (n = 117), and August 18.4% (n = 113). At least 1 in 5 respondents reported having been bitten by a snake during their lifetime. The species most commonly responsible for snakebite were the saw-scaled viper (32.6%, n = 42), followed by the cobra (17.1%, n = 22). About 21.3% (n = 137) of people reported livestock loss due to pythons and other snakes. Most people preferred medical treatment for snakebite (87.3%), whereas 12.7% still believed in traditional methods. The majority (82.3%) used precautionary measures, keeping traditional items such as garlic, kerosene, and snake plant to ward off snakes. About 30% of respondents expressed the need for technical and monetary support from the forest department to help reduce human-snake conflict. 
It is concluded that the general perception in the study area is driven by fear and negative attitudes towards snakes. Though snakes such as the cobra are widely worshipped in the region, widespread myths and misconceptions still lead to the irrational killing of snakes. Awareness and innovative education programs rooted in the local context and language should be integrated at the village level to minimize the risk and associated threat of snakebite among the people. Results from this study should help policymakers devise appropriate conservation measures to reduce human-snake conflict in India.

Keywords: envenomation, health education, human-wildlife conflict, neglected tropical disease, snakebite mitigation, traditional practitioners

Procedia PDF Downloads 219
119 Milling Simulations with a 3-DOF Flexible Planar Robot

Authors: Hoai Nam Huynh, Edouard Rivière-Lorphèvre, Olivier Verlinden

Abstract:

Manufacturing technologies have become continuously more diversified over the years. The increasing use of robots for various applications such as assembling, painting, and welding has also affected the field of machining. Machining robots can deal with larger workspaces than conventional machine tools at a lower cost and thus represent a very promising alternative for machining applications. Furthermore, their inherent structure gives them great flexibility of motion to reach any location on the workpiece with the desired orientation. Nevertheless, machining robots suffer from a lack of stiffness at their joints, restricting their use to applications involving low cutting forces, especially finishing operations. Vibratory instabilities may also occur while machining and deteriorate precision, leading to scrap parts. Some researchers are therefore concerned with the identification of optimal parameters in robotic machining. This paper continues the development of a virtual robotic machining simulator for finding optimized cutting parameters, such as depth of cut or feed per tooth. The simulation environment combines an in-house milling routine (DyStaMill), which computes the cutting forces and material removal, with an in-house multibody library (EasyDyn), which is used to build a dynamic model of a 3-DOF planar robot with flexible links. The position of the robot end-effector subjected to milling forces is controlled through an inverse kinematics scheme, with the position of each joint controlled separately. Each joint is actuated by a servomotor whose transfer function has been computed in order to tune the corresponding controller. The output results feature the evolution of the cutting forces, with and without structural deformability, and the tracking errors of the end-effector. Illustrations of the resulting machined surfaces are also presented. 
Accounting for link flexibility revealed an increase in the magnitude of the cutting forces. This proof of concept aims to enrich the database of results in robotic machining for potential improvements in production.
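The inverse kinematics scheme for a 3-DOF planar robot has a closed-form solution, since the three joint angles exactly determine the planar pose (x, y, φ). A minimal rigid-link sketch (link lengths and target pose are illustrative; the actual simulator couples such a scheme with flexible links and joint controllers):

```python
from math import atan2, acos, cos, sin, hypot

def planar_3dof_ik(x, y, phi, L1, L2, L3, elbow=1):
    """Closed-form IK for a planar 3R arm reaching pose (x, y, phi)."""
    # Wrist position: step back from the tool tip along its orientation
    wx, wy = x - L3 * cos(phi), y - L3 * sin(phi)
    d = hypot(wx, wy)
    # Two-link subproblem for joints 1 and 2 (cosine law), elbow = +/-1 branch
    c2 = (d * d - L1 * L1 - L2 * L2) / (2 * L1 * L2)
    t2 = elbow * acos(max(-1.0, min(1.0, c2)))
    t1 = atan2(wy, wx) - atan2(L2 * sin(t2), L1 + L2 * cos(t2))
    t3 = phi - t1 - t2              # joint 3 sets the remaining orientation
    return t1, t2, t3

def fk(t1, t2, t3, L1, L2, L3):
    """Forward kinematics, used here to verify the IK solution."""
    x = L1 * cos(t1) + L2 * cos(t1 + t2) + L3 * cos(t1 + t2 + t3)
    y = L1 * sin(t1) + L2 * sin(t1 + t2) + L3 * sin(t1 + t2 + t3)
    return x, y, t1 + t2 + t3

angles = planar_3dof_ik(1.2, 0.5, 0.3, 1.0, 0.8, 0.3)
print(fk(*angles, 1.0, 0.8, 0.3))  # recovers the commanded pose (1.2, 0.5, 0.3)
```

With flexible links, the joint angles returned by this rigid-body IK no longer place the tool exactly, which is precisely the tracking error the simulator quantifies.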

Keywords: control, milling, multibody, robotic, simulation

Procedia PDF Downloads 241
118 Enhancing Robustness in Federated Learning through Decentralized Oracle Consensus and Adaptive Evaluation

Authors: Peiming Li

Abstract:

This paper presents an innovative blockchain-based approach to enhance the reliability and efficiency of federated learning systems. By integrating a decentralized oracle consensus mechanism into the federated learning framework, we address key challenges of data and model integrity. Our approach utilizes a network of redundant oracles, functioning as independent validators within an epoch-based training system in the federated learning model. In federated learning, data is decentralized, residing on the devices of the various participants. This scenario often leads to concerns about data integrity and model quality. Our solution employs blockchain technology to establish a transparent and tamper-proof environment, ensuring secure data sharing and aggregation. The decentralized oracles, a concept borrowed from blockchain systems, act as unbiased validators. They assess the contributions of each participant using a hidden Markov model (HMM), which is crucial for evaluating the consistency of participant inputs and safeguarding against model poisoning and malicious activities. A distinct feature of our methodology is its epoch-based training: an epoch here refers to a specific training phase where data is updated and assessed for quality and relevance. The redundant oracles work in concert to validate data updates during these epochs, enhancing the system's resilience to security threats and data corruption. The effectiveness of this system was tested using the MNIST dataset, a standard machine learning benchmark. Results demonstrate that our blockchain-oriented federated learning approach significantly boosts system resilience, addressing the common challenges of federated environments. This paper aims to make these advanced concepts accessible, even to those with a limited background in blockchain or federated learning. 
We provide a foundational understanding of how blockchain technology can revolutionize data integrity in decentralized systems and explain the role of oracles in maintaining model accuracy and reliability.
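The validator role of the redundant oracles, rejecting inconsistent client contributions before aggregation, can be approximated by a much simpler consistency screen than the HMM the authors use. The sketch below drops client updates far from the coordinate-wise median and averages the survivors (the values and the median-distance rule are illustrative, not the paper's mechanism):

```python
def median(xs):
    """Median of a list of numbers."""
    s = sorted(xs)
    n = len(s)
    return s[n // 2] if n % 2 else 0.5 * (s[n // 2 - 1] + s[n // 2])

def screened_average(updates, tol):
    """Oracle-style screening: drop client updates far from the coordinate-wise
    median, then average the survivors (a crude proxy for consensus validation)."""
    dim = len(updates[0])
    med = [median([u[i] for u in updates]) for i in range(dim)]
    def dist(u):
        return sum((a - b) ** 2 for a, b in zip(u, med)) ** 0.5
    kept = [u for u in updates if dist(u) <= tol]
    return [sum(u[i] for u in kept) / len(kept) for i in range(dim)]

honest = [[0.9, 1.1], [1.0, 1.0], [1.1, 0.9]]   # consistent client gradients
poisoned = [[10.0, -10.0]]                      # a malicious client's update
print(screened_average(honest + poisoned, tol=1.0))  # ~[1.0, 1.0]
```

An HMM-based validator generalizes this idea over time: instead of a one-shot distance test, each client's sequence of per-epoch contributions is scored against a learned model of consistent behavior.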

Keywords: federated learning system, blockchain, decentralized oracles, hidden Markov model

Procedia PDF Downloads 56
117 Computational Team Dynamics in Student New Product Development Teams

Authors: Shankaran Sitarama

Abstract:

Teamwork is an extremely effective pedagogical tool in engineering education. New product development (NPD) has been an effective strategy for companies to streamline and bring innovative products and solutions to customers. Thus, the engineering curricula of many schools, some collaboratively with business schools, have brought NPD into the curriculum at the graduate level. Teamwork is invariably used during instruction, where students work in teams to come up with new products and solutions. A significant portion of the grade rests on the semester-long teamwork so that it is taken seriously by students. As the students work in teams and go through this process to develop new product prototypes, their effectiveness and learning depend to a great extent on how they function as a team, go through the creative process, come together, and work towards the common goal. A core attribute of a successful NPD team is its creativity and innovation. The team needs to be creative as a group, generating a breadth of ideas and innovative solutions that address the problem they are targeting and meet the users' needs. The team also needs to be very efficient as it works through the various stages of developing these ideas, resulting in a proof-of-concept (POC) implementation or a prototype of the product. The simultaneous requirement that teams be creative and at the same time converge and work together imposes different types of tension on their interactions. These ideational tensions and conflicts, and sometimes relational tensions and conflicts, are inevitable. Effective teams have to manage these team dynamics, remaining resilient and yet creative. 
This research paper provides a computational analysis of the teams' communication, which is reflective of the team dynamics, and, through a superimposition of latent semantic analysis onto social network analysis, provides a computational methodology for arriving at visual interaction patterns. These team interaction patterns correlate clearly with the team dynamics and provide insights into the functioning, and thus the effectiveness, of the teams. 23 student NPD teams, drawn from 2 years of a course on managing NPD with a blend of engineering and business school students, are considered, and the results are presented. The results are also correlated with the teams' detailed and tailored individual and group feedback and with a self-reflection and evaluation questionnaire.
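The superimposition of latent semantic analysis with social network analysis can be sketched as: embed each member's messages in a low-rank semantic space, then build a similarity-thresholded interaction graph and read off centralities. The term counts, similarity threshold, and two-dimensional rank below are illustrative assumptions, not the paper's corpus or parameters:

```python
import numpy as np

# Hypothetical message-term counts: rows = team members, columns = vocabulary terms
counts = np.array([
    [4, 2, 0, 1],   # member A
    [3, 3, 0, 0],   # member B
    [0, 1, 5, 2],   # member C
    [1, 0, 4, 3],   # member D
], dtype=float)

# LSA step: project each member's messages into a low-rank latent semantic space
U, s, Vt = np.linalg.svd(counts, full_matrices=False)
latent = U[:, :2] * s[:2]           # 2-D semantic coordinates per member

# SNA step: cosine similarity -> thresholded adjacency -> degree centrality
unit = latent / np.linalg.norm(latent, axis=1, keepdims=True)
sim = unit @ unit.T
adj = (sim > 0.8) & ~np.eye(len(sim), dtype=bool)
degree = adj.sum(axis=1)
print(degree.tolist())  # members discussing similar topics cluster together
```

On real transcripts, the resulting graph's patterns (dense cliques, bridges, isolates) are what get read as ideational convergence or fragmentation in the team.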

Keywords: team dynamics, social network analysis, team interaction patterns, new product development teamwork, NPD teams

Procedia PDF Downloads 109
116 Towards a Strategic Framework for State-Level Epistemological Functions

Authors: Mark Darius Juszczak

Abstract:

While epistemology, as a sub-field of philosophy, is generally concerned with theoretical questions about the nature of knowledge, the explosion in digital media technologies has resulted in an exponential increase in the storage and transmission of human information. That increase has produced a particular non-linear dynamic: digital epistemological functions are radically altering how and what we know. Neither the rate of that change nor its consequences have been well studied or taken into account in developing state-level strategies for epistemological functions. At the current time, US federal policy, like that of virtually all other countries, maintains clearly defined boundaries at the national level between various epistemological agencies, agencies that, in one way or another, mediate the functional use of knowledge. These agencies take the form of patent and trademark offices, national library and archive systems, departments of education, regulators such as the FTC, university systems and their regulations, military research agencies such as DARPA, federal scientific research agencies, medical and pharmaceutical accreditation agencies, federal funding for scientific research, and the legislative committees and subcommittees that attempt to alter the laws governing epistemological functions. All of these agencies are constantly creating, analyzing, and regulating knowledge. Those processes are, at the most general level, epistemological functions: they act upon and define what knowledge is. At the same time, however, there are no high-level strategic epistemological directives or frameworks that define those functions. The only time in US history when a proxy state-level epistemological strategy existed was between 1961 and 1969, when the Kennedy administration committed the United States to the Apollo program. 
While that program had a singular technical objective as its outcome, the objective was so technologically advanced for its day, and so complex, that it required a massive redirection of state-level epistemological functions; in essence, a broad and diverse set of state-level agencies suddenly found themselves working together towards a common epistemological goal. This paper does not call for a repeat of the Apollo program. Rather, its purpose is to investigate the minimum structural requirements for a national state-level epistemological strategy in the United States. In addition, this paper seeks to analyze how the epistemological work of the multitude of national agencies within the United States would be affected by such a high-level framework. This paper is an exploratory study of this type of framework. The author's primary hypothesis is that such a function is possible but would require extensive reframing and reclassification of traditional epistemological functions at the respective agency level. In much the same way that, for example, the Department of Homeland Security (DHS) evolved to respond to a new type of security threat to the United States, it is theorized that a lack of coordination and alignment in epistemological functions will equally result in a strategic threat to the United States.

Keywords: strategic security, epistemological functions, epistemological agencies, Apollo program

Procedia PDF Downloads 70
115 Improving Usability of e-Government for the Elderly

Authors: Tamas Molnar

Abstract:

Electronic government systems are currently at the same stage of development that e-commerce applications had reached in the late 1990s. Wide adoption by the majority of the population is near, as such services are not only increasingly desired by users but also strongly advocated and pushed by the state as a means to increase effectiveness and cut expenses at the same time. Diffusion is hampered, however, by low motivation caused by usability issues, which will generate more and more frustration as the general population ages. Usability-centred design is essential when creating such services. Elderly users, who statistically have the least experience, encounter the most problems and are therefore the first to reject unusable systems. The goal of our research was to find a way to map the needs of the elderly and to create guidelines for the design of electronic government systems that are usable by the whole population. The first phase of our research, started in mid-2009, centred on gathering information about the needs of the target group in both Germany and Hungary, with over 70 participants. This was done with the help of scenarios, interviews, and questionnaires. The resulting data enabled us to choose an e-government system for tests on the target group. Tests conducted in Germany and Hungary were based on the design and functions of the German electronic ID card, in the participants' native languages. Scenarios mirroring common, everyday transactions requiring an identification procedure were used. The obtained results allowed us to develop a generalised solution, the IGUAN guideline. This guideline makes a standardised approach to the usability-improvement process possible. It contains the special requirements of elderly users and a catalogue of criteria that helps to develop an application in line with those requirements. The third phase of our research served as a proof of concept for IGUAN: the guideline was evaluated and tested through iterative prototyping.
The successful completion of this phase indicates that IGUAN can be used to measurably increase the acceptance of e-government systems by elderly users. We could therefore demonstrate that improvements in the interface make e-government applications possible that elderly users perceive as useful and easy to use. These improvements measurably increase user motivation and improve the user experience. This can, however, only be achieved with a structured design process, and it requires a framework that takes the requirements of elderly users into account.

Keywords: e-Government, usability, acceptance, guidelines

Procedia PDF Downloads 539
114 Hardy Type Inequalities of Two-Dimensional on Time Scales via Steklov Operator

Authors: Wedad Albalawi

Abstract:

Mathematical inequalities have been at the core of mathematical study and are used in almost all branches of mathematics, as well as in various areas of science and engineering. The classical work of Hardy, Littlewood, and Pólya was the first significant systematic treatment of the subject; it presented the fundamental ideas, results, and techniques, and it has had great influence on research in various branches of analysis. Since 1934, numerous inequalities have been produced and studied in the literature. Furthermore, some inequalities have been formulated in terms of operators: in 1989, weighted Hardy inequalities were obtained for integration operators, and weighted estimates were subsequently obtained for Steklov operators, which were used in the solution of the Cauchy problem for the wave equation. These results were improved upon in 2011 to include the boundedness of integral operators from a weighted Sobolev space to a weighted Lebesgue space. Some inequalities have been demonstrated and improved using the Hardy-Steklov operator. Recently, many integral inequalities have been improved by means of differential operators. The Hardy inequality has been one of the tools used to study integrability of solutions of differential equations. Dynamic inequalities of Hardy and Copson type have subsequently been extended and improved by various integral operators. These inequalities are interesting to apply in different fields of mathematics (function spaces, partial differential equations, mathematical modeling). Some results have appeared that combine Copson and Hardy inequalities on time scales to obtain new special versions of them. A time scale is defined as a closed subset of the real numbers. Inequalities in the time-scale setting have received a great deal of attention and now constitute a major field in both pure and applied mathematics.
There are many applications of dynamic equations on time scales in quantum mechanics, electrical engineering, neural networks, heat transfer, combinatorics, and population dynamics. This study focuses on double integrals in order to obtain new time-scale inequalities of Copson type driven by the Steklov operator; these can be applied in the solution of the Cauchy problem for the wave equation. The proofs are carried out by introducing restrictions on the operator in several cases. In addition, the inequalities are obtained using concepts from the time-scale setting such as time-scale calculus, Fubini's theorem, and Hölder's inequality.
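For context, the classical integral inequality of Hardy, from which the dynamic (time-scale) versions discussed above descend, can be stated as follows; this is the standard textbook formulation, not a result taken from the abstract:

```latex
% Classical Hardy inequality (1925): for p > 1 and measurable f \ge 0,
\int_0^{\infty} \left( \frac{1}{x} \int_0^{x} f(t)\,dt \right)^{p} dx
\;\le\; \left( \frac{p}{p-1} \right)^{p} \int_0^{\infty} f(x)^{p}\,dx,
% where the constant (p/(p-1))^p is sharp.
```

Time-scale analogues replace the Lebesgue integrals above with delta integrals over a time scale $\mathbb{T} \subseteq \mathbb{R}$, recovering the integral inequality when $\mathbb{T} = \mathbb{R}$ and a discrete (Copson-type) inequality when $\mathbb{T} = \mathbb{Z}$.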

Keywords: time scales, inequality of Hardy, inequality of Copson, Steklov operator

Procedia PDF Downloads 74
113 Applications and Development of a Plug Load Management System That Automatically Identifies the Type and Location of Connected Devices

Authors: Amy Lebar, Kim L. Trenbath, Bennett Doherty, William Livingood

Abstract:

Plug and process loads (PPLs) account for 47% of U.S. commercial building energy use. There is therefore a large potential to reduce whole-building consumption by targeting PPLs for energy-saving measures or by implementing some form of plug load management (PLM). Despite this potential, no PLM technology has yet been widely adopted commercially. This paper describes the Automatic Type and Location Identification System (ATLIS), a PLM system framework with automatic and dynamic load detection (ADLD). ADLD gives PLM systems the ability to automatically identify devices as they are plugged into the outlets of a building. The ATLIS framework takes advantage of smart, connected devices to identify device locations in a building, meter and control their power, and communicate this information to a central database. ATLIS includes five primary capabilities: location identification, communication, control, energy metering, and data storage. A laboratory proof of concept (PoC) demonstrated all but the data storage capability, and these capabilities were validated using an office building scenario. The PoC can identify when a device is plugged into an outlet and the location of the device in the building. When a device is moved, the PoC's dashboard and database are automatically updated with the new location. The PoC controls devices from the system dashboard so that devices maintain correct schedules regardless of where they are plugged in within a building. ATLIS's primary technology application is improved PLM, but other applications include asset management, energy audits, and interoperability for grid-interactive efficient buildings. A system like ATLIS could also be used to direct power to critical devices, such as ventilators, during a brownout or blackout. Such a framework is an opportunity to make PLM more widespread and to reduce the amount of energy consumed by PPLs in current and future commercial buildings.
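As a rough illustration of the ADLD behaviour described above (a device is registered when plugged in, its location is updated when it moves, and its control schedule follows the device rather than the outlet), a minimal in-memory registry might look like the following sketch. All class, method, and identifier names here are hypothetical stand-ins, not part of the actual ATLIS implementation:

```python
from dataclasses import dataclass


@dataclass
class Device:
    """One plug load known to the system."""
    device_id: str
    device_type: str   # e.g. "laptop", "monitor"
    outlet_id: str     # current outlet location in the building
    schedule: str      # control schedule that travels with the device


class DeviceRegistry:
    """Minimal stand-in for the central database an ATLIS-like system uses."""

    def __init__(self):
        self._devices = {}

    def plug_in(self, device_id, device_type, outlet_id, schedule="office-hours"):
        # Register a newly detected device at the outlet where it appeared.
        self._devices[device_id] = Device(device_id, device_type, outlet_id, schedule)

    def move(self, device_id, new_outlet_id):
        # Relocation only changes the outlet; the schedule stays with the device.
        self._devices[device_id].outlet_id = new_outlet_id

    def location_of(self, device_id):
        return self._devices[device_id].outlet_id

    def schedule_of(self, device_id):
        return self._devices[device_id].schedule
```

The key design point mirrored from the abstract is that control metadata is keyed by device identity, not by outlet, so unplugging a laptop in one office and plugging it in elsewhere leaves its schedule intact.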

Keywords: commercial buildings, grid-interactive efficient buildings (GEB), miscellaneous electric loads (MELs), plug loads, plug load management (PLM)

Procedia PDF Downloads 130
112 Exponential Stabilization of a Flexible Structure via a Delayed Boundary Control

Authors: N. Smaoui, B. Chentouf

Abstract:

The boundary stabilization problem of the rotating disk-beam system, in which a flexible beam is attached to the center of a disk, is a topic of ongoing research interest, and the control and stabilization of this system have been extensively studied. This research focuses on the case where the center of mass is fixed in an inertial frame and the rotation of the disk is non-uniform. The system is represented by a set of nonlinear coupled partial differential equations and ordinary differential equations, and we consider its boundary stabilization via a delayed boundary control. We assume that the boundary control is either a force-type control or a moment-type control and is subject to a constant time delay. The aim of this research is threefold: first, we demonstrate that the rotating disk-beam system is well-posed in an appropriate functional space; second, we establish the exponential stability property of the system; finally, we provide numerical simulations that illustrate the theoretical findings. Semigroup theory is used to establish the well-posedness of the system; the resolvent method, together with a variation of constants formula, is employed to prove the exponential stability property; and the finite element method is used to demonstrate the theoretical results through numerical simulations.
The findings indicate that the rotating disk-beam system can be stabilized by a boundary control despite the presence of a constant time delay, with the theoretical analysis supported by the numerical simulations. The results have potential implications for the design and implementation of control strategies in similar systems, and the research contributes to the understanding and practical application of control strategies for flexible structures, providing insights into the stability of rotating disk-beam systems.
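The exponential stability property established in the abstract can be written out explicitly. In a standard energy-space formulation (the notation below is assumed for illustration, not taken from the abstract), one seeks constants $M \ge 1$ and $\omega > 0$ such that:

```latex
% Exponential decay of the closed-loop energy E(t):
E(t) \;\le\; M \, e^{-\omega t} \, E(0), \qquad t \ge 0,
% where, for a force-type boundary feedback with constant delay \tau > 0,
% a typical control law combines an instantaneous and a delayed term:
%   u(t) = -\alpha \, y_t(L, t) - \beta \, y_t(L, t - \tau),
% with gains \alpha, \beta chosen so that the delayed term does not
% destabilize the system.
```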

Keywords: rotating disk-beam, delayed force control, delayed moment control, torque control, exponential stability

Procedia PDF Downloads 74