Search results for: multi sliding friction device
1936 Oral Versus Iontophoresis Nonsteroidal Anti-Inflammatory Drugs in Tennis Elbow
Authors: Moustafa Ali Elwan, Ibrahim Salem Abdelrafa, Ashraf Moharm
Abstract:
Nonsteroidal anti-inflammatory drugs (NSAIDs) are among the most commonly prescribed oral and topical drugs worldwide. They are also responsible for a large share of all adverse drug reactions. For several decades, there have been numerous attempts to use the cutaneous layers as a gate into the body for the local delivery of therapeutic agents. Transdermal drug delivery is a validated technology contributing significantly to global pharmaceutical care. Transdermal drug delivery systems can be improved by using chemical enhancers, or by physical enhancers such as ultrasound or iontophoresis. Iontophoresis provides a mechanism to enhance the penetration of hydrophilic and charged molecules across the skin. Objective: to compare drug administration by iontophoresis versus the oral route. Methods: This study was conducted at the Faculty of Physical Therapy, Modern University for Technology and Information, Cairo, Egypt, on 20 participants (8 female and 12 male) who complained of tennis elbow. Their mean age was 25.45 ± 3.98 years, and all participants were assessed in several aspects: pain threshold was assessed by algometer, range of motion by electrogoniometer, and isometric strength by a portable hand-held dynamometer. Participants were then randomly assigned to two groups: group A was treated with oral NSAID (diclofenac), while group B received the NSAID (diclofenac) via an iontophoresis device. All participants underwent blood sample analysis before drug administration and for 24 hours after administration (one sample every 6 hours). Results: There was significantly greater improvement in group B, the iontophoresis NSAIDs group, than in group A, the oral NSAIDs group, in all measurements: pain threshold, strength, and range of motion.
The iontophoresis method also showed a higher maximum plasma concentration (Cmax) and a larger area under the concentration-time curve than the oral method.
Keywords: diclofenac, iontophoresis, NSAIDs, oral, tennis elbow
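The pharmacokinetic comparison above rests on two quantities, Cmax and the area under the concentration-time curve. A minimal sketch of how they are computed from the 6-hourly samples described in the abstract, using invented concentration values (the study's actual readings are not given):

```python
# Hypothetical plasma concentrations (ng/mL) sampled every 6 hours for
# 24 hours post-administration; values are invented for illustration.
times = [0, 6, 12, 18, 24]               # hours
conc = [0.0, 180.0, 140.0, 90.0, 40.0]   # ng/mL

def cmax(concentrations):
    """Maximum observed plasma concentration."""
    return max(concentrations)

def auc_trapezoid(t, c):
    """Area under the concentration-time curve by the trapezoidal rule."""
    return sum((t[i + 1] - t[i]) * (c[i] + c[i + 1]) / 2.0
               for i in range(len(t) - 1))

print(cmax(conc))                  # 180.0
print(auc_trapezoid(times, conc))  # 2580.0 (ng*h/mL)
```

A higher Cmax and larger AUC for the iontophoresis arm, as reported, would indicate greater systemic exposure from the transdermal route.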
Procedia PDF Downloads 115
1935 Cerebral Pulsatility Mediates the Link Between Physical Activity and Executive Functions in Older Adults with Cardiovascular Risk Factors: A Longitudinal NIRS Study
Authors: Hanieh Mohammadi, Sarah Fraser, Anil Nigam, Frederic Lesage, Louis Bherer
Abstract:
A chronically elevated cerebral pulsatility is thought to damage the cerebral microcirculation, leading to cognitive decline in older adults. Although regular physical activity is widely known to improve some cognitive domains, including executive functions, the mediating role of cerebral pulsatility in this link remains to be elucidated. This study assessed the impact of 6 months of regular physical activity on changes in an optical index of cerebral pulsatility and the role of physical activity in the improvement of executive functions. 27 older adults (aged 57-79, 66.7% women) with cardiovascular risk factors (CVRF) were enrolled in the study. The participants completed the behavioral Stroop test, extracted from the Delis-Kaplan executive functions system battery, at baseline (T0) and after 6 months (T6) of physical activity. Near-infrared spectroscopy (NIRS) was applied as an innovative approach to indexing cerebral pulsatility in the brain microcirculation at T0 and T6. The participants were at standing rest while a NIRS device recorded hemodynamic data from frontal and motor cortex subregions at T0 and T6. The cerebral pulsatility index of interest was cerebral pulse amplitude, extracted from the pulsatile component of the NIRS data. Our data indicated that 6 months of physical activity was associated with a reduction in response time for the executive functions, including the inhibition (T0: 56.33 ± 18.2 to T6: 53.33 ± 15.7, p = 0.038) and switching (T0: 63.05 ± 5.68 to T6: 57.96 ± 7.19, p < 0.001) conditions of the Stroop test. Physical activity was also associated with a reduction in cerebral pulse amplitude (T0: 0.62 ± 0.05 to T6: 0.55 ± 0.08, p < 0.001). Notably, cerebral pulse amplitude was a significant mediator of the link between physical activity and response to the Stroop test for both the inhibition (β = 0.33 (0.61, 0.23), p < 0.05) and switching (β = 0.42 (0.69, 0.11), p < 0.01) conditions.
This study suggests that regular physical activity may support cognitive functions through the improvement of cerebral pulsatility in older adults with CVRF.
Keywords: near-infrared spectroscopy, cerebral pulsatility, physical activity, cardiovascular risk factors, executive functions
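The mediation claim (physical activity to pulse amplitude to Stroop performance) follows the standard product-of-paths logic: the indirect effect is the X-to-M coefficient times the M-to-Y coefficient controlling for X. A hedged sketch with simulated data; the variable names, effect sizes, and noise levels are illustrative, not the study's:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
x = rng.normal(size=n)                       # predictor, e.g. activity change
m = 0.5 * x + rng.normal(scale=0.1, size=n)  # mediator, e.g. pulse amplitude
y = 0.8 * m + 0.1 * x + rng.normal(scale=0.1, size=n)  # outcome, e.g. response time

def ols(X, target):
    """Least-squares coefficients with an intercept column prepended."""
    X1 = np.column_stack([np.ones(len(X)), X])
    beta, *_ = np.linalg.lstsq(X1, target, rcond=None)
    return beta

a = ols(x[:, None], m)[1]               # X -> M path
b = ols(np.column_stack([x, m]), y)[2]  # M -> Y path, controlling for X
indirect = a * b                        # mediated (indirect) effect
```

With the simulated coefficients, the recovered indirect effect should be close to the true product 0.5 x 0.8 = 0.4; in practice, its significance would be assessed with a Sobel test or bootstrap.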
Procedia PDF Downloads 195
1934 From Text to Data: Sentiment Analysis of Presidential Election Political Forums
Authors: Sergio V Davalos, Alison L. Watkins
Abstract:
User generated content (UGC) such as website posts has data associated with it: time of the post, gender, location, type of device, and number of words. The text entered in UGC can provide a valuable dimension for analysis. In this research, each user post is treated as a collection of terms (words). In addition to the number of words per post, the frequency of each term is determined by post and by the sum of occurrences in all posts. This research focuses on one specific aspect of UGC: sentiment. Sentiment analysis (SA) was applied to the content (user posts) of two sets of political forums related to the US presidential elections of 2012 and 2016. Sentiment analysis derives data from the text, which enables the subsequent application of data analytic methods. The SASA (SAIL/SAI Sentiment Analyzer) model was used for sentiment analysis. The application of SASA resulted in a sentiment score for each post. Based on the sentiment scores for the posts, there are significant differences between the content and sentiment of the two sets of forums. In the 2012 forums, 38% of the forums started with positive sentiment and 16% with negative sentiment. In the 2016 forums, 29% started with positive sentiment and 15% with negative sentiment. There were also changes in sentiment over time: for both elections, as the election drew closer, the cumulative sentiment score became negative. The candidate who won each election appeared in more posts than the losing candidate. In the case of Trump, the number of negative posts about him exceeded Clinton's highest post count, which was for positive posts. KNIME topic modeling was used to derive topics from the posts. There were also changes in topics and keyword emphasis over time: initially, the political parties were the most referenced, and as the election drew closer, the emphasis shifted to the candidates.
The SASA method predicted sentiment better than four other methods evaluated in SentiBench. The research resulted in deriving sentiment data from text. In combination with other data, the sentiment data provided insight and discovery about user sentiment in the US presidential elections of 2012 and 2016.
Keywords: sentiment analysis, text mining, user generated content, US presidential elections
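The SASA analyzer itself is not public, but the pipeline described (scoring each post, then tracking cumulative sentiment over time) can be sketched with a toy lexicon scorer; the lexicon and posts below are invented for illustration:

```python
# A tiny polarity lexicon; real analyzers such as SASA use far richer
# models, so this only illustrates the per-post scoring and the
# cumulative-score-over-time pattern described in the abstract.
LEXICON = {"great": 1, "win": 1, "hope": 1, "bad": -1, "lies": -1, "worst": -1}

def post_score(text):
    """Sum of term polarities over the words of a post."""
    return sum(LEXICON.get(w.strip(".,!?").lower(), 0) for w in text.split())

posts = [
    "Great debate, real hope for change!",
    "The worst campaign, nothing but lies.",
    "He will win, great news.",
]
scores = [post_score(p) for p in posts]

# Cumulative sentiment over time, as tracked toward election day.
cumulative, total = [], 0
for s in scores:
    total += s
    cumulative.append(total)
print(scores, cumulative)  # [2, -2, 2] [2, 0, 2]
```

In the study's setting, a cumulative series like this drifting negative as election day approached is exactly the reported pattern.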
Procedia PDF Downloads 192
1933 Wearable System for Prolonged Cooling and Dehumidifying of PPE in Hot Environments
Abstract:
While personal protective equipment (PPE) prevents healthcare personnel from exposure to harmful surroundings, it creates a barrier to the dissipation of body heat and perspiration, leading to severe heat stress during prolonged exposure, especially in hot environments. Most existing personal cooling strategies have limitations in achieving effective cooling performance with long duration and light weight. This work aimed to develop a lightweight (<1.0 kg) and less expensive wearable air cooling and dehumidifying system (WCDS) that can be applied underneath protective clothing and provide 50 W of mean cooling power for more than 5 hours at a 35°C environmental temperature without compromising the protection of the PPE. In the WCDS, blowers will be used to activate an internal air circulation inside the clothing microclimate, which does not interfere with the protection of the PPE. An air cooling and dehumidifying chamber (ACMR) with a specific design will be developed to reduce the air temperature and humidity inside the protective clothing. The cooled and dried air will then be supplied to the upper chest and back areas through a branching tubing system for personal cooling. A detachable ice cooling unit will be applied from the outside of the PPE to extract heat from the clothing microclimate. This combination allows convenient replacement of the cooling unit to refresh the cooling effect, realizing a continuous cooling function without taking off the PPE or adding too much weight. A preliminary thermal manikin test showed that the WCDS reduced the microclimate temperature inside the PPE by about 8°C on average for 60 minutes at environmental temperatures of 28.0°C and 33.5°C, respectively. Replacing the ice cooling unit every hour can maintain this cooling effect, while the longest operation duration is determined by the battery of the blowers, which can last for about 6 hours.
This unique design is especially helpful for PPE users such as health care workers in infectious and hot environments, where continuous cooling and dehumidifying are needed but changing protective clothing may increase the risk of infection. The new WCDS will not only improve the thermal comfort of PPE users but can also extend their safe working duration.
Keywords: personal thermal management, heat stress, PPE, health care workers, wearable device
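As a sanity check on the hourly ice-unit replacement, one can estimate how much ice 50 W of heat extraction consumes per hour. This back-of-envelope calculation assumes all heat goes into the latent heat of fusion and ignores warming of the meltwater; it is an illustration, not a figure from the paper:

```python
# How much ice does 50 W of continuous heat extraction melt per hour?
# Assumption: all heat goes into the latent heat of fusion (334 kJ/kg);
# meltwater warming and parasitic heat gains are ignored.
LATENT_HEAT_J_PER_KG = 334_000.0

def ice_mass_kg(power_w, duration_s, latent=LATENT_HEAT_J_PER_KG):
    """Ice mass needed to absorb power_w watts for duration_s seconds."""
    return power_w * duration_s / latent

per_hour = ice_mass_kg(50.0, 3600)
print(round(per_hour, 2))  # 0.54 kg per hourly swap
```

Roughly half a kilogram of ice per hourly swap is consistent with keeping the on-body system under the stated 1.0 kg while offloading the thermal mass to the external, replaceable unit.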
Procedia PDF Downloads 79
1932 Autonomous Vehicle Detection and Classification in High Resolution Satellite Imagery
Authors: Ali J. Ghandour, Houssam A. Krayem, Abedelkarim A. Jezzini
Abstract:
High-resolution satellite images and remote sensing can provide global information quickly compared to traditional methods of data collection. At such high resolution, a road is no longer a thin line; objects such as cars and trees are easily identifiable. Automatic vehicle enumeration can be considered one of the most important applications in traffic management. In this paper, an autonomous vehicle detection and classification approach for highway environments is proposed. This approach consists of three main stages: (i) first, a set of preprocessing operations is applied, including soil, vegetation, and water suppression; (ii) then, road network detection and delineation is implemented using a built-up area index, followed by several morphological operations. This step plays an important role in increasing the overall detection accuracy, since vehicle candidates are objects contained within the road networks only. (iii) Multi-level Otsu segmentation is implemented in the last stage, resulting in vehicle detection and classification, where detected vehicles are classified into cars and trucks. Accuracy assessment analysis is conducted over different study areas to show the efficiency of the proposed method, especially in highway environments.
Keywords: remote sensing, object identification, vehicle and road extraction, vehicle and road features-based classification
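Stage (iii), multi-level Otsu segmentation, extends Otsu's method to two thresholds so pixels fall into three classes. A brute-force sketch using exhaustive search over threshold pairs; production implementations use faster recursions, and the class semantics (e.g. road surface, cars, trucks) are only an assumed mapping:

```python
import numpy as np

def multi_otsu(pixels, levels=256):
    """Two-threshold Otsu by exhaustive search: pick (t1, t2) maximizing
    the between-class variance of the three classes [0, t1), [t1, t2),
    [t2, levels). O(levels^2), fine for a sketch."""
    hist = np.bincount(pixels.ravel(), minlength=levels).astype(float)
    p = hist / hist.sum()
    idx = np.arange(levels)
    best, best_t = -1.0, (0, 0)
    for t1 in range(1, levels - 1):
        for t2 in range(t1 + 1, levels):
            score = 0.0
            for lo, hi in ((0, t1), (t1, t2), (t2, levels)):
                w = p[lo:hi].sum()
                if w > 0:
                    mu = (idx[lo:hi] * p[lo:hi]).sum() / w
                    score += w * mu * mu  # maximizing this maximizes
                                          # between-class variance
            if score > best:
                best, best_t = score, (t1, t2)
    return best_t

# Three well-separated intensity clusters, standing in for e.g.
# road surface, cars, and trucks on a 4-bit intensity scale.
img = np.array([1, 1, 2, 2, 7, 7, 8, 8, 13, 13, 14, 14])
t1, t2 = multi_otsu(img, levels=16)
```

The returned pair of thresholds separates the three intensity clusters, which in the paper's pipeline yields the car/truck distinction within the road mask.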
Procedia PDF Downloads 232
1931 A Decision Support System to Detect the Lumbar Disc Disease on the Basis of Clinical MRI
Authors: Yavuz Unal, Kemal Polat, H. Erdinc Kocer
Abstract:
In this study, a decision support system comprising three stages is proposed to detect disc abnormalities of the lumbar region. In the first stage, feature extraction, T2-weighted sagittal and axial Magnetic Resonance Images (MRI) were taken from 55 people, and 27 appearance and shape features were acquired from both sagittal and transverse images. In the second stage, the feature weighting process, the k-means clustering based feature weighting (KMCBFW) method proposed by Gunes et al. was used. Finally, in the third stage, the classification process, classifier algorithms including multi-layer perceptron (MLP, a neural network), support vector machine (SVM), Naïve Bayes, and decision tree were used to classify whether the subject has lumbar disc disease or not. To test the performance of the proposed method, the classification accuracy (%), sensitivity, specificity, precision, recall, f-measure, kappa value, and computation times were used. The best hybrid model is the combination of k-means clustering based feature weighting and the decision tree for detecting lumbar disc disease based on both sagittal and axial MR images.
Keywords: lumbar disc abnormality, lumbar MRI, lumbar spine, hybrid models, hybrid features, k-means clustering based feature weighting
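One plausible reading of k-means clustering based feature weighting is: cluster each feature's values, then weight the feature by the ratio of its grand mean to the mean of its cluster centers, pulling the classes toward a common center before classification. The exact formulation in Gunes et al. may differ; this is an illustrative sketch only:

```python
import numpy as np

def kmeans_1d(values, k=2, iters=25, seed=0):
    """Tiny 1-D k-means returning the final cluster centers."""
    rng = np.random.default_rng(seed)
    centers = rng.choice(values, size=k, replace=False).astype(float)
    for _ in range(iters):
        labels = np.argmin(np.abs(values[:, None] - centers[None, :]), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = values[labels == j].mean()
    return centers

def kmc_feature_weights(X, k=2):
    """Per-feature weight = grand mean / mean of k-means cluster centers.
    A hypothetical reading of KMCBFW, not necessarily the published one."""
    return np.array([col.mean() / kmeans_1d(col, k=k).mean() for col in X.T])

# Toy data: 4 subjects x 2 features (e.g. two MRI shape descriptors).
X = np.array([[1.0, 10.0], [2.0, 12.0], [9.0, 11.0], [10.0, 9.0]])
w = kmc_feature_weights(X)
X_weighted = X * w  # rescaled features then fed to MLP/SVM/decision tree
```

The weighted matrix `X_weighted` would then replace the raw features as input to the stage-three classifiers.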
Procedia PDF Downloads 520
1930 Methods of Variance Estimation in Two-Phase Sampling
Authors: Raghunath Arnab
Abstract:
Two-phase sampling, also known as double sampling, was introduced in 1938. In two-phase sampling, samples are selected in phases. In the first phase, a relatively large sample is selected by some suitable sampling design, and only information on the auxiliary variable is collected. During the second phase, a smaller sample is selected, either from the sample selected in the first phase or from the entire population, by a suitable sampling design, and information on both the study and auxiliary variables is collected. Evidently, two-phase sampling is useful if the auxiliary information is easier and cheaper to collect than the study variable, and if the strength of the relationship between the study and auxiliary variables is high. If the sample is selected in more than two phases, the resulting sampling design is called multi-phase sampling. In this article we consider how data collected at the first phase can be used at the stages of estimation of the parameter, stratification, selection of the sample, and their combinations in the second phase, in a unified setup applicable to any sampling design and to wide classes of estimators. The problem of the estimation of variance is also considered. The variance of an estimator is essential for estimating the precision of survey estimates, calculating confidence intervals, determining optimal sample sizes, and testing hypotheses, among other uses. Although the variance is a non-negative quantity, its estimators may not be non-negative. If an estimator of variance is negative, it cannot be used for estimation of confidence intervals, testing of hypotheses, or as a measure of sampling error. The non-negativity properties of the variance estimators are also studied in detail.
Keywords: auxiliary information, two-phase sampling, varying probability sampling, unbiased estimators
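The classic double-sampling setup can be made concrete with a ratio estimator: a large phase-1 sample measures only the cheap auxiliary variable x, a phase-2 subsample measures y as well, and a variance estimator combines the two phases. The estimator below is a standard textbook form, not necessarily the article's; finite population corrections are ignored:

```python
import numpy as np

rng = np.random.default_rng(1)

# Artificial population where y is roughly proportional to a cheap auxiliary x.
N = 10_000
x_pop = rng.uniform(10, 50, size=N)
y_pop = 2.0 * x_pop + rng.normal(scale=2.0, size=N)

# Phase 1: large sample, observe x only. Phase 2: subsample, observe y too.
n1, n2 = 1000, 100
idx1 = rng.choice(N, size=n1, replace=False)
idx2 = rng.choice(idx1, size=n2, replace=False)
xbar1 = x_pop[idx1].mean()
xbar2, ybar2 = x_pop[idx2].mean(), y_pop[idx2].mean()

# Ratio estimator of the population mean of y under double sampling.
r = ybar2 / xbar2
y_ratio = r * xbar1

# Textbook-style variance estimator: a phase-2 residual term plus a
# ratio-inflated phase-1 term (fpc ignored; illustrative only).
e = y_pop[idx2] - r * x_pop[idx2]
var_hat = e.var(ddof=1) / n2 + r**2 * x_pop[idx1].var(ddof=1) / n1
```

Note that this particular variance estimator is non-negative by construction, which is exactly the property the article examines for wider classes of estimators, where it can fail.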
Procedia PDF Downloads 588
1929 On Cloud Computing: A Review of the Features
Authors: Assem Abdel Hamed Mousa
Abstract:
The Internet of Things probably already influences your life. And if it doesn’t, it soon will, say computer scientists. Ubiquitous computing names the third wave in computing, just now beginning. First were mainframes, each shared by lots of people. Now we are in the personal computing era, person and machine staring uneasily at each other across the desktop. Next comes ubiquitous computing, or the age of calm technology, when technology recedes into the background of our lives. Alan Kay of Apple calls this "Third Paradigm" computing. Ubiquitous computing is essentially the term for human interaction with computers in virtually everything. Ubiquitous computing is roughly the opposite of virtual reality. Where virtual reality puts people inside a computer-generated world, ubiquitous computing forces the computer to live out here in the world with people. Virtual reality is primarily a horsepower problem; ubiquitous computing is a very difficult integration of human factors, computer science, engineering, and social sciences. The approach: activate the world. Provide hundreds of wireless computing devices per person per office, of all scales (from 1" displays to wall-sized). This has required new work in operating systems, user interfaces, networks, wireless, displays, and many other areas. We call our work "ubiquitous computing". This is different from PDAs, dynabooks, or information at your fingertips. It is invisible, everywhere computing that does not live on a personal device of any sort but is in the woodwork everywhere. The initial incarnation of ubiquitous computing was in the form of "tabs", "pads", and "boards" built at Xerox PARC, 1988-1994. Several papers describe this work, and there are web pages for the tabs and for the boards (which are a commercial product now). Ubiquitous computing will drastically reduce the cost of digital devices and tasks for the average consumer.
With labor-intensive components such as processors and hard drives stored in the remote data centers powering the cloud, and with pooled resources giving individual consumers the benefits of economies of scale, consumers can pay monthly fees, similar to a cable bill, for services that feed into their phones.
Keywords: internet, cloud computing, ubiquitous computing, big data
Procedia PDF Downloads 382
1928 Postmortem Magnetic Resonance Imaging as an Objective Method for the Differential Diagnosis of a Stillborn and a Neonatal Death
Authors: Uliana N. Tumanova, Sergey M. Voevodin, Veronica A. Sinitsyna, Alexandr I. Shchegolev
Abstract:
An important question in forensic and autopsy research in perinatology is that of live birth versus stillbirth. Postmortem magnetic resonance imaging (MRI) is an objective, non-invasive research method that allows data to be stored for a long time and the diagnosis to be clarified without exhuming the body. The purpose of the research is to study the ability of postmortem MRI to distinguish stillbirth from the death of a newborn who had spontaneous breathing and died on the first day after birth. MRI and morphological data were compared for 23 stillborn bodies, prenatally dead at a gestational age of 22-39 weeks (group I), and the bodies of 16 newborns who died from 2 to 24 hours after birth (group II). Before the autopsy, postmortem MRI was performed on a Siemens Magnetom Verio 3T device with the body in the supine position. The control group for MRI studies consisted of 7 live newborns without lung disease (group III). On T2-weighted images (T2WI) in the sagittal projection, the MR signal intensity (SI) was measured in the lung tissue (L) and shoulder muscle (M). During the autopsy, a pulmonary swimming test was evaluated, and macro- and microscopic studies were performed. According to the postmortem MRI, the highest mean SI values of the lung (430 ± 27.99) and of the muscle (405.5 ± 38.62) on T2WI were detected in group I, exceeding the corresponding values of group II by 2.7 times. The lowest values were found in the control group: 77.9 ± 12.34 and 119.7 ± 6.3, respectively. In group II, the lung SI was 1.6 times higher than the muscle SI, whereas in group I and in the control group, the muscle SI was 2.1 and 1.8 times larger than that of the lung, respectively. On the basis of clinical and morphological data, we derived a formula for a breathing index (BI) from postmortem MRI: BI = SIL x SIM / 100.
The mean value of BI in group I (1801.14 ± 241.6; range 756 to 3744) was significantly higher than the corresponding mean in group II (455.89 ± 137.32, p < 0.05; range 305 to 638.4). In the control group, the mean BI value was 91.75 ± 13.3 (range 53 to 154). The BI values were compared with the results of the pulmonary swimming tests and microscopic examination of the lungs. The boundary value of BI for the differential diagnosis of stillbirth and newborn death was 700. Postmortem MRI thus allows a stillborn to be differentiated from a newborn who breathed and then died.
Keywords: lung, newborn, postmortem MRI, stillborn
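The breathing index and its diagnostic cutoff translate directly into code. A minimal sketch using the group means reported in the abstract:

```python
def breathing_index(si_lung, si_muscle):
    """BI = SI_L * SI_M / 100, as defined in the abstract."""
    return si_lung * si_muscle / 100.0

def classify(bi, cutoff=700.0):
    """BI above the cutoff of 700 (from the abstract) suggests stillbirth;
    below it, a newborn that breathed before death."""
    return "stillborn" if bi > cutoff else "breathed"

# Group-I (stillborn) mean signal intensities from the abstract.
bi_group1 = breathing_index(430.0, 405.5)
print(bi_group1, classify(bi_group1))  # 1743.65 stillborn
print(classify(455.89))                # breathed (group-II mean BI)
```

The group-I mean SIs give a BI well above 700, and the reported group-II mean BI falls well below it, consistent with the proposed cutoff.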
Procedia PDF Downloads 128
1927 Influence of Solenoid Configuration on Electromagnetic Acceleration of Plunger
Authors: Shreyansh Bharadwaj, Raghavendra Kollipara, Sijoy C. D., R. K. Mittal
Abstract:
Utilizing the Lorentz force to propel an electrically conductive plunger through a solenoid represents a fundamental application of electromagnetism. The parameters of the solenoid significantly influence the force exerted on the plunger, impacting its response. A parametric study has been done to understand the effect of these parameters on the force acting on the plunger, with the aim of determining the combination of parameters that yields the fastest response. The analysis uses an algorithm capable of simulating a plunger undergoing acceleration within a solenoid. The authors have focused on several key configuration parameters of the solenoid: the inter-layer gap (in the case of a multi-layer solenoid), different conductor diameters, varying numbers of turns, and diverse numbers of layers. The primary objective of this paper is to discern how alterations in these parameters affect the force applied to the plunger. Through extensive numerical simulations, a dataset has been generated and used to construct informative plots. These plots provide visual representations of the relationships between the solenoid configuration parameters and the resulting force exerted on the plunger, which can further be used to deduce scaling laws. This research endeavors to offer valuable insights into optimizing solenoid configurations for enhanced electromagnetic acceleration, thereby contributing to advancements in electromagnetic propulsion technology.
Keywords: Lorentz force, solenoid configuration, electromagnetic acceleration, parametric analysis, simulation
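A simplified version of such a simulation can be sketched from the standard on-axis field formula for a finite solenoid, with the force on the plunger modeled as proportional to the axial gradient of B squared. That force model is an assumption appropriate for a small magnetizable slug, not the paper's actual model, and all parameter values below are invented:

```python
import math

MU0 = 4e-7 * math.pi  # vacuum permeability

def b_axis(z, n_per_m, current, length, radius):
    """On-axis magnetic field of a finite solenoid (standard formula)."""
    a, b = z + length / 2, z - length / 2
    return (MU0 * n_per_m * current / 2) * (
        a / math.hypot(a, radius) - b / math.hypot(b, radius))

def simulate(z0, mass, coupling, n_per_m, current, length, radius,
             dt=1e-5, steps=1000):
    """Euler integration of the plunger's motion. Force is modeled as
    coupling * d(B^2)/dz (assumption: small magnetizable slug)."""
    z, v, h = z0, 0.0, 1e-6
    for _ in range(steps):
        grad_b2 = (b_axis(z + h, n_per_m, current, length, radius) ** 2
                   - b_axis(z - h, n_per_m, current, length, radius) ** 2) / (2 * h)
        v += coupling * grad_b2 / mass * dt  # acceleration step
        z += v * dt                          # position step
    return z, v

# Invented parameters: a 10 cm solenoid, plunger starting at one end.
z_end, v_end = simulate(z0=-0.05, mass=0.01, coupling=5.0,
                        n_per_m=10_000, current=10.0, length=0.1, radius=0.01)
```

The plunger is pulled toward the field maximum at the solenoid's center; sweeping `n_per_m`, `radius`, or layer-dependent parameters in such a loop is the kind of parametric study the paper reports.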
Procedia PDF Downloads 47
1926 Higher Education Benefits and Undocumented Students: An Explanatory Model of Policy Adoption
Authors: Jeremy Ritchey
Abstract:
Undocumented immigrants in the U.S. face many challenges when looking to progress in society, especially when pursuing post-secondary education. The majority of research on state-level policy adoption pertaining to undocumented students' higher-education pursuits, specifically in-state resident tuition and financial aid eligibility policies, has framed the discussion around the potential and actual impacts that implementation can achieve and has achieved. What is missing is a model of the social, political, and demographic landscapes upon which such policies (in their various forms) find a route to legislative enactment. This research looks to address this gap by investigating the correlations and significant state-level variables which can be operationalized to construct a framework for the adoption of these specific policies. In the process, the analysis will show that past unexamined conceptualizations of how such policies come to fruition may be limited or contradictory when compared to available data. Drawing on the principles of policy innovation and policy diffusion theory, this study uses variables collected via Michigan State University's Correlates of State Policy Project, a collectively and continually compiled database of annual variables (1900-2016) from all 50 states relevant to policy research. Using established variable groupings (demographic, political, social capital measurements, and educational system measurements) from 2000 to 2014 (2001 being when such policies began), one can see how these data correlate with the adoption of policies related to undocumented students and in-state college tuition. Regression analysis illuminates which variables appear significant and to what effect, helping to formulate a model that explains when adoption occurs and when it does not.
Early results have shown that traditionally held conceptions of conservative and liberal state identities, as they relate to the likelihood of such policies being adopted, did not fall in line with the collected data. Democratic and liberally identified states were, overall, less likely to adopt pro-undocumented higher education policies than Republican and conservatively identified states, and vice versa. While further analysis is needed to improve the model's explanatory power, preliminary findings show promise in widening our understanding of policy adoption factors in this realm compared to the gap in such knowledge in the field's current publications. The model also looks to serve as an important tool for policymakers in framing such potential policies in a way that is congruent with the relevant state-level determining factors while being sensitive to the most apparent sources of potential friction. While additional variable groups and individual variables will ultimately need to be added and controlled for, this research has already begun to demonstrate how shallow or unexamined reasoning behind policy adoption on this topic needs to be addressed, or else erroneous conceptions risk leaking into the foundation of this growing and ever more important field.
Keywords: policy adoption, in-state tuition, higher education, undocumented immigrants
Procedia PDF Downloads 115
1925 An In-Depth Definition of the 24 Levels of Consciousness and Its Relationship to Buddhism and Artificial Intelligence
Authors: James V. Luisi
Abstract:
Understanding consciousness requires a synthesis of ideas from multiple disciplines, including obvious ones like psychology, biology, evolution, neurology, and neuroscience, as well as less obvious ones like protozoology, botany, entomology, carcinology, herpetology, mammalogy, and computer science. Furthermore, to incorporate the necessary backdrop, it is best presented against a theme of Eastern philosophy, specifically leveraging the teachings of Buddhism for their relevance to early thought on consciousness. These ideas are presented as a multi-level framework that illustrates the various aspects of consciousness within a tapestry of foundational and dependent building blocks, showing how living organisms evolved to understand elements of their reality sufficiently to survive and, in the case of Homo sapiens, eventually move beyond meeting the basic needs of survival, but also to achieve survival of the species beyond the eventual fate of our planet. This is not a complete system of thought, but a framework of consciousness gathering some of the key elements regarding the evolution of consciousness and the advent of free will, and presenting them in a unique way that encourages readers to continue the dialog and thought process as an experience to enjoy long after reading the last page. Readers are encouraged to think for themselves about the issues raised herein and to question every facet presented, as much further exploration is needed. Needless to say, this subject will remain a rapidly evolving one for quite some time to come, and it is probably in the interests of everyone to at least consider attaining both an ability and willingness to participate in the dialog.
Keywords: consciousness, sentience, intelligence, artificial intelligence, Buddhism
Procedia PDF Downloads 108
1924 Evaluating Machine Learning Techniques for Activity Classification in Smart Home Environments
Authors: Talal Alshammari, Nasser Alshammari, Mohamed Sedky, Chris Howard
Abstract:
With the widespread adoption of Internet-connected devices and the prevalence of Internet of Things (IoT) applications, there is an increased interest in machine learning techniques that can provide useful and interesting services in the smart home domain. The areas that machine learning techniques can help advance are varied and ever-evolving. Classifying smart home inhabitants' Activities of Daily Living (ADLs) is one prominent example. The ability of a machine learning technique to find meaningful spatio-temporal relations in high-dimensional data is an important requirement as well. This paper presents a comparative evaluation of state-of-the-art machine learning techniques for classifying ADLs in the smart home domain. Forty-two synthetic datasets and two real-world datasets with multiple inhabitants are used to evaluate and compare the performance of the identified machine learning techniques. Our results show significant performance differences between the evaluated techniques, which include AdaBoost, Cortical Learning Algorithm (CLA), Decision Trees, Hidden Markov Model (HMM), Multi-layer Perceptron (MLP), Structured Perceptron, and Support Vector Machines (SVM). Overall, neural-network-based techniques have shown superiority over the other tested techniques.
Keywords: activities of daily living, classification, internet of things, machine learning, prediction, smart home
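The evaluation pattern (several techniques run on the same labeled activity datasets and compared by accuracy) can be sketched with two toy classifiers on synthetic sensor features. The paper's actual techniques (AdaBoost, CLA, HMM, MLP, SVM, etc.) would slot into the same harness; everything below is invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic sensor features for three activities (e.g. sleeping,
# cooking, watching TV), 50 samples each, 4 features.
centers = np.array([[0.0, 0, 0, 0], [5, 0, 0, 0], [0, 5, 0, 0]])
X = np.vstack([c + rng.normal(size=(50, 4)) for c in centers])
y = np.repeat([0, 1, 2], 50)

perm = rng.permutation(len(y))
train, test = perm[:100], perm[100:]

def majority_baseline(Xtr, ytr, Xte):
    """Predict the most frequent training label for every test point."""
    return np.full(len(Xte), np.bincount(ytr).argmax())

def nearest_centroid(Xtr, ytr, Xte):
    """Assign each test point to the class with the closest mean."""
    cents = np.array([Xtr[ytr == k].mean(axis=0) for k in np.unique(ytr)])
    d = ((Xte[:, None, :] - cents[None, :, :]) ** 2).sum(-1)
    return d.argmin(axis=1)

results = {}
for name, clf in [("majority", majority_baseline),
                  ("nearest_centroid", nearest_centroid)]:
    pred = clf(X[train], y[train], X[test])
    results[name] = (pred == y[test]).mean()
```

Ranking the accuracies in `results` across many datasets is the comparison the paper performs at scale with its 44 datasets and seven techniques.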
Procedia PDF Downloads 357
1923 Biopsy or Biomarkers: Which Is the Sample of Choice in Assessment of Liver Fibrosis?
Authors: S. H. Atef, N. H. Mahmoud, S. Abdrahman, A. Fattoh
Abstract:
Background: The aim of the study is to assess the diagnostic value of the fibrotest and hyaluronic acid in discriminating between insignificant and significant fibrosis, and to determine whether these parameters could replace the liver biopsy currently used to select chronic hepatitis C patients eligible for antiviral therapy. Study design: This study was conducted on 52 patients with HCV RNA detected by polymerase chain reaction (PCR), who had undergone liver biopsy and were attending the internal medicine clinic at Ain Shams University Hospital. Liver fibrosis was evaluated according to the METAVIR scoring system on a scale of F0 to F4. The biochemical markers assessed were: alpha-2 macroglobulin (α2-MG), apolipoprotein A1 (Apo-A1), haptoglobin, gamma-glutamyl transferase (GGT), total bilirubin (TB), and hyaluronic acid (HA). The fibrotest score was computed after adjusting for age and gender. Predictive values and ROC curves were used to assess the accuracy of the fibrotest and HA results. Results: For the fibrotest, the observed area under the curve for the discrimination between minimal or no fibrosis (F0-F1) and significant fibrosis (F2-F4) was 0.6736 at a cutoff value of 0.19, with a sensitivity of 84.2% and a specificity of 85.7%. For HA, the sensitivity was 89.5%, the specificity 85.7%, and the area under the curve 0.540 at the best cutoff value of 71 mg/dL. Combined use of both parameters, HA at 71 mg/dL and a fibrotest score of 0.22, gave a sensitivity of 89.5%, a specificity of 100%, and an efficacy of 92.3% (AUC 0.895). Conclusion: The combination of fibrotest score and HA could serve as an alternative to biopsy in most patients with chronic hepatitis C, taking into consideration some limitations of the proposed markers in evaluating liver fibrosis.
Keywords: fibrotest, liver fibrosis, HCV RNA, biochemical markers
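The sensitivity and specificity figures at each cutoff come from a standard confusion-matrix calculation. A minimal sketch with invented fibrotest-like scores (not the study's data), where label 1 marks significant fibrosis (F2-F4):

```python
def sens_spec(scores, labels, cutoff):
    """Sensitivity and specificity of the rule 'score >= cutoff is positive'.
    labels: 1 = significant fibrosis (F2-F4), 0 = minimal or none (F0-F1)."""
    tp = sum(s >= cutoff and l == 1 for s, l in zip(scores, labels))
    fn = sum(s < cutoff and l == 1 for s, l in zip(scores, labels))
    tn = sum(s < cutoff and l == 0 for s, l in zip(scores, labels))
    fp = sum(s >= cutoff and l == 0 for s, l in zip(scores, labels))
    return tp / (tp + fn), tn / (tn + fp)

# Invented scores for eight hypothetical patients.
scores = [0.05, 0.10, 0.22, 0.15, 0.21, 0.25, 0.30, 0.40]
labels = [0,    0,    0,    0,    1,    1,    1,    1]

se, sp = sens_spec(scores, labels, cutoff=0.19)
print(se, sp)  # 1.0 0.75
```

Sweeping the cutoff over all observed scores and plotting sensitivity against 1 - specificity yields the ROC curve from which the study's AUC values are computed.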
Procedia PDF Downloads 287
1922 A Brief Exploration on the Green Urban Design for Carbon Neutrality
Authors: Gaoyuan Wang, Tian Chen
Abstract:
China's carbon peaking and carbon neutrality strategies are driving a transformation of development patterns and call for new green urban design thinking. This paper begins by tracing the evolution of green urban design thinking through the periods of carbon enlightenment, carbon dependency, and carbon decoupling from the perspective of the energy transition. Given the current energy situation, national strengths, and technological trends, the emergence of green urban design oriented toward carbon neutrality becomes inevitable. Based on a preliminary analysis of its connotation, the characteristics of this new type of green urban design are generalized as low-carbon orientation, carbon-related objects, carbon-reduction means, and carbon-control patterns. Its theory is briefly clarified in terms of human-earth synergism, quality-energy interconnection, and form-flow interpromotion. Its mechanism is then analyzed in combination with the core tasks of carbon neutrality, and the scope of design issues is defined, including carbon flow mapping, carbon source regulation, carbon sink construction, and carbon emission management. Finally, a multi-scale spatial response system is proposed across the region, city, cluster, and neighborhood levels. The discussion aims to support the innovation of green urban design theories and methods in the context of carbon peaking and neutrality.
Keywords: carbon neutrality, green urban design, energy transition, theoretical exploration
Procedia PDF Downloads 175
1921 The Cartometric-Geographical Analysis of Ivane Javakhishvili 1922: The Map of the Republic of Georgia
Authors: Manana Kvetenadze, Dali Nikolaishvili
Abstract:
The study revealed the territorial changes of Georgia before the Soviet and Post-Soviet periods. This includes the estimation of the country's borders and the changes in its administrative-territorial arrangement, as well as the establishment of territorial losses. Georgia’s old and new borders marked on the map are of great interest. The new boundary shows the situation in 1922, following the advent of the Soviet period. Neither on this map nor in other works does Ivane Javakhishvili explain what he means by the old borders, though it is evident that this is the pre-Soviet boundary up to 1921, i.e., from the period when historical Tao, Zaqatala, Lore, and Karaia were still parts of Georgia. In cartometric-geographical terms, the work presents a detailed analysis of Georgia’s borders, together with a comparison of research results: 1) the boundary line on Soviet topographic maps at scales of 1:100,000, 1:50,000 and 1:25,000; 2) the boundary according to Ivane Javakhishvili’s work ('The Borders of Georgia in Terms of Historical and Contemporary Issues'). During the research, we used a multidisciplinary methodology and software. We used ArcGIS for georeferencing the maps and then compared all post-Soviet maps in order to determine how the borders had changed. We also drew on many historical data sources. The features of the spatial distribution of the administrative-territorial units of Georgia, as well as of the objects depicted on the map, have been established. The results obtained are presented in the form of thematic maps and diagrams. Keywords: border, GIS, Georgia, historical cartography, old maps
Procedia PDF Downloads 242
1920 Nano-Bioremediation of Contaminated Industrial Wastewater Using Biosynthesized AgNPs and Their Nano-Composite
Authors: Osama M. Darwesh, Sahar H. Hassan, Abd El-Raheem R. El-Shanshoury, Shawky Z. Sabae
Abstract:
Nanotechnology, as a multidisciplinary technology, is growing rapidly, with important applications in several sectors. Nanobiotechnology, in turn, is known for the use of microorganisms in the synthesis of targeted nanoparticles. The present study deals with the green synthesis of silver nanoparticles using aquatic bacteria and the development of a biogenic nanocomposite for environmental applications. Twenty morphologically different colonies were isolated from water samples collected at eight different locations along the Rosetta branch of the Nile Delta, Egypt. The obtained results illustrated that the most effective bacterial isolate (producing the highest amount of AgNPs after 24 h of incubation) was isolate R3. Bacillus tequilensis was the strongest extracellular bio-manufacturer of AgNPs. The biosynthesized nanoparticles had a spherical shape with diameters ranging from 2.74 to 28.4 nm. The antimicrobial activity of the silver nanoparticles against many pathogenic microbes indicated that the produced AgNPs had high activity against all tested multi-antibiotic-resistant pathogens. In addition, the stabilized AgNPs-SA nanocomposite showed greater catalytic activity for the decolourization of dyes such as Methylene Blue (MB) and Crystal Violet. Such results represent a promising step towards producing eco-friendly, cost-effective, and easy-to-handle devices for the bioremediation of contaminated industrial wastewater. Keywords: bioremediation, AgNPs, AgNPs-SA nanocomposite, Bacillus tequilensis, nanobiotechnology
Procedia PDF Downloads 68
1919 RV-YOLOX: Object Detection on Inland Waterways Based on Optimized YOLOX Through Fusion of Vision and 3+1D Millimeter Wave Radar
Authors: Zixian Zhang, Shanliang Yao, Zile Huang, Zhaodong Wu, Xiaohui Zhu, Yong Yue, Jieming Ma
Abstract:
Unmanned Surface Vehicles (USVs) are valuable due to their ability to perform dangerous and time-consuming tasks on the water, and object detection is significant in these applications. However, inherent challenges, such as the complex distribution of obstacles, reflections from shore structures, and water surface fog, hinder the object detection performance of USVs. To address these problems, this paper presents a fusion method for USVs to effectively detect objects in inland surface environments, utilizing vision sensors and 3+1D millimeter-wave (MMW) radar. MMW radar is complementary to vision sensors, providing robust environmental information. The 3D radar point cloud is transformed into a 2D radar pseudo-image by a point transformer, unifying the radar and vision data formats. We propose a multi-source object detection network (RV-YOLOX) based on radar-vision fusion for inland waterway environments. The performance is evaluated on our self-recorded waterways dataset. Compared with the YOLOX network, our fusion network significantly improves detection accuracy, especially for objects in poor lighting conditions. Keywords: inland waterways, YOLO, sensor fusion, self-attention
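The radar-to-pseudo-image step can be illustrated with a deliberately simplified sketch: here the 3D point cloud is simply binned into a bird's-eye-view occupancy grid, whereas the paper uses a learned point transformer; the ranges, cell size and points below are assumed for illustration only:

```python
# Illustrative stand-in for the paper's point-transformer step: bin a 3D
# radar point cloud into a 2D bird's-eye-view pseudo-image so the radar
# data shares the grid format expected by a vision detector.

def radar_to_pseudo_image(points, x_range=(0.0, 40.0), y_range=(-20.0, 20.0),
                          cell=10.0):
    """points: list of (x, y, z, radial_velocity); returns a 2D count grid."""
    nx = int((x_range[1] - x_range[0]) / cell)
    ny = int((y_range[1] - y_range[0]) / cell)
    grid = [[0] * nx for _ in range(ny)]
    for x, y, z, v in points:
        if x_range[0] <= x < x_range[1] and y_range[0] <= y < y_range[1]:
            i = int((y - y_range[0]) / cell)  # row index along y
            j = int((x - x_range[0]) / cell)  # column index along x
            grid[i][j] += 1
    return grid

# Hypothetical radar returns: two from one obstacle, one from another
points = [(5.0, -15.0, 0.2, 1.1), (5.5, -15.5, 0.1, 1.0), (35.0, 18.0, 0.0, 0.0)]
img = radar_to_pseudo_image(points)
```

In a full pipeline, additional channels (e.g. velocity or reflected power per cell) would be stacked alongside the counts before feeding the pseudo-image to the fusion network.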
Procedia PDF Downloads 124
1918 Application of Electrochemically Prepared PPy/MWCNT:MnO2 Nano-Composite Film in Microbial Fuel Cells for Sustainable Power Generation
Authors: Rajeev Jain, D. C. Tiwari, Praveena Mishra
Abstract:
A nano-composite of polypyrrole/multiwalled carbon nanotubes:manganese oxide (PPy/MWCNT:MnO2) was electrochemically deposited on the surface of carbon cloth (CC). The nano-composite was structurally characterized by FTIR, SEM, TEM and UV-Vis studies, and further characterized by cyclic voltammetry (CV) and current-voltage (I-V) measurements; the optical band gaps of the film were evaluated from the UV-Vis absorption studies. The PPy/MWCNT:MnO2 nano-composite was used as the anode in a microbial fuel cell (MFC) for sewage wastewater treatment and for power and coulombic efficiency measurements. The prepared electrode showed good electrical conductivity (0.1185 S m-1), which was supported by the band gap measurements (direct 0.8 eV, indirect 1.3 eV). The maximum power density obtained was 1125.4 mW m-2, the highest chemical oxygen demand (COD) removal efficiency was 93%, and the maximum coulombic efficiency was 59%. To our knowledge, this is the first report of a PPy/MWCNT:MnO2 nano-composite electrode for MFCs, offering good stability and better adhesion of microbes. The SEM images confirm the growth and development of microbial colonies. Keywords: carbon cloth, electro-polymerization, functionalization, microbial fuel cells, multi-walled carbon nanotubes, polypyrrole
Procedia PDF Downloads 271
1917 The Numerical and Experimental Analysis of Compressed Composite Plate in Asymmetrical Arrangement of Layers
Authors: Katarzyna Falkowicz
Abstract:
The work focuses on an original concept of a thin-walled plate element with a cut-out, for use as a spring or load-bearing element. The subjects of the study were rectangular plates with a cut-out of variable geometrical parameters and a variable angle of fiber arrangement, made of a carbon-epoxy composite with high strength properties in an asymmetrical layup, subjected to uniform compression. The influence of the geometrical parameters of the cut-out and the angle of fiber arrangement on the critical load of the structure and the buckling form was investigated. Uniform thin plates are relatively cheap to manufacture; however, due to their low bending stiffness, they can carry only relatively small loads. The lowest form of loss of plate stability, the bending form, leads to rapid destruction due to large deflection increases with only a slight increase in compressive load, i.e. low rigidity of the structure. However, the stiffness characteristics of the structure change significantly when the plate is forced to work in the higher, flexural-torsional form of buckling. The plate is then able to carry a much higher compressive load while maintaining much stiffer working characteristics in the post-critical range. Calculations carried out earlier show that plates with a forced higher buckling form are characterized by stable, progressive post-critical equilibrium paths, enabling their use as elastic elements. The characteristics of such elements can be designed over a wide range by changing the geometrical parameters of the cut-out, i.e. its height and width, and by changing the angle of fiber arrangement. The commercial ABAQUS program, using the finite element method, was employed to develop the discrete model and perform the numerical calculations.
The obtained results are of significant practical importance in the design of structures with elastic elements, allowing the required operating characteristics of the device to be achieved. Keywords: buckling mode, numerical method, unsymmetrical laminates, thin-walled elastic elements
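For orientation, the kind of critical load these analyses compute can be illustrated by the classical closed-form result for a simply supported, specially orthotropic laminated plate (without a cut-out) under uniaxial compression; this is a textbook expression shown only to indicate which bending stiffnesses govern the critical load, not a formula from the study itself, which treats the cut-out geometry numerically:

```latex
% Classical buckling load of a simply supported, specially orthotropic plate
% of length a and width b under uniaxial compression, with m half-waves and
% bending stiffnesses D_ij; shown for orientation only.
N_{x,\mathrm{cr}} = \frac{\pi^2}{b^2}\left[ D_{11}\left(\frac{m b}{a}\right)^{2}
  + 2\left(D_{12} + 2 D_{66}\right)
  + D_{22}\left(\frac{a}{m b}\right)^{2} \right]
```

Changing the fiber angles changes the D_ij terms, which is why the layup so strongly affects both the critical load and the buckling form in the study.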
Procedia PDF Downloads 105
1916 Mobility-Aware Relay Selection in Two Hop Unmanned Aerial Vehicles Network
Authors: Tayyaba Hussain, Sobia Jangsher, Saqib Ali, Saqib Ejaz
Abstract:
Unmanned Aerial Vehicles (UAVs) have gained great popularity due to their remote operation, ease of deployment and high maneuverability in applications such as real-time surveillance, image capturing, atmospheric and weather studies, disaster site monitoring and mapping. These applications can involve real-time communication with a ground station. However, altitude and mobility pose challenges for this communication: UAVs at high altitude usually require more transmit power. One possible solution is the use of multiple hops (UAVs acting as relays) while exploiting the mobility pattern of the UAVs. In this paper, we study relay selection for reliable transmission to a destination UAV. We exploit the mobility information of the UAVs to propose a Mobility-Aware Relay Selection (MARS) algorithm with the objective of improving achievable data rates. The results are compared with a non-mobility-aware relay selection scheme and with optimal values. Numerical results show that the proposed MARS algorithm gives 6% better achievable data rates for mobile UAVs compared with the non-mobility-aware relay selection scheme, while on average a 20.2% decrease in data rate is observed for MARS compared with the SDP solver in Yalmip. Keywords: mobility aware, relay selection, time division multiple access, unmanned aerial vehicle
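The idea behind mobility-aware selection can be sketched as follows. This is not the MARS algorithm itself: the sketch merely predicts each candidate relay's position from its velocity and picks the relay maximizing the bottleneck rate of the two-hop link; the path-loss model, SNR constant, positions and velocities are all assumed for illustration:

```python
import math

# Simplified illustration of mobility-aware relay selection: predict each
# candidate relay's position over a horizon t, then choose the relay that
# maximizes the end-to-end (minimum of the two hops) Shannon rate.

def rate(p1, p2, snr0=1e4, alpha=2.0):
    """Shannon rate of one hop with distance-based path loss (assumed model)."""
    d = math.dist(p1, p2)
    return math.log2(1.0 + snr0 / d ** alpha)

def select_relay(src, dst, relays, t=1.0):
    """relays: dict name -> (position, velocity); returns the best relay name."""
    best, best_rate = None, -1.0
    for name, (pos, vel) in relays.items():
        pred = tuple(p + v * t for p, v in zip(pos, vel))  # mobility awareness
        r = min(rate(src, pred), rate(pred, dst))          # two-hop bottleneck
        if r > best_rate:
            best, best_rate = name, r
    return best

src, dst = (0.0, 0.0, 100.0), (1000.0, 0.0, 100.0)
relays = {
    "uav1": ((400.0, 300.0, 120.0), (0.0, -300.0, 0.0)),  # turning onto the path
    "uav2": ((500.0, 0.0, 120.0), (0.0, 500.0, 0.0)),     # flying far off the path
}
best = select_relay(src, dst, relays)
```

A non-mobility-aware scheme would evaluate the same rates at the current positions only, which here would favor the relay that is about to leave the source-destination axis.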
Procedia PDF Downloads 238
1915 Impact of Information and Communication Technology on Academic Performance of Senior Secondary Schools Students in Gwagwalada Area Council of Federal Capital Territory, Abuja
Authors: Suleiman Garba, Haruna Ishaku
Abstract:
Information and communication technology (ICT) includes any communication device, encompassing radio, television, cellular phones, computers and satellite systems, as well as the various services and applications associated with them. The significance of ICT in education cannot be over-emphasized: teaching and learning processes have been integrated with ICT applications for effectiveness and for the enhancement of students' academic performance. Today, as the educational sector faces a series of changes and reforms, information technology illiteracy remains a serious problem among school teachers in the country, cutting across primary schools, secondary schools and tertiary institutions. This study investigated the impact of ICT on the academic performance of senior secondary school students in Gwagwalada Area Council of the Federal Capital Territory (FCT), Abuja. A sample of 120 SSS III students was involved in the study, selected using a simple random sampling technique. A questionnaire was developed and validated through expert judgement, and a reliability coefficient of 0.81 was obtained; it was used to gather relevant data from the respondents. Findings revealed a positive impact of ICT on the academic performance of senior secondary school students. The findings also identified causes of poor academic performance among the students, including a lack of qualified teachers, peer group influence, and bullying. The null hypotheses were tested using a t-test at the 0.05 level of significance, and a significant difference was found between male and female senior secondary school students in the impact of ICT on academic performance in Gwagwalada Area Council of FCT, Abuja.
Based on these findings, some recommendations were made, including that adequate funds should be provided for the procurement of ICT resources and relevant textbooks to enhance students' active participation in learning processes, and that students should be provided with internet access at an inexpensive rate so as to create a platform for accessing useful information in the pursuit of academic excellence. Keywords: academic performance, impact, information communication technology, schools, students
Procedia PDF Downloads 219
1914 Language Rights and the Challenge of National Integration: The Nigerian Experience
Authors: Odewumi Olatunde, Adegun Sunday
Abstract:
Linguistic diversity is seen to complicate attempts to build a stable and cohesive political community; hence, the challenge of integration is enormous in a multi-ethno-lingual country like Nigeria. At the same time, the justification of minority language rights claims in relation to broader political theories of justice, freedom and democracy cannot be ignored. It is in the light of the foregoing that this paper explores Nigeria's experiments with language policy and planning (LPP) and the long-drawn agitations for self-determination and linguistic freedom by the minority ethnic groups in the polity, which have been exacerbated by the language provisions of the National Policy on Education. The paper succinctly reviews Nigeria's LPP efforts and their attendant theater of conflicts, explores international attempts at evolving normative principles of freedom and equality for language policy, and finally evaluates the position of Nigerian LPP in the light of evolving international conventions. On this premise, it is concluded that, given a conscientious and honest implementation of the Nigerian language provisions as assessed from their face validity, the nation's efforts could be exonerated from running afoul of any known civilized values and best practices. It is therefore recommended that an effectual and consistent commitment to implementation, driven by a renewed political will, is what is required for the nation to succeed in this direction. Keywords: integration, rights, challenge, conventions, policy
Procedia PDF Downloads 414
1913 Analysis of Standard Tramway Surge Protection Methods Based on Real Cases
Authors: Alain Rousseau, Alfred Aragones, Gilles Rougier
Abstract:
The study is based on lightning and surge standards, mainly the EN 62305 series for facility protection, the EN 61643 series for low-voltage surge protective devices, the high-voltage surge arrester standard EN 60099-4, and the traction arrester standards EN 50526-1 and EN 50526-2, dealing respectively with D.C. surge arresters and voltage limiting devices for railway fixed installations. The most severe stress for tramway installations is caused by a direct lightning strike on the catenary line. In such a case, the surge current propagates towards the various poles and sparks over the insulators, leading to a lower stress. If the impact point is near enough, a significant surge current will flow towards the traction surge arrester installed on the catenary at the location where the substation is connected. Another surge arrester can be installed at the entrance of the substation, or even inside the rectifier, to avoid insulation damage. In addition, surge arresters can be installed between + and - to avoid damaging sensitive circuits. Based on disturbances encountered in a substation following a lightning event, the engineering department of RATP decided to investigate the cause of the damage and, more generally, to question the efficiency of the various possible protection means. Using the example of a recent tramway line, the paper presents the results of a lightning study based on direct lightning strikes. As a matter of fact, induced surges on the catenary are much more frequent but much less damaging. First, a lightning risk assessment is performed for the substations, taking into account direct and induced lightning both on the substation and on its connected lines, such as the catenary. The efficiency of the various surge arresters is then discussed, based on field experience and calculations. The efficiency of the earthing system used at the bottom of the poles is also addressed, based on high-frequency earthing measurements.
As a conclusion, the paper makes recommendations for enhancing the efficiency of existing protection means. Keywords: surge arrester, traction, lightning, risk, surge protective device
Procedia PDF Downloads 259
1912 Anomaly Detection in a Data Center with a Reconstruction Method Using a Multi-Autoencoders Model
Authors: Victor Breux, Jérôme Boutet, Alain Goret, Viviane Cattin
Abstract:
Early detection of anomalies in data centers is important to reduce downtime and the costs of periodic maintenance. However, there is little research on this topic and even less on the fusion of sensor data for the detection of abnormal events. The goal of this paper is to propose a method for anomaly detection in data centers by combining sensor data (temperature, humidity, power) and deep learning models. The model described in the paper uses one autoencoder per sensor to reconstruct the inputs. The autoencoders contain Long Short-Term Memory (LSTM) layers and are trained using the normal samples of the relevant sensors, selected by correlation analysis. The difference signal between the input and its reconstruction is then used to classify the samples using feature extraction and a random forest classifier. The data measured by the sensors of a data center between January 2019 and May 2020 are used to train the model, while the data between June 2020 and May 2021 are used to assess it. The performance of the model is assessed a posteriori through the F1-score, by comparing detected anomalies with the data center's history. The proposed model outperforms the state-of-the-art reconstruction method, which uses only one autoencoder taking multivariate sequences and detects an anomaly with a threshold on the reconstruction error, with an F1-score of 83.60% compared to 24.16%. Keywords: anomaly detection, autoencoder, data centers, deep learning
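The reconstruction pipeline can be miniaturized as follows. This sketch is only a stand-in for the paper's model: a trivial mean reconstructor replaces each LSTM autoencoder, and a fixed threshold replaces the random forest classifier; the sensor data and threshold are invented for illustration:

```python
import statistics

# Simplified stand-in for the paper's pipeline: one reconstruction model per
# sensor, trained on normal samples only; at test time the per-sensor
# reconstruction errors form a feature vector that a classifier (a fixed
# threshold here, a random forest in the paper) turns into a verdict.

class MeanReconstructor:
    """Per-sensor model: 'reconstructs' every sample as the training mean."""
    def fit(self, normal_samples):
        self.mean = statistics.fmean(normal_samples)
        self.scale = statistics.pstdev(normal_samples) or 1.0
        return self

    def error(self, x):
        return abs(x - self.mean) / self.scale  # normalized residual

def detect(models, sample, threshold=3.0):
    """sample: dict sensor -> value; anomaly if any residual exceeds threshold."""
    errors = {s: m.error(sample[s]) for s, m in models.items()}
    return max(errors.values()) > threshold, errors

# Hypothetical normal training data per sensor
normal = {
    "temperature": [21.0, 21.5, 22.0, 21.2, 21.8],
    "humidity": [45.0, 44.0, 46.0, 45.5, 44.5],
    "power": [3.0, 3.1, 2.9, 3.0, 3.05],
}
models = {s: MeanReconstructor().fit(v) for s, v in normal.items()}
is_anomaly, _ = detect(models, {"temperature": 35.0, "humidity": 45.0, "power": 3.0})
```

The key design choice carried over from the paper is training only on normal data: the models learn to reconstruct normal behavior well, so an abnormal sample shows up as a large reconstruction error on at least one sensor.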
Procedia PDF Downloads 194
1911 Pain Control by Ketamine in Combat Situation; Consideration and Outcomes
Authors: Mohammad Javad Behzadnia, Hamidreza Javadzadeh
Abstract:
Background: Pain management is essential when treating multiply injured casualties in an overcrowded emergency setting. Its role becomes even more apparent when the physician encounters mass casualties in a war zone or in military prehospital care. Its sedative and analgesic properties, rapid onset and offset, and preservation of cardiovascular and respiratory function are the main reasons for selecting ketamine as a good choice in the war zone. Methods: In a prospective interventional study in a war zone, we selected and followed two groups of casualties for pain management. All were men, with average ages of 26.6 ± 8 and 27.5 ± 7 years in groups A and B, respectively. Group A received only ketamine, and group B received ketamine and diazepam. Results: This study showed that all injured patients who received ketamine alone experienced some agitation and could ultimately need benzodiazepines for sedation, whereas in group B, which received a benzodiazepine before or simultaneously with ketamine, agitation was significantly reduced (p ≤ 0.05). Conclusion: Various factors may affect pain score and perception; patients' culture, mental health, previous drug use, and addiction could alter the pain score in similar situations. The significant agitation appears to be due to catecholamine release in the stressful moments of the battlefield, and this situation could be exacerbated by ketamine's properties. Nonetheless, as a good choice in the war zone, ketamine is now recommended in combination with benzodiazepines for procedural sedation and analgesia (PSA). Keywords: battlefield, ketamine, benzodiazepine, pain control
Procedia PDF Downloads 102
1910 Use of Alternative Water Sources Based on a Rainwater in the Multi-Dwelling Urban Building 2030
Authors: Monika Lipska
Abstract:
Drinking water is water of very high quality, and as such represents only 2.5% of the total quantity of all water in the world. For many years we have observed a continuous increase in its consumption as a result of many factors, such as the growing world population (7 billion in 2011), the increasing comfort of human life and, above all, economic growth. Due to the rocketing consumption and the growing costs of producing water with such high quality parameters, interest in alternative sources of potable water is accelerating. One way of saving this valuable resource is using rainwater in urban buildings. With exponentially growing demand, the acquisition of additional sources of water is necessary to maintain the proper balance of all ecosystems. The first part of the paper describes what rainwater is and what its potential sources and means of use are, while the main part of the article focuses on describing the methods of obtaining water from rain, using the example of a new urban building in Poland. It describes the method and installations for rainwater use in the new urban building ('MBJ2030'). The paper also addresses the monitoring of the whole recycling system, as well as the particular quality indicators that are important for identifying potential risks to human health. The third part describes the legal arrangements concerning the recycling of rainwater in different European Union countries, with particular reference to Poland, using the example of the new urban building in Warsaw. Keywords: rainwater, potable water, non-potable water, Poland
Procedia PDF Downloads 414
1909 Using Squeezed Vacuum States to Enhance the Sensitivity of Ground Based Gravitational Wave Interferometers beyond the Standard Quantum Limit
Authors: Giacomo Ciani
Abstract:
This paper reviews the impact of quantum noise on modern gravitational wave interferometers and explains how squeezed vacuum states are used to push the noise below the standard quantum limit. With the first detection of gravitational waves from a pair of colliding black holes in September 2015, and subsequent detections including that of gravitational waves from a pair of colliding neutron stars, the ground-based interferometric gravitational wave observatories LIGO and VIRGO have opened the era of gravitational-wave and multi-messenger astronomy. Improving the sensitivity of the detectors is of paramount importance to increase the number and quality of the detections, fully exploiting this new information channel about the universe. Although still in the commissioning phase and not at nominal sensitivity, these interferometers are designed to be ultimately limited by a combination of shot noise and quantum radiation pressure noise, which define an envelope known as the standard quantum limit. Despite the name, this limit can be beaten with the use of advanced quantum measurement techniques, with the use of squeezed vacuum states being currently the most mature and promising. Different strategies for implementation of the technology in the large-scale detectors, in both their frequency-independent and frequency-dependent variations, are presented, together with an analysis of the main technological issues and expected sensitivity gain. Keywords: gravitational waves, interferometers, squeezed vacuum, standard quantum limit
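The trade-off described here can be summarized with the textbook quantum-noise scalings (these are standard expressions from the gravitational-wave literature, not taken from the abstract):

```latex
% Shot noise falls and radiation-pressure noise rises with circulating
% power P; their envelope is the standard quantum limit for mirrors of
% mass M in arms of length L, at angular frequency \Omega.
h_{\mathrm{shot}} \propto \frac{1}{\sqrt{P}}, \qquad
h_{\mathrm{rp}} \propto \frac{\sqrt{P}}{M \Omega^{2}}, \qquad
h_{\mathrm{SQL}}(\Omega) = \sqrt{\frac{8\hbar}{M \Omega^{2} L^{2}}}
```

Injecting squeezed vacuum rescales the two quantum noises by e^{-r} or e^{+r} depending on the squeezed quadrature, which is why the frequency-dependent variant (rotating the squeeze angle across the band) is needed to beat the standard quantum limit at all frequencies rather than trading one noise against the other.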
Procedia PDF Downloads 151
1908 Naturalization of Aliens in Consideration of Turkish Constitutional Law: Recent Governmental Practices
Authors: Zeynep Ozkan, Cigdem Serra Uzunpinar
Abstract:
Citizenship is a legal bond that binds a person to a certain state. How constitutions define 'the citizen' and how they regulate the elements of citizenship have great importance for individuals' duties before the state as well as for the rights they hold. Especially in multi-segmented societies that contain foreign elements, it becomes necessary to examine the institution of naturalization in terms of the duty of constitutional citizenship. The meaning of citizenship in Turkey has been transformed by changes in naturalization practices, in parallel with the reception of a huge number of immigrants during the recent Syrian crisis, the change in the governmental system, and the economic crisis. This transformation has taken the form of a departure from the state's initial motive of building the bond of citizenship with the aim of founding and sustaining political unity. Hence, economic and political motives are coming to the fore in naturalization practices, instead of the objective and subjective criteria traditionally used to define the notion of nation. This study first presents the citizenship regime and the legal regime governing aliens in Turkish legislation. It then examines the transformation that the notion of constitutional citizenship has undergone, especially on the basis of governmental naturalization practices. The assessment is made in the context of the legal institutions introduced with the new governmental system resulting from the recent constitutional amendment. Keywords: constitutional citizenship, naturalization, naturalization practices in Turkish legal system, transformation of the notion of constitutional citizenship
Procedia PDF Downloads 118
1907 Cybernetic Modeling of Growth Dynamics of Debaryomyces nepalensis NCYC 3413 and Xylitol Production in Batch Reactor
Authors: J. Sharon Mano Pappu, Sathyanarayana N. Gummadi
Abstract:
Growth of Debaryomyces nepalensis on mixed substrates in batch culture follows a diauxic pattern: glucose is completely utilized during the first exponential growth phase, followed by an intermediate lag phase and a second exponential growth phase consuming xylose. The present study deals with the development of a cybernetic mathematical model for the prediction of xylitol production and yield. Production of xylitol from xylose in batch fermentation is investigated in the presence of glucose as the co-substrate, and different ratios of glucose to xylose concentrations are assessed to study the impact of multiple substrates on xylitol production in batch reactors. The parameters in the model equations were estimated from experimental observations using the integral method, and the model equations were solved simultaneously by a numerical technique in MATLAB. The developed cybernetic model of xylose fermentation in the presence of a co-substrate can provide answers about how the ratio of glucose to xylose influences the yield and rate of xylitol production. The model is expected to accurately predict the growth of the microorganism on mixed substrates, the duration of the intermediate lag phase, substrate consumption, and xylitol production. A model developed within the cybernetic modelling framework can be helpful in simulating the dynamic competition between metabolic pathways. Keywords: co-substrate, cybernetic model, diauxic growth, xylose, xylitol
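The pathway competition that the cybernetic framework captures can be sketched with a toy diauxic simulation. All kinetic parameters below are assumed for illustration, not the paper's estimates, and the paper's xylitol balance is omitted; the cybernetic activity variable v_i = r_i / max_j r_j suppresses the slower pathway, reproducing preferential glucose consumption before xylose:

```python
# Toy cybernetic model of diauxic growth on a glucose/xylose mixture,
# integrated with explicit Euler steps. Monod kinetics per substrate; the
# cybernetic variable v_i allocates activity to the faster pathway.

def simulate(glc=10.0, xyl=10.0, x0=0.1, dt=0.01, t_end=20.0):
    mu = {"glc": 0.4, "xyl": 0.2}  # max specific growth rates (1/h), assumed
    ks = {"glc": 0.5, "xyl": 1.0}  # saturation constants (g/L), assumed
    yd = {"glc": 0.5, "xyl": 0.4}  # biomass yields (g/g), assumed
    s = {"glc": glc, "xyl": xyl}   # substrate concentrations (g/L)
    x = x0                         # biomass (g/L)
    t = 0.0
    while t < t_end:
        r = {i: mu[i] * s[i] / (ks[i] + s[i]) for i in s}  # Monod rates
        rmax = max(r.values()) or 1.0
        v = {i: r[i] / rmax for i in s}  # cybernetic activity control
        growth = sum(v[i] * r[i] for i in s) * x
        for i in s:
            s[i] = max(0.0, s[i] - v[i] * r[i] * x / yd[i] * dt)
        x += growth * dt
        t += dt
    return s, x

s_end, biomass = simulate(t_end=10.0)
```

Because v_glc stays near 1 while glucose is present and v_xyl is throttled, glucose is drawn down well ahead of xylose; the full cybernetic formulation adds enzyme synthesis and dilution terms, which produce the intermediate lag phase between the two growth phases.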
Procedia PDF Downloads 328