Search results for: unified commensurate multiple
4339 Classification of Land Cover Usage from Satellite Images Using Deep Learning Algorithms
Authors: Shaik Ayesha Fathima, Shaik Noor Jahan, Duvvada Rajeswara Rao
Abstract:
Earth's environment and its evolution can be seen through satellite images in near real-time. Through satellite imagery, remote sensing data provide crucial information that can be used for a variety of applications, including image fusion, change detection, land cover classification, agriculture, mining, disaster mitigation, and monitoring climate change. The objective of this project is to propose a method for classifying satellite images according to multiple predefined land cover classes. The proposed approach involves collecting data in image format. The data is then pre-processed using data pre-processing techniques. The processed data is fed into the proposed algorithm and the obtained result is analyzed. Some of the algorithms used in satellite imagery classification are U-Net, Random Forest, Deep Labv3, CNN, ANN, Resnet etc. In this project, we are using the DeepLabv3 (Atrous convolution) algorithm for land cover classification. The dataset used is the deep globe land cover classification dataset. DeepLabv3 is a semantic segmentation system that uses atrous convolution to capture multi-scale context by adopting multiple atrous rates in cascade or in parallel to determine the scale of segments.Keywords: area calculation, atrous convolution, deep globe land cover classification, deepLabv3, land cover classification, resnet 50
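To make the segmentation pipeline above concrete, the following is a minimal PyTorch sketch of training a DeepLabv3 (atrous convolution) model with a ResNet-50 backbone on satellite tiles, assuming a recent torchvision; the 7-class DeepGlobe-style setup, the 512x512 tile size and the dummy tensors are illustrative assumptions, not details taken from the paper.

```python
import torch
import torch.nn as nn
from torchvision.models.segmentation import deeplabv3_resnet50

# DeepLabv3 with a ResNet-50 backbone for land-cover segmentation.
# The 7-class setup (DeepGlobe-style) and the 512x512 tile size are assumptions.
NUM_CLASSES = 7
model = deeplabv3_resnet50(weights=None, num_classes=NUM_CLASSES)

criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

# One illustrative training step on a dummy batch of RGB tiles and label masks.
images = torch.randn(2, 3, 512, 512)                  # (batch, channels, H, W)
masks = torch.randint(0, NUM_CLASSES, (2, 512, 512))  # per-pixel class indices

optimizer.zero_grad()
logits = model(images)["out"]                         # (batch, classes, H, W)
loss = criterion(logits, masks)
loss.backward()
optimizer.step()

# At inference time, the per-pixel class map is the argmax over the class axis,
# from which class-wise areas can be obtained by counting pixels.
pred = logits.argmax(dim=1)
```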
Procedia PDF Downloads 140
4338 The Impact of Missense Mutation in Phosphatidylinositol Glycan Class A Associated to Paroxysmal Nocturnal Hemoglobinuria and Multiple Congenital Anomalies-Hypotonia-Seizures Syndrome 2: A Computational Study
Authors: Ashish Kumar Agrahari, Amit Kumar
Abstract:
Paroxysmal nocturnal hemoglobinuria (PNH) is an acquired clonal blood disorder that manifests with hemolytic anemia, thrombosis, and peripheral blood cytopenias. The disease is caused by the deficiency of two glycosylphosphatidylinositol (GPI)-anchored proteins (CD55 and CD59) in the hemopoietic stem cells. The deficiency of GPI-anchored proteins has been associated with somatic mutations in phosphatidylinositol glycan class A (PIGA). However, the mutations that do not cause PNH are associated with multiple congenital anomalies-hypotonia-seizures syndrome 2 (MCAHS2). To the best of our knowledge, no computational study has been performed to explore the atomistic-level impact of PIGA mutations on the structure and dynamics of the protein. In the current work, we are mainly interested in gaining insights into the molecular mechanism of PIGA mutations. In the initial step, we screened the most pathogenic mutations from the pool of publicly available mutations. Further, to get a better understanding, the pathogenic mutations were mapped to the modeled structure and subjected to a 50 ns molecular dynamics simulation. Our computational study suggests that four mutations are highly vulnerable to altering the structural conformation and stability of the PIGA protein, which illustrates their association with the PNH and MCAHS2 phenotypes. Keywords: homology modeling, molecular dynamics simulation, missense mutations, PNH, MCAHS2, PIGA
Procedia PDF Downloads 145
4337 Semantic Differences between Bug Labeling of Different Repositories via Machine Learning
Authors: Pooja Khanal, Huaming Zhang
Abstract:
Labeling of issues/bugs, also known as bug classification, plays a vital role in software engineering. Some known labels/classes of bugs are 'User Interface', 'Security', and 'API'. Most of the time, when a reporter reports a bug, they try to assign some predefined label to it. Those issues are reported for a project, and each project is a repository in GitHub/GitLab, which contains multiple issues. There are many software project repositories, ranging from individual projects to commercial projects. The labels assigned for different repositories may depend on various factors like human instinct, generalization of labels, the label assignment policy followed by the reporter, etc. While the reporter of the issue may instinctively give that issue a label, another person reporting the same issue may label it differently. This way, it is not known mathematically if a label in one repository is similar to or different from the label in another repository. Hence, the primary goal of this research is to find the semantic differences between bug labeling of different repositories via machine learning. Independent optimal classifiers for individual repositories are built first using the text features from the reported issues. The optimal classifiers may include a combination of multiple classifiers stacked together. Then, those classifiers are used to cross-test other repositories, which allows the result to be deduced mathematically. The product of this ongoing research includes a formalized open-source GitHub issues database that is used to deduce the similarity of the labels pertaining to the different repositories. Keywords: bug classification, bug labels, GitHub issues, semantic differences
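A minimal scikit-learn sketch of the cross-testing idea described above: a classifier is fitted on the labelled issues of one repository and evaluated on another repository to quantify how transferable the labels are. The toy issue texts and the single TF-IDF plus logistic-regression model are illustrative assumptions; the study itself uses stacked optimal classifiers per repository.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import f1_score
from sklearn.pipeline import make_pipeline

# Toy issue corpora standing in for two GitHub repositories (title + body text,
# with one label per issue such as 'Security', 'UI' or 'API').
repo_a_texts = ["fix XSS in login form", "button misaligned on mobile", "add REST endpoint"]
repo_a_labels = ["Security", "UI", "API"]
repo_b_texts = ["SQL injection in search", "dark mode colors wrong", "expose graphql api"]
repo_b_labels = ["Security", "UI", "API"]

# Optimal classifier for repository A (here a single TF-IDF + logistic regression model).
clf_a = make_pipeline(TfidfVectorizer(), LogisticRegression(max_iter=1000))
clf_a.fit(repo_a_texts, repo_a_labels)

# Cross-test on repository B: a low score suggests the two repositories use
# semantically different labelling conventions.
cross_f1 = f1_score(repo_b_labels, clf_a.predict(repo_b_texts), average="macro")
print(f"A -> B macro F1: {cross_f1:.2f}")
```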
Procedia PDF Downloads 201
4336 Hybrid Weighted Multiple Attribute Decision Making Handover Method for Heterogeneous Networks
Authors: Mohanad Alhabo, Li Zhang, Naveed Nawaz
Abstract:
Small cell deployment in 5G networks is a promising technology to enhance capacity and coverage. However, unplanned deployment may cause high interference levels and high number of unnecessary handovers, which in turn will result in an increase in the signalling overhead. To guarantee service continuity, minimize unnecessary handovers, and reduce signalling overhead in heterogeneous networks, it is essential to properly model the handover decision problem. In this paper, we model the handover decision according to Multiple Attribute Decision Making (MADM) method, specifically Technique for Order Preference by Similarity to an Ideal Solution (TOPSIS). In this paper, we propose a hybrid TOPSIS method to control the handover in heterogeneous network. The proposed method adopts a hybrid weighting, which is a combination of entropy and standard deviation. A hybrid weighting control parameter is introduced to balance the impact of the standard deviation and entropy weighting on the network selection process and the overall performance. Our proposed method shows better performance, in terms of the number of frequent handovers and the mean user throughput, compared to the existing methods.Keywords: handover, HetNets, interference, MADM, small cells, TOPSIS, weight
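The following NumPy sketch illustrates the hybrid-weighted TOPSIS selection described above: entropy and standard-deviation weights are blended through a control parameter and applied to a small decision matrix of candidate cells. The attribute set, the blending formula and all numbers are illustrative assumptions rather than values from the paper.

```python
import numpy as np

# Toy decision matrix: rows are candidate target cells, columns are attributes
# (e.g. SINR, available bandwidth, cell load, interference level).
X = np.array([[15., 20., 0.30, 2.0],
              [ 8., 50., 0.60, 1.0],
              [20., 10., 0.20, 3.5]])
benefit = np.array([True, True, False, False])   # larger-is-better flags per attribute
alpha = 0.5                                      # hybrid weighting control parameter

# Entropy weights (criteria with more dispersed values get more weight).
P = X / X.sum(axis=0)
E = -(P * np.log(P)).sum(axis=0) / np.log(X.shape[0])
w_entropy = (1 - E) / (1 - E).sum()

# Standard-deviation weights.
w_std = X.std(axis=0) / X.std(axis=0).sum()

# Hybrid weights controlled by alpha (assumed linear blend).
w = alpha * w_std + (1 - alpha) * w_entropy

# TOPSIS ranking.
R = X / np.sqrt((X ** 2).sum(axis=0))            # vector normalization
V = w * R                                        # weighted normalized matrix
ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
anti = np.where(benefit, V.min(axis=0), V.max(axis=0))
d_plus = np.linalg.norm(V - ideal, axis=1)
d_minus = np.linalg.norm(V - anti, axis=1)
closeness = d_minus / (d_plus + d_minus)
print("best candidate cell:", int(np.argmax(closeness)))
```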
Procedia PDF Downloads 149
4335 Joint Modeling of Longitudinal and Time-To-Event Data with Latent Variable
Authors: Xinyuan Y. Song, Kai Kang
Abstract:
Joint models for analyzing longitudinal and survival data are widely used to investigate the relationship between a failure time process and time-variant predictors. A common assumption in conventional joint models in the survival analysis literature is that all predictors are observable. However, this assumption may not always be supported because unobservable traits, namely, latent variables, which are indirectly observable and should be measured through multiple observed variables, are commonly encountered in medical, behavioral, and financial research settings. In this study, a joint modeling approach to deal with this feature is proposed. The proposed model comprises three parts. The first part is a dynamic factor analysis model for characterizing latent variables through multiple observed indicators over time. The second part is a random coefficient trajectory model for describing the individual trajectories of latent variables. The third part is a proportional hazards model for examining the effects of time-invariant predictors and the longitudinal trajectories of time-variant latent risk factors on hazards of interest. A Bayesian approach coupled with a Markov chain Monte Carlo algorithm is developed to perform statistical inference. An application of the proposed joint model to a study on the Alzheimer's Disease Neuroimaging Initiative is presented. Keywords: Bayesian analysis, joint model, longitudinal data, time-to-event data
Procedia PDF Downloads 144
4334 Effect of Climate Variability on Honeybee's Production in Ondo State, Nigeria
Authors: Justin Orimisan Ijigbade
Abstract:
The study was conducted to assess the effect of climate variability on honeybee production in Ondo State, Nigeria. A multistage sampling technique was employed to collect the data from 60 beekeepers across six Local Government Areas in Ondo State. Data collected were subjected to descriptive statistics and multiple regression model analyses. The results showed that 93.33% of the respondents were male, with 80% above 40 years of age. The majority of the respondents (96.67%) had formal education and 90% produced honey for commercial purposes. The result revealed that 90% of the respondents admitted that low temperature as a result of long hours/periods of rainfall affected the foraging efficiency of the worker bees, 73.33% claimed that a long period of low humidity resulted in a low level of nectar flow, while 70% submitted that high temperature resulted in an improper composition of workers, drones and queen in the hive colony. The result of the multiple regression showed that beekeepers' experience, educational level, access to climate information, temperature and rainfall were the main factors affecting honeybee production in the study area. Therefore, beekeepers should be given more education on climate variability and its adaptive strategies towards ensuring better honeybee production in the study area. Keywords: climate variability, honeybee production, humidity, rainfall and temperature
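A short sketch of the kind of multiple regression model used in the analysis, fitted here with statsmodels on synthetic placeholder data; the variable names follow the abstract, but all numbers are invented for illustration only.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 60  # 60 beekeepers, matching the survey design

# Synthetic stand-ins for the explanatory variables named in the abstract.
df = pd.DataFrame({
    "experience_years": rng.integers(1, 30, n),
    "education_level": rng.integers(0, 4, n),       # coded 0-3
    "climate_info_access": rng.integers(0, 2, n),   # 0 = no, 1 = yes
    "mean_temperature": rng.normal(28, 2, n),
    "annual_rainfall": rng.normal(1500, 200, n),
})
# Placeholder response: honey output (kg/year).
df["honey_output"] = 20 + 0.8 * df["experience_years"] + rng.normal(0, 5, n)

X = sm.add_constant(df.drop(columns="honey_output"))
model = sm.OLS(df["honey_output"], X).fit()
print(model.summary())   # coefficients indicate each factor's estimated effect
```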
Procedia PDF Downloads 272
4333 A Framework for Designing Complex Product-Service Systems with a Multi-Domain Matrix
Authors: Yoonjung An, Yongtae Park
Abstract:
Offering a Product-Service System (PSS) is a well-accepted strategy that companies may adopt to provide a set of systemic solutions to customers. PSSs were initially provided in a simple form but now take diversified and complex forms involving multiple services, products and technologies. With the growing interest in the PSS, frameworks for the PSS development have been introduced by many researchers. However, most of the existing frameworks fail to examine various relations existing in a complex PSS. Since designing a complex PSS involves full integration of multiple products and services, it is essential to identify not only product-service relations but also product-product/ service-service relations. It is also equally important to specify how they are related for better understanding of the system. Moreover, as customers tend to view their purchase from a more holistic perspective, a PSS should be developed based on the whole system’s requirements, rather than focusing only on the product requirements or service requirements. Thus, we propose a framework to develop a complex PSS that is coordinated fully with the requirements of both worlds. Specifically, our approach adopts a multi-domain matrix (MDM). A MDM identifies not only inter-domain relations but also intra-domain relations so that it helps to design a PSS that includes highly desired and closely related core functions/ features. Also, various dependency types and rating schemes proposed in our approach would help the integration process.Keywords: inter-domain relations, intra-domain relations, multi-domain matrix, product-service system design
Procedia PDF Downloads 641
4332 Automatic Identification and Monitoring of Wildlife via Computer Vision and IoT
Authors: Bilal Arshad, Johan Barthelemy, Elliott Pilton, Pascal Perez
Abstract:
Getting reliable, informative, and up-to-date information about the location, mobility, and behavioural patterns of animals will enhance our ability to research and preserve biodiversity. The fusion of infra-red sensors and camera traps offers an inexpensive way to collect wildlife data in the form of images. However, extracting useful data from these images, such as the identification and counting of animals remains a manual, time-consuming, and costly process. In this paper, we demonstrate that such information can be automatically retrieved by using state-of-the-art deep learning methods. Another major challenge that ecologists are facing is the recounting of one single animal multiple times due to that animal reappearing in other images taken by the same or other camera traps. Nonetheless, such information can be extremely useful for tracking wildlife and understanding its behaviour. To tackle the multiple count problem, we have designed a meshed network of camera traps, so they can share the captured images along with timestamps, cumulative counts, and dimensions of the animal. The proposed method takes leverage of edge computing to support real-time tracking and monitoring of wildlife. This method has been validated in the field and can be easily extended to other applications focusing on wildlife monitoring and management, where the traditional way of monitoring is expensive and time-consuming.Keywords: computer vision, ecology, internet of things, invasive species management, wildlife management
Procedia PDF Downloads 138
4331 Object-Scene: Deep Convolutional Representation for Scene Classification
Authors: Yanjun Chen, Chuanping Hu, Jie Shao, Lin Mei, Chongyang Zhang
Abstract:
Traditional image classification is based on encoding schemes (e.g. Fisher Vector, Vector of Locally Aggregated Descriptors) with low-level image features (e.g. SIFT, HoG). Compared to these low-level local features, deep convolutional features obtained at the mid-level layers of convolutional neural networks (CNN) have richer information but lack geometric invariance. For scene classification, there are scattered objects with different sizes, categories, layouts, numbers and so on. It is crucial to find the distinctive objects in a scene as well as their co-occurrence relationships. In this paper, we propose a method to take advantage of both deep convolutional features and the traditional encoding scheme while taking object-centric and scene-centric information into consideration. First, to exploit the object-centric and scene-centric information, two CNNs trained separately on the ImageNet and Places datasets are used as the pre-trained models to extract deep convolutional features at multiple scales. This produces dense local activations. By analyzing the performance of different CNNs at multiple scales, it is found that each CNN works better in different scale ranges. A scale-wise CNN adaptation is reasonable since objects in a scene appear at their own specific scales. Second, a Fisher kernel is applied to aggregate a global representation at each scale, and these representations are then merged into a single vector by using a post-processing method called scale-wise normalization. The essence of the Fisher Vector lies in the accumulation of the first- and second-order differences. Hence, the scale-wise normalization followed by average pooling balances the influence of each scale since different amounts of features are extracted. Third, the Fisher Vector representation based on the deep convolutional features is followed by a linear Support Vector Machine, which is a simple yet efficient way to classify the scene categories. Experimental results show that the scale-specific feature extraction and normalization with CNNs trained on object-centric and scene-centric datasets can boost the results from 74.03% up to 79.43% on MIT Indoor67 when only two scales are used (compared to results at a single scale). The result is comparable to state-of-the-art performance, which shows that the representation can be applied to other visual recognition tasks. Keywords: deep convolutional features, Fisher Vector, multiple scales, scale-specific normalization
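A compact NumPy/scikit-learn sketch of the Fisher Vector encoding applied to one scale of dense CNN activations, including the power and L2 (scale-wise) normalization mentioned above; the descriptor dimensionality, the GMM size and the random data are placeholder assumptions.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
train_desc = rng.normal(size=(2000, 64))   # stand-in for pooled CNN activations
feats = rng.normal(size=(500, 64))         # dense activations of one image at one scale

gmm = GaussianMixture(n_components=16, covariance_type="diag", random_state=0).fit(train_desc)

def fisher_vector(x, gmm):
    gamma = gmm.predict_proba(x)                              # (N, K) soft assignments
    w, mu, sigma = gmm.weights_, gmm.means_, np.sqrt(gmm.covariances_)
    n = x.shape[0]
    diff = (x[:, None, :] - mu[None]) / sigma[None]           # (N, K, D)
    g_mu = (gamma[..., None] * diff).sum(0) / (n * np.sqrt(w)[:, None])
    g_sig = (gamma[..., None] * (diff ** 2 - 1)).sum(0) / (n * np.sqrt(2 * w)[:, None])
    fv = np.concatenate([g_mu.ravel(), g_sig.ravel()])
    fv = np.sign(fv) * np.sqrt(np.abs(fv))                    # power normalization
    return fv / (np.linalg.norm(fv) + 1e-12)                  # scale-wise L2 normalization

fv = fisher_vector(feats, gmm)   # one scale; per-scale vectors are then merged
# The merged representation would finally be fed to a linear SVM (e.g. sklearn LinearSVC).
```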
Procedia PDF Downloads 331
4330 Targeting Calcium Dysregulation for Treatment of Dementia in Alzheimer's Disease
Authors: Huafeng Wei
Abstract:
Dementia in Alzheimer's disease (AD) is the leading form of dementia internationally, without effective treatments. Increasing evidence suggests that disruption of intracellular calcium homeostasis, primarily a pathological elevation of cytosolic and mitochondrial calcium but a reduction of endoplasmic reticulum (ER) calcium concentrations, plays critical upstream roles in multiple pathologies and the associated neurodegeneration, impaired neurogenesis, synapse and cognitive dysfunction in various AD preclinical studies. The last Food and Drug Administration (FDA)-approved drug for AD dementia treatment, memantine, exerts its therapeutic effects by ameliorating N-methyl-D-aspartate (NMDA) glutamate receptor overactivation and the subsequent calcium dysregulation. More research work is needed to develop other drugs targeting calcium dysregulation at multiple pharmacological acting sites for future effective AD dementia treatment. In particular, calcium channel blockers for the treatment of hypertension and dantrolene for the treatment of muscle spasm and malignant hyperthermia can be repurposed for this purpose. In our own research work, intranasal administration of dantrolene significantly increased its brain concentrations and durations, rendering it a more effective therapeutic drug with fewer side effects for chronic AD dementia treatment. This review summarizes the progress of various studies repurposing drugs targeting calcium dysregulation for future effective AD dementia treatment as potentially disease-modifying drugs. Keywords: Alzheimer, calcium, cognitive dysfunction, dementia, neurodegeneration, neurogenesis
Procedia PDF Downloads 182
4329 Deep Reinforcement Learning Approach for Optimal Control of Industrial Smart Grids
Authors: Niklas Panten, Eberhard Abele
Abstract:
This paper presents a novel approach for real-time and near-optimal control of industrial smart grids by deep reinforcement learning (DRL). To achieve highly energy-efficient factory systems, the energetic linkage of machines, technical building equipment and the building itself is desirable. However, the increased complexity of the interacting sub-systems, multiple time-variant target values and stochastic influences by the production environment, weather and energy markets make it difficult to efficiently control the energy production, storage and consumption in hybrid industrial smart grids. The studied deep reinforcement learning approach allows the solution space to be explored for proper control policies which minimize a cost function. The deep neural network of the DRL agent is based on a multilayer perceptron (MLP), Long Short-Term Memory (LSTM) and convolutional layers. The agent is trained within multiple Modelica-based factory simulation environments by the Advantage Actor Critic (A2C) algorithm. The DRL controller is evaluated by means of the simulation and then compared to a conventional, rule-based approach. Finally, the results indicate that the DRL approach is able to improve the control performance and significantly reduce the energy and operating costs of industrial smart grids. Keywords: industrial smart grids, energy efficiency, deep reinforcement learning, optimal control
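To make the training signal concrete, here is a minimal PyTorch sketch of the Advantage Actor-Critic (A2C) loss for a discrete-action dispatch agent; the simple MLP head (the paper additionally uses LSTM and convolutional layers), the loss weights and the dummy rollout tensors are assumptions for illustration.

```python
import torch
import torch.nn as nn

class ActorCritic(nn.Module):
    def __init__(self, obs_dim, n_actions, hidden=128):
        super().__init__()
        self.body = nn.Sequential(nn.Linear(obs_dim, hidden), nn.ReLU())
        self.pi = nn.Linear(hidden, n_actions)  # policy head (action logits)
        self.v = nn.Linear(hidden, 1)           # value head (state-value estimate)

    def forward(self, x):
        h = self.body(x)
        return self.pi(h), self.v(h).squeeze(-1)

def a2c_loss(model, states, actions, returns, v_coef=0.5, ent_coef=0.01):
    logits, values = model(states)
    dist = torch.distributions.Categorical(logits=logits)
    advantages = returns - values.detach()                 # advantage estimates
    policy_loss = -(dist.log_prob(actions) * advantages).mean()
    value_loss = (returns - values).pow(2).mean()          # critic regression
    entropy = dist.entropy().mean()                        # exploration bonus
    return policy_loss + v_coef * value_loss - ent_coef * entropy

# Dummy rollout: 32 observed grid states, chosen dispatch actions and discounted returns.
model = ActorCritic(obs_dim=10, n_actions=4)
loss = a2c_loss(model, torch.randn(32, 10), torch.randint(0, 4, (32,)), torch.randn(32))
loss.backward()
```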
Procedia PDF Downloads 195
4328 Resistance Spot Welding of Boron Steel 22MnB5 with Complex Welding Programs
Authors: Szymon Kowieski, Zygmunt Mikno
Abstract:
The study involved the optimization of process parameters during resistance spot welding of Al-coated martensitic boron steel 22MnB5, applied in hot stamping, performed using a programme with a multiple current impulse mode and a programme with variable pressure force. The aim of this research work was to determine the possibilities of a growth in welded joint strength and to identify the expansion of a welding lobe. The process parameters were adjusted on the basis of welding process simulation and confronted with experimental data. 22MnB5 steel is known for its tendency to obtain high hardness values in weld nuggets, often leading to interfacial failures (observed in the study-related tests). In addition, during resistance spot welding, many production-related factors can affect process stability, e.g. welding lobe narrowing, and lead to the deterioration of quality. Resistance spot welding performed using the above-named welding programme featuring 3 levels of force made it possible to achieve 82% of welding lobe extension. Joints made using the multiple current impulse program, where the total welding time was below 1.4s, revealed a change in a peeling mode (to full plug) and an increase in weld tensile shear strength of 10%.Keywords: 22MnB5, hot stamping, interfacial fracture, resistance spot welding, simulation, single lap joint, welding lobe
Procedia PDF Downloads 387
4327 A Sectional Control Method to Decrease the Accumulated Survey Error of Tunnel Installation Control Network
Authors: Yinggang Guo, Zongchun Li
Abstract:
In order to decrease the accumulated survey error of tunnel installation control network of particle accelerator, a sectional control method is proposed. Firstly, the accumulation rule of positional error with the length of the control network is obtained by simulation calculation according to the shape of the tunnel installation-control-network. Then, the RMS of horizontal positional precision of tunnel backbone control network is taken as the threshold. When the accumulated error is bigger than the threshold, the tunnel installation control network should be divided into subsections reasonably. On each segment, the middle survey station is taken as the datum for independent adjustment calculation. Finally, by taking the backbone control points as faint datums, the weighted partial parameters adjustment is performed with the adjustment results of each segment and the coordinates of backbone control points. The subsections are jointed and unified into the global coordinate system in the adjustment process. An installation control network of the linac with a length of 1.6 km is simulated. The RMS of positional deviation of the proposed method is 2.583 mm, and the RMS of the difference of positional deviation between adjacent points reaches 0.035 mm. Experimental results show that the proposed sectional control method can not only effectively decrease the accumulated survey error but also guarantee the relative positional precision of the installation control network. So it can be applied in the data processing of tunnel installation control networks, especially for large particle accelerators.Keywords: alignment, tunnel installation control network, accumulated survey error, sectional control method, datum
Procedia PDF Downloads 191
4326 Pinch Technology for Minimization of Water Consumption at a Refinery
Authors: W. Mughees, M. Alahmad
Abstract:
Water is the most significant entity that controls local and global development. For the Gulf region, especially Saudi Arabia, with its limited potable water resources, the potential of the fresh water problem is highly considerable. In this research, the study involves the design and analysis of pinch-based water/wastewater networks. Multiple water/wastewater networks were developed using pinch analysis involving direct recycle/material recycle method. Property-integration technique was adopted to carry out direct recycle method. Particularly, a petroleum refinery was considered as a case study. In direct recycle methodology, minimum water discharge and minimum fresh water resource targets were estimated. Re-design (or retrofitting) of water allocation in the networks was undertaken. Chemical Oxygen Demand (COD) and hardness properties were taken as pollutants. This research was based on single and double contaminant approach for COD and hardness and the amount of fresh water was reduced from 340.0 m3/h to 149.0 m3/h (43.8%), 208.0 m3/h (61.18%) respectively. While regarding double contaminant approach, reduction in fresh water demand was 132.0 m3/h (38.8%). The required analysis was also carried out using mathematical programming technique. Operating software such as LINGO was used for these studies which have verified the graphical method results in a valuable and accurate way. Among the multiple water networks, the one possible water allocation network was developed based on mass exchange.Keywords: minimization, water pinch, water management, pollution prevention
Procedia PDF Downloads 477
4325 Balance Control Mechanisms in Individuals With Multiple Sclerosis in Virtual Reality Environment
Authors: Badriah Alayidi, Emad Alyahya
Abstract:
Background: Most people with Multiple Sclerosis (MS) report worsening balance as the condition progresses. Poor balance control is also well known to be a significant risk factor for both falling and fear of falling. The increased risk of falls with disease progression thus makes balance control an essential target of gait rehabilitation amongst people with MS. Intervention programs have developed various methods to improve balance control, and accumulating evidence suggests that exercise programs may help people with MS improve their balance. Among these methods, virtual reality (VR) is growing in popularity as a balance-training technique owing to its potential benefits, including better compliance and greater user happiness. However, it is not clear if a VR environment will induce different balance control mechanisms in MS as compared to healthy individuals or traditional environments. Therefore, this study aims to examine how individuals with MS control their balance in a VR setting. Methodology: The proposed study takes an empirical approach to estimate and determine the role of balance response in persons with MS using a VR environment. It will use primary data collected through patient observations, physiological and biomechanical evaluation of balance, and data analysis. Results: The preliminary systematic review and meta-analysis indicated that there was variability in terms of the outcome assessing balance response in people with MS. The preliminary results of these assessments have the potential to provide essential indicators of the progression of MS and contribute to the individualization of treatment and evaluation of the interventions’ effectiveness. The literature describes patients who have had the opportunity to experiment in VR settings and then used what they have learned in the real world, suggesting that this VR setting could be more appealing than conditional settings. The findings of the proposed study will be beneficial in estimating and determining the effect of VR on balance control in persons with MS. In previous studies, VR was shown to be an interesting approach to neurological rehabilitation, but more data are needed to support this approach in MS. Conclusions: The proposed study enables an assessment of balance and evaluations of a variety of physiological implications related to neural activity as well as biomechanical implications related to movement analysis.Keywords: multiple sclerosis, virtual reality, postural control, balance
Procedia PDF Downloads 74
4324 Broadening the Roles of Masjid: Reviving Prophetic Holistic Model in Fostering Islamic Education and Arabic Language in South-Western Nigeria
Authors: Ahmad Tijani Surajudeen, Muhammad Zahiri Awang Mat, Aliy Abdulwahid Adebisi
Abstract:
With arrival of Islam in the South-Western Nigeria in the late fifteenth and early sixteenth centuries, various masājid established in different parts of the area played vital roles towards the betterment and unity of the Muslims. However, despite the fact that the masājid in the South-Western part of Nigeria contributed immensely to the spiritual and educational enhancement of the Muslims, it has not fully captured the holistic educational roles as a unique model used by the Prophet (S.A.W). Therefore, the primary objective of this paper is to investigate and broaden the roles of masjid towards its compartmentalized and holistic contributions among the Muslims in the south-western Nigeria. The findings from the paper have identified five holistic roles of masjid, namely, spiritual, intellectual, physical, social and emotional contributions which have been exemplified in the prophetic model of masjid. The paper has argued that the five factors must be unreservedly unified towards the betterment of the Muslims and enhancement of Islamic education and Arabic Language in the South-Western Nigeria. However, the challenges of masjid management in the South-Western Nigeria are the main hindrance in achieving the holistic roles of masjid. It is thereby suggested that, the management of masjid should take the identified prophetic model of masjid into account in order to positively improve the affairs of Muslims as well as promoting the teaching and learning of Islamic education and Arabic language among the Muslims in the South-Western Nigeria.Keywords: worship, Islamic education, Arabic language, prophetic holistic model
Procedia PDF Downloads 333
4323 Mobility Management for Pedestrian Accident Predictability and Mitigation Strategies Using Multiple
Authors: Oscar Norman Nekesa, Yoshitaka Kajita
Abstract:
Tom Mboya Street is a vital urban corridor within Nairobi city that experiences high volumes of pedestrian and vehicular traffic. Despite past intervention measures to lessen this problem, accident rates have remained high, highlighting significant safety concerns that need urgent attention. This study investigates the correlation between pedestrian accidents and significant independent variables and models their predictability using multiple linear regression, in order to develop effective mobility management strategies for accident mitigation. The methodology involves collecting and analyzing data on pedestrian accidents and various related independent variables. Data sources include the National Transport and Safety Authority (NTSA), the Kenya National Bureau of Statistics, and Nairobi City County records, covering five years. The study examines whether traffic volumes (pedestrian and vehicle), vehicular speed, human factors, illegal parking, policy issues, urban land use, the built environment, traffic signal conditions, inadequate lighting, and insufficient traffic control measures are significant predictors of the rate of pedestrian accidents. Explanatory variables related to road design and geometry are significant in the predictor models for the Tom Mboya Road link but less influential in the junction models along the 5 km stretch. The most impactful variable across all models was vehicular traffic flow. The study recommends infrastructural improvements, enhanced enforcement, and public awareness campaigns to reduce accidents and improve urban mobility. These insights can inform policy-making and urban planning to enhance pedestrian safety along the densely packed Tom Mboya Street and similar urban settings. The findings will inform evidence-based interventions to enhance pedestrian safety and improve urban mobility. Keywords: multiple linear regression, urban mobility, traffic management, Nairobi, Tom Mboya street, infrastructure conditions, pedestrian safety, correlation and prediction
Procedia PDF Downloads 25
4322 Determination of Safe Ore Extraction Methodology beneath Permanent Extraction in a Lead Zinc Mine with the Help of FLAC3D Numerical Model
Authors: Ayan Giri, Lukaranjan Phukan, Shantanu Karmakar
Abstract:
Structure and tectonics play a vital role in ore genesis and deposition. The existence of a swelling structure below the current level of a mine leads to the discovery of ores below some permanent developments of the mine. The discovery and the extraction of the ore body are very critical to sustain the business requirements of the mine. The challenge was to extract the ore without hampering the global stability of the mine. In order to do so, different mining options were considered and analysed by numerical modelling in the FLAC3D software. The constitutive model prepared for this simulation is the improved unified constitutive model (IUCM), which can better and more accurately predict the stress-strain relationships in a continuum model. The IUCM employs the Hoek-Brown criterion to determine the instantaneous Mohr-Coulomb parameters cohesion (c) and friction (ɸ) at each level of confining stress. The extra swelled part can be dimensioned as a north-south strike width of 50 m and an east-west strike width of 50 m. On the north side, a stope (P1) with a 25 m NS width has already been excavated. The different options considered were (a) open stoping extraction of the southern part (P0) of 50 m to the full extent, (b) extraction of the southern part of 25 m, then filling of both the primaries and extraction of the secondary (S0) of 25 m in between, and (c) complete extraction of the southern part (P0), preceded by backfill, with a modified design of the secondary (S0) for the overall stability of the permanent excavation above the stoping. Keywords: extraction, IUCM, FLAC 3D, stoping, tectonics
Procedia PDF Downloads 212
4321 Optimization Based Extreme Learning Machine for Watermarking of an Image in DWT Domain
Authors: Ram Pal Singh, Vikash Chaudhary, Monika Verma
Abstract:
In this paper, we propose the implementation of an optimization-based Extreme Learning Machine (ELM) for watermarking the B-channel of a color image in the discrete wavelet transform (DWT) domain. ELM, a regularization algorithm, works based on generalized single-hidden-layer feed-forward neural networks (SLFNs). However, the hidden layer parameters, generally called the feature mapping in the context of ELM, need not be tuned every time. This paper shows the embedding and extraction processes of the watermark with the help of ELM, and the results are compared with machine learning models already used for watermarking. Here, a cover image is divided into a suitable number of non-overlapping blocks of the required size, and DWT is applied to each block to transform it into the low-frequency sub-band domain. Basically, ELM gives a unified learning platform with a feature mapping, that is, a mapping between the hidden layer and the output layer of SLFNs, which is tried for watermark embedding and extraction purposes in a cover image. ELM has widespread applications, ranging from binary classification and multiclass classification to regression and function estimation. Unlike SVM-based algorithms, which achieve suboptimal solutions with high computational complexity, ELM can provide better generalization performance with very small complexity. The efficacy of the optimization-method-based ELM algorithm is measured by using quantitative and qualitative parameters on a watermarked image, even though the image is subjected to different types of geometrical and conventional attacks. Keywords: BER, DWT, extreme learning machine (ELM), PSNR
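A small NumPy sketch of the ELM core implied above: a random, untuned hidden layer followed by a regularized least-squares solve for the output weights, applied to block features from the low-frequency DWT sub-band of the blue channel (computed here with PyWavelets). The block size, the regularization constant and the random data are placeholder assumptions, not the optimized settings of the paper.

```python
import numpy as np
import pywt

rng = np.random.default_rng(0)

# Low-frequency (LL) sub-band of the B channel of a dummy 256x256 color image.
blue = rng.random((256, 256))
LL, (LH, HL, HH) = pywt.dwt2(blue, "haar")

# Block features: each 8x8 LL block flattened into one training sample.
blocks = LL.reshape(16, 8, 16, 8).swapaxes(1, 2).reshape(-1, 64)
targets = rng.integers(0, 2, (blocks.shape[0], 1)).astype(float)  # watermark bits

# ELM: random hidden layer (the feature mapping is not tuned), ridge solve for output weights.
n_hidden, C = 200, 1e3
W = rng.normal(size=(blocks.shape[1], n_hidden))
b = rng.normal(size=n_hidden)
H = 1.0 / (1.0 + np.exp(-(blocks @ W + b)))                       # sigmoid activations
beta = np.linalg.solve(H.T @ H + np.eye(n_hidden) / C, H.T @ targets)

pred_bits = (H @ beta > 0.5).astype(int)   # recovered bits during extraction
```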
Procedia PDF Downloads 311
4320 The Actoprotective Efficiency of Pyrimidine Derivatives
Authors: Nail Nazarov, Vladimir Zobov, Alexandra Vyshtakalyuk, Vyacheslav Semenov, Irina Galyametdinova, Vladimir Reznik
Abstract:
The effects of xymedon and six new pyrimidine derivatives, which are close and distant analogs of xymedon, on rats' working capacity have been studied in the 'swimming to failure' test. It has been shown that a single administration of the studied compounds did not have a statistically significant effect in the test. Under multiple intraperitoneal administration of the studied pyrimidine derivatives, the compound L-ascorbate, 1-(2-hydroxyethyl)-4,6-dimethyl-1,2-dihydropyrimidine-2-one, had the lowest toxicity and the most pronounced actoprotective effect. Administration at a dose of 20 mg/kg caused a statistically significant 440% increase in the swimming duration of rats on the 14th day of the experiment compared with the control group. Multiple administration of the compound under physical load did not affect leucopoiesis but stimulated erythropoiesis, resulting in an increase in the number of erythrocytes and in the hemoglobin level. Administration of the substance under mixed exhausting loads prevented such changes of blood biochemical parameters as a reduction of the glucose level and increases of the urea and lactic acid levels, which indicates an improvement in the animals' tolerability of loads and an anti-catabolic effect of the compound. The absence of hepato- and cardiotoxic effects of the substance has been shown. This work was performed with the financial support of the Russian Science Foundation (grant № 14-50-00014). Keywords: actoprotectors, physical working capacity, pyrimidine derivatives, xymedon
Procedia PDF Downloads 291
4319 Evaluating the Performance of 28 EU Member Countries on Health2020: A Data Envelopment Analysis Evaluation of the Successful Implementation of Policies
Authors: Elias K. Maragos, Petros E. Maravelakis, Apostolos I. Linardis
Abstract:
Health2020 is a promising framework of policies provided by the World Health Organization (WHO) and aimed at diminishing the health and well-being inequalities among the citizens of the European Union (EU) countries. The major demographic, social and environmental changes, in addition to the recent economic crisis, prevent the unobstructed and successful implementation of this framework. The unemployment rates and the percentage of people at risk of poverty have increased among the citizens of EU countries. At the same time, the adopted fiscal and economic policies do not help governments to serve their social role and mitigate social and health inequalities. In those circumstances, there is strong pressure to organize all health system resources efficiently and wisely. In order to provide a unified and value-based framework of valuation, we propose a valuation framework using data envelopment analysis (DEA) and dynamic DEA. We believe that the adopted methodology could provide a robust tool which can capture the degree of success with which policies have been implemented and is capable of determining which of the countries developed the requested policies efficiently and which of the countries have lagged behind. Using the proposed methodology, we evaluated the performance of 28 EU member countries in relation to the Health2020 peripheral targets. We adopted several versions of evaluation, measuring the effectiveness and the efficiency of EU countries from 2011 to 2016. Our results showed stability in technological changes and revealed a group of countries which were benchmarks in most of the years for the inefficient countries. Keywords: DEA, Health2020, health inequalities, Malmquist index, policies evaluation, well-being
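A minimal sketch of the input-oriented CCR efficiency score that underlies a DEA evaluation, solved per decision-making unit with SciPy's linear programming routine; the two-input/two-output layout and the numbers are invented for illustration and are not the Health2020 indicators. Repeating the calculation per year would feed a Malmquist-index analysis.

```python
import numpy as np
from scipy.optimize import linprog

# Toy data: 4 decision-making units (countries), 2 inputs, 2 outputs.
X = np.array([[4., 3.], [7., 3.], [8., 1.], [4., 2.]])   # inputs  (n_dmu, n_in)
Y = np.array([[1., 2.], [1., 1.], [1., 3.], [2., 1.]])   # outputs (n_dmu, n_out)

def ccr_input_efficiency(o, X, Y):
    n, m = X.shape
    s = Y.shape[1]
    # Variables: [theta, lambda_1 ... lambda_n]; minimize theta.
    c = np.r_[1.0, np.zeros(n)]
    # Input constraints: sum_j lambda_j * x_ij - theta * x_io <= 0
    A_in = np.c_[-X[o].reshape(m, 1), X.T]
    b_in = np.zeros(m)
    # Output constraints: -sum_j lambda_j * y_rj <= -y_ro
    A_out = np.c_[np.zeros((s, 1)), -Y.T]
    b_out = -Y[o]
    res = linprog(c, A_ub=np.vstack([A_in, A_out]), b_ub=np.r_[b_in, b_out],
                  bounds=[(0, None)] * (n + 1))
    return res.fun   # theta = 1 means the unit lies on the efficient frontier

scores = [ccr_input_efficiency(o, X, Y) for o in range(len(X))]
print(np.round(scores, 3))
```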
Procedia PDF Downloads 143
4318 Lennox-Gastaut Syndrome Associated with Dysgenesis of Corpus Callosum
Authors: A. Bruce Janati, Muhammad Umair Khan, Naif Alghassab, Ibrahim Alzeir, Assem Mahmoud, M. Sammour
Abstract:
Rationale: Lennox-Gastaut syndrome(LGS) is an electro-clinical syndrome composed of the triad of mental retardation, multiple seizure types, and the characteristic generalized slow spike-wave complexes in the EEG. In this article, we report on two patients with LGS whose brain MRI showed dysgenesis of corpus callosum(CC). We review the literature and stress the role of CC in the genesis of secondary bilateral synchrony(SBS). Method: This was a clinical study conducted at King Khalid Hospital. Results: The EEG was consistent with LGS in patient 1 and unilateral slow spike-wave complexes in patient 2. The MRI showed hypoplasia of the splenium of CC in patient 1, and global hypoplasia of CC combined with Joubert syndrome in patient 2. Conclusion: Based on the data, we proffer the following hypotheses: 1-Hypoplasia of CC interferes with functional integrity of this structure. 2-The genu of CC plays a pivotal role in the genesis of secondary bilateral synchrony. 3-Electrodecremental seizures in LGS emanate from pacemakers generated in the brain stem, in particular the mesencephalon projecting abnormal signals to the cortex via thalamic nuclei. 4-Unilateral slow spike-wave complexes in the context of mental retardation and multiple seizure types may represent a variant of LGS, justifying neuroimaging studies.Keywords: EEG, Lennox-Gastaut syndrome, corpus callosum , MRI
Procedia PDF Downloads 446
4317 A Transformer-Based Approach for Multi-Human 3D Pose Estimation Using Color and Depth Images
Authors: Qiang Wang, Hongyang Yu
Abstract:
Multi-human 3D pose estimation is a challenging task in computer vision, which aims to recover the 3D joint locations of multiple people from multi-view images. In contrast to traditional methods, which typically only use color (RGB) images as input, our approach utilizes both color and depth (D) information contained in RGB-D images. We also employ a transformer-based model as the backbone of our approach, which is able to capture long-range dependencies and has been shown to perform well on various sequence modeling tasks. Our method is trained and tested on the Carnegie Mellon University (CMU) Panoptic dataset, which contains a diverse set of indoor and outdoor scenes with multiple people in varying poses and clothing. We evaluate the performance of our model on the standard 3D pose estimation metrics of mean per-joint position error (MPJPE). Our results show that the transformer-based approach outperforms traditional methods and achieves competitive results on the CMU Panoptic dataset. We also perform an ablation study to understand the impact of different design choices on the overall performance of the model. In summary, our work demonstrates the effectiveness of using a transformer-based approach with RGB-D images for multi-human 3D pose estimation and has potential applications in real-world scenarios such as human-computer interaction, robotics, and augmented reality.Keywords: multi-human 3D pose estimation, RGB-D images, transformer, 3D joint locations
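A skeletal PyTorch sketch of the backbone idea: per-view RGB-D feature tokens are passed through a transformer encoder and regressed to 3D joint coordinates, with MPJPE as the evaluation metric. The token dimensionality, number of views, joint count and the use of pre-extracted features are assumptions made to keep the example short.

```python
import torch
import torch.nn as nn

class RGBDPoseTransformer(nn.Module):
    def __init__(self, feat_dim=256, n_joints=15, n_layers=4, n_heads=8):
        super().__init__()
        self.n_joints = n_joints
        layer = nn.TransformerEncoderLayer(d_model=feat_dim, nhead=n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=n_layers)
        self.head = nn.Linear(feat_dim, n_joints * 3)   # regress (x, y, z) per joint

    def forward(self, tokens):
        # tokens: (batch, n_tokens, feat_dim), e.g. one fused RGB-D feature per camera view.
        ctx = self.encoder(tokens)            # captures long-range dependencies across views
        pooled = ctx.mean(dim=1)              # simple pooling over the token sequence
        return self.head(pooled).view(-1, self.n_joints, 3)

model = RGBDPoseTransformer()
tokens = torch.randn(2, 5, 256)               # 2 scenes, 5 camera views each
joints_3d = model(tokens)                     # (2, 15, 3) estimated joint locations

# MPJPE (mean per-joint position error) against ground truth, as used for evaluation.
gt = torch.randn(2, 15, 3)
mpjpe = (joints_3d - gt).norm(dim=-1).mean()
```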
Procedia PDF Downloads 80
4316 Unified Power Quality Conditioner Presentation and Dimensioning
Authors: Abderrahmane Kechich, Othmane Abdelkhalek
Abstract:
Static converters behave as nonlinear loads that inject harmonic currents into the grid and increase the consumption of reactive power. On the other hand, the increased use of sensitive equipment requires the application of sinusoidal voltages. As a result, electrical power quality control has become a major concern in the field of power electronics. In this context, the unified power quality conditioner (UPQC) was developed. It combines both series and parallel structures; the series filter can protect sensitive loads and compensate for voltage disturbances such as voltage harmonics, voltage dips or flicker, while the shunt filter compensates for current disturbances such as current harmonics, reactive currents and imbalance. This double feature makes it one of the most appropriate devices. Calculating its parameters is an important step and, at the same time, is not easy. For that reason, several researchers have relied on a trial and error method for calculating the parameters, but this method is not easy for beginner researchers, especially regarding the controller parameters. This paper therefore gives a mathematical way to calculate almost all of the UPQC parameters without resorting to trial and error. This paper also gives a new approach for calculating the PI regulator parameters in order to obtain a stable UPQC able to compensate for disturbances acting on the waveform of the line voltage and the load current, so as to improve the electrical power quality. Keywords: UPQC, shunt active filter, series active filter, PI controller, PWM control, dual-loop control
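To illustrate the loop that the calculated PI parameters feed, here is a small discrete-time PI regulator sketch of the kind used in a UPQC current or DC-bus voltage loop; the gains, the sampling period and the anti-windup limit are placeholder values, not the ones derived in the paper.

```python
from dataclasses import dataclass

@dataclass
class DiscretePI:
    kp: float           # proportional gain
    ki: float           # integral gain
    ts: float           # sampling period [s]
    u_max: float        # saturation limit used for simple anti-windup
    integral: float = 0.0

    def update(self, reference: float, measurement: float) -> float:
        error = reference - measurement
        self.integral += self.ki * error * self.ts
        # Clamp the integral term so the output stays within converter limits.
        self.integral = max(-self.u_max, min(self.u_max, self.integral))
        u = self.kp * error + self.integral
        return max(-self.u_max, min(self.u_max, u))

# Example: regulating a compensation current sample-by-sample (placeholder gains).
pi = DiscretePI(kp=2.0, ki=150.0, ts=1e-4, u_max=400.0)
command = pi.update(reference=10.0, measurement=8.5)
```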
Procedia PDF Downloads 403
4315 Synthesis and Characterization of New Thermotropic Monomers – Containing Phosphorus
Authors: Diana Serbezeanu, Ionela-Daniela Carja, Tachita Vlad-Bubulac, Sergiu Sova
Abstract:
New phosphorus-containing monomers having methoxy end functional groups were prepared from methyl 4-hydroxybenzoate and two different dichlorides with phosphorus, namely phenyl phosphonic dichloride and phenyl dichlorophosphate. The structures of the monomers were confirmed by FTIR and NMR spectroscopy. The assignments for the 1H, 13C and 31P chemical shifts are based on 1D and 2D NMR homo- and heteronuclear correlations (H,H-COSY (Correlation Spectroscopy), H,C-HMQC (Heteronuclear Multiple Quantum Correlation and H,C-HMBC (Heteronuclear Multiple Bond Correlation)) and 31P-13C couplings. The monomers exhibited good solubility in common organic solvents. Dimethyl sulfoxide was to be a good solvent to grow crystals of considerable size which were investigated by X-ray analysis. One of these two new monomers presented thermotropic liquid crystalline behaviour, as revealed by differential scanning calorimetry (DSC), polarized light microscopy (PLM) and X-ray diffraction (XRD). The transition temperature from crystal to liquid crystalline state (K→LC) was 143°C and from the LC to isotropic state (LC→I) was 167°C. Upon heating, bis(4-(methoxycarbonyl)phenyl formed fine textures, difficult to be ascribed to smectic or nematic phases. Upon cooling from the isotropic state, bis(4-(methoxycarbonyl)phenyl exhibited a mosaic-type texture. X-ray diffraction measurements at small angles (SAXS) of bis(4-(methoxycarbonyl)phenyl showed two peaks at 1.8 Å and 3.5 Å, respectively suggesting organization at supramolecular level.Keywords: phosphorus-containing monomers, polarized light microscopy, structure investigation, thermotropic liquid crystalline properties
Procedia PDF Downloads 299
4314 The Relationship between First-Day Body Temperature and Mortality in Traumatic Patients
Authors: Neda Valizadeh, Mani Mofidi, Sama Haghighi, Ali Hashemaghaee, Soudabeh Shafiee Ardestani
Abstract:
Background: There are many systems and parameters to evaluate trauma patients in the emergency department. Most of these evaluations aim to distinguish patients in worse condition so that the care systems can better predict their condition and provide better care-giving. The purpose of this study is to determine the relationship between axillary body temperature and mortality in patients with multiple traumas hospitalized in the intensive care unit (ICU), together with other clinical and para-clinical factors. Methods: All patients between 16 and 75 years old with multiple traumas who were admitted into the Emergency Department and then hospitalized in the ICU were included in our study. The axillary temperature on the first and second days of admission, the Glasgow Coma Scale (GCS), systolic blood pressure, serum glucose levels, and white blood cell counts of all patients on the admission day were recorded, and their relationship with mortality was analyzed by SPSS software with suitable statistical tests. Results: Axillary body temperatures on the first and second days were statistically lower in expired traumatic patients (p=0.001 and p<0.001, respectively). Patients with a lower GCS had a significantly lower first-day temperature and a significantly higher mortality (p=0.006 and p=0.006, respectively). Furthermore, the first-day axillary temperature was significantly lower in patients with a lower first-day systolic blood pressure (p=0.014). Conclusion: Our results showed that a lower axillary body temperature on the first day is associated with higher mortality, lower GCS, and lower systolic blood pressure. Thus, it could be used as a predictor of mortality in the evaluation of traumatic patients in emergency settings. Keywords: fever, trauma, mortality, emergency
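A brief SciPy sketch of the kind of test behind the reported p-values: comparing first-day axillary temperatures between expired and surviving patients; the synthetic values are placeholders and the choice of Welch's t-test (rather than a non-parametric alternative) is an assumption.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Placeholder first-day axillary temperatures (degrees Celsius).
temp_survived = rng.normal(36.8, 0.5, 80)
temp_expired = rng.normal(36.2, 0.6, 25)

t, p = stats.ttest_ind(temp_survived, temp_expired, equal_var=False)  # Welch's t-test
print(f"t = {t:.2f}, p = {p:.4f}")

# A logistic regression of mortality on first-day temperature (plus GCS, blood
# pressure, etc.) would be the natural next step for a multivariable predictor.
```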
Procedia PDF Downloads 376
4313 An Efficient Subcarrier Scheduling Algorithm for Downlink OFDMA-Based Wireless Broadband Networks
Authors: Hassen Hamouda, Mohamed Ouwais Kabaou, Med Salim Bouhlel
Abstract:
The growth of wireless technology has made opportunistic scheduling a widespread theme in recent research. Providing high system throughput without reducing fairness of allocation is becoming a very challenging task. A suitable policy for resource allocation among users is of crucial importance. This study focuses on scheduling multiple streaming flows on the downlink of a WiMAX system based on orthogonal frequency division multiple access (OFDMA). In this paper, we take the first step in formulating and analyzing this problem scrupulously. As a result, we propose a new scheduling scheme based on the Round Robin (RR) algorithm. Because of its non-opportunistic process, RR does not take radio conditions into account and consequently affects both system throughput and multi-user diversity. Our contribution, called MORRA (Modified Round Robin Opportunistic Algorithm), proposes a solution to this issue. MORRA not only exploits the concept of an opportunistic scheduler but also takes other parameters into account in the allocation process. The first parameter is called the courtesy coefficient (CC) and the second is called buffer occupancy (BO). Performance evaluation shows that this well-balanced scheme outperforms both the RR and MaxSNR schedulers and demonstrates that choosing between system throughput and fairness is not required. Keywords: OFDMA, opportunistic scheduling, fairness hierarchy, courtesy coefficient, buffer occupancy
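A toy Python sketch of the allocation idea described above: each subcarrier goes to the user with the best combined score of instantaneous channel quality, a courtesy coefficient and buffer occupancy. The scoring formula and all weights are assumptions, since the exact MORRA definitions of CC and BO are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)
n_users, n_subcarriers = 4, 12

snr = rng.uniform(0, 30, (n_users, n_subcarriers))   # instantaneous channel quality [dB]
buffer_occupancy = rng.uniform(0, 1, n_users)        # fraction of each user's buffer filled
allocated_so_far = np.zeros(n_users)                 # used to build a courtesy coefficient

allocation = {}
for sc in range(n_subcarriers):
    # Courtesy coefficient: users that already got many subcarriers step back (assumed form).
    courtesy = 1.0 / (1.0 + allocated_so_far)
    score = snr[:, sc] / 30.0 + 0.5 * courtesy + 0.5 * buffer_occupancy
    user = int(np.argmax(score))
    allocation[sc] = user
    allocated_so_far[user] += 1

print(allocation)   # subcarrier -> user map balancing throughput and fairness
```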
Procedia PDF Downloads 300
4312 Micro-Channel Flows Simulation Based on Nonlinear Coupled Constitutive Model
Authors: Qijiao He
Abstract:
MicroElectrical-Mechanical System (MEMS) is one of the most rapidly developing frontier research field both in theory study and applied technology. Micro-channel is a very important link component of MEMS. With the research and development of MEMS, the size of the micro-devices and the micro-channels becomes further smaller. Compared with the macroscale flow, the flow characteristics of gas in the micro-channel have changed, and the rarefaction effect appears obviously. However, for the rarefied gas and microscale flow, Navier-Stokes-Fourier (NSF) equations are no longer appropriate due to the breakup of the continuum hypothesis. A Nonlinear Coupled Constitutive Model (NCCM) has been derived from the Boltzmann equation to describe the characteristics of both continuum and rarefied gas flows. We apply the present scheme to simulate continuum and rarefied gas flows in a micro-channel structure. And for comparison, we apply other widely used methods which based on particle simulation or direct solution of distribution function, such as Direct simulation of Monte Carlo (DSMC), Unified Gas-Kinetic Scheme (UGKS) and Lattice Boltzmann Method (LBM), to simulate the flows. The results show that the present solution is in better agreement with the experimental data and the DSMC, UGKS and LBM results than the NSF results in rarefied cases but is in good agreement with the NSF results in continuum cases. And some characteristics of both continuum and rarefied gas flows are observed and analyzed.Keywords: continuum and rarefied gas flows, discontinuous Galerkin method, generalized hydrodynamic equations, numerical simulation
Procedia PDF Downloads 172
4311 Using Multiple Strategies to Improve the Nursing Staff Edwards Lifesciences Hemodynamic Monitoring Correctness of Operation
Authors: Hsin-Yi Lo, Huang-Ju Jiun, Yu-Chiao Chu
Abstract:
Hemodynamic monitoring is important in the intensive care unit. With advances in medical technology in recent years and the increasing diversification of intensive care equipment, many kinds of instruments are available for monitoring hemodynamics, and Edwards Lifesciences Hemodynamic Monitoring (FloTrac) is one of them. In recent medical safety incidents, parameters changed but nurses did not notify the doctor in time; therefore, we aimed to analyze the current problems and find effective improvement strategies. In August 2021, a survey found that the correctness of FloTrac operation was only 74.0%. The reasons included a lack of education, an operation manual that is difficult to read, the lack of an audit mechanism, nurses not knowing which numerical changes require notifying the doctor, omissions due to busy workloads, unfamiliarity with the operation, and omissions caused by the large number of nursing records. Improvement methods included planning professional nurse education, formulating FloTrac quick-reference tips, enacting an audit mechanism, establishing FloTrac action learning, making a 'follow the sun' care map, holding simulated training, and establishing automatic upload of monitoring data into nursing records. After improvement, the correctness of FloTrac operation increased to 98.8%. The results are good and have been implemented in the ICUs of the hospital. Keywords: hemodynamic monitoring, edwards lifesciences hemodynamic monitoring, multiple strategies, intensive care
Procedia PDF Downloads 81
4310 Study on the Stability of Large Space Expandable Parabolic Cylindrical Antenna
Authors: Chuanzhi Chen, Wenjing Yu
Abstract:
Parabolic cylindrical deployable antenna has the characteristics of wide cutting width, strong directivity, high gain, and easy automatic beam scanning. While, due to its large size, high flexibility, and strong coupling, the deployment process of parabolic cylindrical deployable antenna presents such problems as unsynchronized deployment speed, large local deformation and discontinuous switching of deployment state. A large deployable parabolic cylindrical antenna is taken as the research object, and the problem of unfolding process instability of cylindrical antenna is studied in the paper, which is caused by multiple factors such as multiple closed loops, elastic deformation, motion friction, and gap collision. Firstly, the multi-flexible system dynamics model of large-scale parabolic cylindrical antenna is established to study the influence of friction and elastic deformation on the stability of large multi-closed loop antenna. Secondly, the evaluation method of antenna expansion stability is studied, and the quantitative index of antenna configuration design is proposed to provide a theoretical basis for improving the overall performance of the antenna. Finally, through simulation analysis and experiment, the development dynamics and stability of large-scale parabolic cylindrical antennas are verified by in-depth analysis, and the principles for improving the stability of antenna deployment are summarized.Keywords: multibody dynamics, expandable parabolic cylindrical antenna, stability, flexible deformation
Procedia PDF Downloads 146