Search results for: gradual change detection
6540 A Case of Bilateral Vulval Abscess with Pelvic Fistula in an Immunocompromised Patient with Colostomy: A Diagnostic Challenge
Authors: Paul Feyi Waboso
Abstract:
This case report presents a 57-year-old female patient with a history of colon cancer, colostomy, and immunocompromise, who presented with an unusual bilateral vulval abscess, more prominent on the left side. Due to the atypical presentation, an MRI was performed, revealing a pelvic collection and a fistulous connection between the pelvis and vulva. This finding prompted an urgent surgical intervention. This case highlights the diagnostic and therapeutic challenges of managing complex abscesses and fistulas in immunocompromised patients. Introduction: Vulval abscesses in immunocompromised individuals can present with atypical features and may be associated with complex pathologies. Patients with a history of cancer, colostomy, and immunocompromise are particularly prone to infections and may present with unusual manifestations. This report discusses a case of a large bilateral vulval abscess with an underlying pelvic fistula, emphasizing the importance of advanced imaging in cases with atypical presentations. Case Presentation: A 57-year-old female with a known history of colon cancer, treated with colostomy, presented with severe pain and swelling in the vulval area. Physical examination revealed bilateral vulval swelling, with the abscess on the left side appearing larger and more pronounced than on the right. Given her immunocompromised status and the unusual nature of the presentation, we requested an MRI of the pelvis, suspecting an underlying pathology beyond a typical abscess. Investigations: MRI revealed a significant pelvic collection and identified a fistulous tract between the pelvis and the vulva. This confirmed that the vulval abscess was connected to a deeper pelvic infection, necessitating urgent intervention. Management: After consultation with the multidisciplinary team (MDT), it was agreed that the patient, having already had 48 hours of antibiotics, required surgical intervention.
The patient underwent evacuation of the left-sided vulval abscess under spinal anesthesia. During surgery, the pelvic collection was drained of 200 ml of pus. Outcome and Follow-Up: Postoperative recovery was closely monitored due to the patient’s immunocompromised state. Follow-up imaging and clinical evaluation showed improvement in symptoms, with gradual resolution of infection. The patient was scheduled for regular follow-up visits to monitor for recurrence or further complications. Discussion: Bilateral vulval abscesses are uncommon and, in an immunocompromised patient, warrant thorough investigation to rule out deeper infectious or fistulous connections. This case underscores the utility of MRI in identifying complex fistulous tracts and highlights the importance of a multidisciplinary approach in managing such high-risk patients. Conclusion: This case illustrates a rare presentation of bilateral vulval abscess with an associated pelvic fistula. Keywords: vulval abscess, MDT team, colon cancer with pelvic fistula, vulval skin condition
Procedia PDF Downloads 18
6539 Enabling Non-invasive Diagnosis of Thyroid Nodules with High Specificity and Sensitivity
Authors: Sai Maniveer Adapa, Sai Guptha Perla, Adithya Reddy P.
Abstract:
Thyroid nodules can often be diagnosed with ultrasound imaging, although differentiating between benign and malignant nodules can be challenging for medical professionals. This work suggests a novel approach to increase the precision of thyroid nodule identification by combining machine learning and deep learning. The new approach first extracts information from the ultrasound images using a deep learning method known as a convolutional autoencoder. A support vector machine, a type of machine learning model, is then trained on these features. With an accuracy of 92.52%, the support vector machine can differentiate between benign and malignant nodules. This innovative technique may decrease the need for unnecessary biopsies and increase the accuracy of thyroid nodule detection. Keywords: thyroid tumor diagnosis, ultrasound images, deep learning, machine learning, convolutional auto-encoder, support vector machine
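The two-stage pipeline described above (deep features, then a shallow classifier) can be sketched as follows. This is an illustrative reconstruction, not the authors' code: the feature matrix is synthetic random data standing in for convolutional-autoencoder bottleneck features, and the RBF kernel and all hyperparameters are assumptions, since the abstract does not specify them.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic stand-ins for autoencoder bottleneck features: in the paper's
# pipeline these would come from the encoder half of a trained
# convolutional autoencoder applied to ultrasound images.
benign = rng.normal(loc=-1.0, scale=1.0, size=(100, 32))
malignant = rng.normal(loc=+1.0, scale=1.0, size=(100, 32))
X = np.vstack([benign, malignant])
y = np.array([0] * 100 + [1] * 100)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

# RBF kernel is a common default choice; the abstract does not name one.
clf = SVC(kernel="rbf", C=1.0)
clf.fit(X_tr, y_tr)
accuracy = clf.score(X_te, y_te)
```

The design choice of training a separate SVM on learned features (rather than an end-to-end network) keeps the classifier stage cheap to retrain and easy to inspect.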
Procedia PDF Downloads 58
6538 Enhancing Laser Imaging by Using the Ultrasound Effect
Authors: Hayder Raad Hafuze, Munqith Saleem Dawood, Jamal Abdul Jabbar
Abstract:
The effect of combining ultrasound with laser in medical imaging of biological tissue has been studied in this paper. Different wavelengths of incident laser light (405 nm, 532 nm, 650 nm, 808 nm and 1064 nm) were used with different ultrasound frequencies (1 MHz and 3.3 MHz). The results showed that changing the acoustic intensity enhances the laser penetration of the tissue for different thicknesses. The existence of the ideal Raman-Nath diffraction pattern was investigated in terms of phase delay and incident angle. Keywords: tissue, laser, ultrasound, effect, imaging
Procedia PDF Downloads 433
6537 A Generalized Framework for Adaptive Machine Learning Deployments in Algorithmic Trading
Authors: Robert Caulk
Abstract:
A generalized framework for adaptive machine learning deployments in algorithmic trading is introduced, tested, and released as open-source code. The presented software aims to test the hypothesis that recent data contains enough information to form a probabilistically favorable short-term price prediction. Further, the framework contains various adaptive machine learning techniques that are geared toward generating profit during strong trends and minimizing losses during trend changes. Results demonstrate that this adaptive machine learning approach is capable of capturing trends and generating profit. The presentation also discusses the importance of defining the parameter space associated with the dynamic training dataset and using the parameter space to identify and remove outliers from prediction data points. Meanwhile, the generalized architecture enables common users to exploit the powerful machinery while focusing on high-level feature engineering and model testing. The presentation also highlights common strengths and weaknesses associated with the presented technique and presents a broad range of well-tested starting points for feature set construction, target setting, and statistical methods for enforcing risk management and maintaining probabilistically favorable entry and exit points. The presentation also describes the end-to-end data processing tools associated with FreqAI, including automatic data fetching, data aggregation, feature engineering, safe and robust data pre-processing, outlier detection, custom machine learning and statistical tools, data post-processing, adaptive-training backtest emulation, and deployment of adaptive training in live environments. Finally, the generalized user interface is also discussed in the presentation. Feature engineering is simplified so that users can seed their feature sets with common indicator libraries (e.g. TA-Lib, pandas-ta).
The user also feeds data expansion parameters to fill out a large feature set for the model, which can contain as many as 10,000+ features. The presentation describes the various object-oriented programming techniques employed to make FreqAI agnostic to third-party libraries and external data sources. In other words, the back-end is constructed in such a way that users can leverage a broad range of common regression libraries (CatBoost, LightGBM, scikit-learn, etc.) as well as common neural network libraries (TensorFlow, PyTorch) without worrying about the logistical complexities associated with data handling and API interactions. The presentation finishes by drawing conclusions about the most important parameters associated with a live deployment of the adaptive learning framework and provides the road map for future development in FreqAI. Keywords: machine learning, market trend detection, open-source, adaptive learning, parameter space exploration
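The idea of using the training-data parameter space to reject prediction points can be illustrated with a minimal NumPy sketch. This is not FreqAI's code (FreqAI ships its own, more sophisticated filters, including SVM- and DBSCAN-based outlier removal); the 3-sigma z-score rule, feature counts, and data below are invented for illustration.

```python
import numpy as np

def outlier_mask(train_features, pred_features, k=3.0):
    """Flag prediction points lying outside the training parameter space.

    A point is kept only if every feature lies within k standard deviations
    of that feature's training mean (a simple per-feature z-score rule).
    """
    mu = train_features.mean(axis=0)
    sigma = train_features.std(axis=0) + 1e-12  # avoid division by zero
    z = np.abs((pred_features - mu) / sigma)
    return (z <= k).all(axis=1)  # True = inlier, safe to predict on

rng = np.random.default_rng(1)
train = rng.normal(size=(500, 8))          # 500 training candles, 8 features
preds = np.vstack([rng.normal(size=(9, 8)),
                   np.full((1, 8), 10.0)]) # last row is a gross outlier
mask = outlier_mask(train, preds)
```

Predictions where `mask` is False fall outside the space the model was trained on, so the framework would discard them rather than trade on an extrapolation.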
Procedia PDF Downloads 88
6536 Detection of Efficient Enterprises via Data Envelopment Analysis
Authors: S. Turkan
Abstract:
In this paper, Turkey’s Top 500 Industrial Enterprises data for 2014 were analyzed by data envelopment analysis. Data envelopment analysis is used to detect efficient decision-making units, such as universities, hospitals and schools, by using inputs and outputs. The decision-making units in this study are enterprises. To detect efficient enterprises, some financial ratios are determined as inputs and outputs; for this reason, financial indicators related to the productivity of enterprises are considered. The efficient foreign weighted owned capital enterprises are detected via the super efficiency model. According to the results, Mercedes-Benz is the most efficient foreign weighted owned capital enterprise in Turkey. Keywords: data envelopment analysis, super efficiency, logistic regression, financial ratios
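The mechanics behind scoring decision-making units from inputs and outputs can be sketched as a small linear program. This is a hedged illustration of the standard input-oriented CCR model, not the study's super-efficiency model or its Top-500 data; the three toy enterprises with one input and one output are invented for the example.

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, k):
    """Input-oriented CCR efficiency of DMU k (multiplier form).

    Maximise u.y_k subject to v.x_k = 1 and u.y_j - v.x_j <= 0 for all j.
    linprog minimises, so the objective is negated.
    """
    n, m = X.shape          # n DMUs, m inputs
    _, s = Y.shape          # s outputs
    # Decision variables: [v (m input weights), u (s output weights)]
    c = np.concatenate([np.zeros(m), -Y[k]])
    A_ub = np.hstack([-X, Y])                     # u.y_j - v.x_j <= 0
    b_ub = np.zeros(n)
    A_eq = np.concatenate([X[k], np.zeros(s)])[None, :]  # v.x_k = 1
    b_eq = [1.0]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                  bounds=[(0, None)] * (m + s))
    return -res.fun

# Toy data: 3 enterprises, one input (e.g. capital) and one output (e.g. sales)
X = np.array([[1.0], [2.0], [4.0]])
Y = np.array([[2.0], [2.0], [2.0]])
scores = [ccr_efficiency(X, Y, k) for k in range(3)]
```

Super-efficiency ranking, as used in the paper, modifies this program by excluding DMU k from its own constraint set, which allows efficient units to score above 1 and thus be ranked against each other.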
Procedia PDF Downloads 324
6535 Land Tenure and Erosion as Determinants of Guerrilla Violence in Assam, India: An Ethnographic and Remote Sensing Approach
Authors: Kevin T. Inks
Abstract:
India’s Brahmaputra River Valley has, since independence, experienced consistent low-intensity guerrilla warfare between ethnic and religious groups. These groups are often organized around perceived ethnic territoriality, and target civilians, communities, and especially migrants belonging to other ethnic and religious groups. Intense flooding and erosion have led to widespread displacement, and disaster relief funds are largely tied to legal land tenure. Displaced residents of informal settlements receive little or no resettlement aid, and their subsequent migration strategies and risk from guerrilla violence are poorly understood. Semi-structured interviews and comprehensive surveys focused on perceptions of risk, efficacy of disaster relief, and migration and adaptation strategies were conducted with households identified as being ‘at-risk’ of catastrophic flooding and erosion in Majuli District, Assam. Interviews with policymakers and government workers were conducted to assess disaster relief efforts in informal settlements, and remote sensing methods were used to identify informal settlement and hydrogeomorphic change. The results show that various ethnic and religious groups have differential strategies and preferences for resettlement. However, these varying strategies are likely to lead to differential levels of risk from guerrilla violence. Members of certain ethnic groups residing in informal settlements, in the absence of resettlement assistance, are more likely to seek out unofficial settlement on land far from the protection of the state and experience greater risk of becoming victims of political violence. As climate change and deforestation are likely to increase the severity of the displacement crisis in the Brahmaputra River Valley, more comprehensive disaster relief and surveying efforts are vital for limiting migration and informal settlement in potential sites of guerrilla warfare. Keywords: climate, displacement, flooding, India, violence
Procedia PDF Downloads 105
6534 An Overview of Bioinformatics Methods to Detect Novel Riboswitches Highlighting the Importance of Structure Consideration
Authors: Danny Barash
Abstract:
Riboswitches are RNA genetic control elements that were originally discovered in bacteria and provide a unique mechanism of gene regulation. They work without the participation of proteins and are believed to represent ancient regulatory systems on the evolutionary timescale. One of the biggest challenges in riboswitch research is that many are found in prokaryotes, but only a small percentage of known riboswitches have been found in certain eukaryotic organisms. The few examples of eukaryotic riboswitches were identified using sequence-based bioinformatics search methods that include some slight structural considerations. These pattern-matching methods were the first to be applied for the purpose of riboswitch detection, and they can be programmed very efficiently using a data structure called affix arrays, making them suitable for genome-wide searches of riboswitch patterns. However, they are limited in their ability to detect harder-to-find riboswitches that deviate from the known patterns. Several methods have been developed since then to tackle this problem. The tool most commonly used by practitioners is Infernal, which relies on Hidden Markov Models (HMMs) and Covariance Models (CMs). Profile Hidden Markov Models were also implemented in the pHMM Riboswitch Scanner web application, independently of Infernal. Other computational approaches that have been developed include RMDetect, which uses 3D structural modules, and RNAbor, which utilizes the Boltzmann probability of structural neighbors. We have tried to incorporate more sophisticated secondary structure considerations based on RNA folding prediction using several strategies. The first idea was to utilize window-based methods in conjunction with folding predictions by energy minimization. The moving window approach is heavily geared towards secondary structure consideration relative to sequence, which is treated as a constraint.
However, the method cannot be used genome-wide due to its high cost: each folding prediction by energy minimization in the moving window is computationally expensive, so only the vicinity of genes of interest can be scanned. The second idea was to remedy the inefficiency of the previous approach by constructing a pipeline that consists of inverse RNA folding considering RNA secondary structure, followed by a BLAST search that is sequence-based and highly efficient. This approach, which relies on inverse RNA folding in general and our own in-house fragment-based inverse RNA folding program called RNAfbinv in particular, shows capability to find attractive candidates that are missed by Infernal and other standard methods being used for riboswitch detection. We demonstrate attractive candidates found by both the moving-window approach and the inverse RNA folding approach performed together with BLAST. We conclude that structure-based methods like the two strategies outlined above hold considerable promise in detecting riboswitches and other conserved RNAs of functional importance in a variety of organisms. Keywords: riboswitches, RNA folding prediction, RNA structure, structure-based methods
Procedia PDF Downloads 234
6533 The Prediction of the Evolutionary Process of Coloured Vision in Mammals: A System Biology Approach
Authors: Shivani Sharma, Prashant Saxena, Inamul Hasan Madar
Abstract:
Since the time of Darwin, it has been considered that genetic change is the direct indicator of variation in phenotype. However, a few system biology studies in recent years have proposed that epigenetic developmental processes also affect the phenotype, thus shifting the focus from a linear genotype-phenotype map to a non-linear G-P map. In this paper, we attempt to explain the evolution of colour vision in mammals by considering the LWS (long-wave sensitive) gene. Keywords: evolution, phenotypes, epigenetics, LWS gene, G-P map
Procedia PDF Downloads 521
6532 Empowering Indigenous Epistemologies in Geothermal Development
Authors: Te Kīpa Kēpa B. Morgan, Oliver W. Mcmillan, Dylan N. Taute, Tumanako N. Fa'aui
Abstract:
Epistemologies are ways of knowing. Indigenous Peoples are aware that they do not perceive and experience the world in the same way as others. So it is important when empowering Indigenous epistemologies, such as that of the New Zealand Māori, to also be able to represent a scientific understanding within the same analysis. A geothermal development assessment tool has been developed by adapting the Mauri Model Decision Making Framework. Mauri is a metric that is capable of representing the change in the life-supporting capacity of things and collections of things. The Mauri Model is a method of grouping mauri indicators as dimension averages in order to allow holistic assessment and also to conduct sensitivity analyses for the effect of worldview bias. R-shiny is the coding platform used for this Vision Mātauranga research which has created an expert decision support tool (DST) that combines a stakeholder assessment of worldview bias with an impact assessment of mauri-based indicators to determine the sustainability of proposed geothermal development. The initial intention was to develop guidelines for quantifying mātauranga Māori impacts related to geothermal resources. To do this, three typical scenarios were considered: a resource owner wishing to assess the potential for new geothermal development; another party wishing to assess the environmental and cultural impacts of the proposed development; an assessment that focuses on the holistic sustainability of the resource, including its surface features. Indicator sets and measurement thresholds were developed that are considered necessary considerations for each assessment context and these have been grouped to represent four mauri dimensions that mirror the four well-being criteria used for resource management in Aotearoa, New Zealand. Two case studies have been conducted to test the DST suitability for quantifying mātauranga Māori and other biophysical factors related to a geothermal system. 
This involved estimating mauri0meter values for physical features such as temperature, flow rate, frequency and colour, and developing indicators to also quantify qualitative observations about the geothermal system made by Māori. A retrospective analysis was then conducted to verify different understandings of the geothermal system. The case studies found that the expert DST is useful for geothermal development assessment, especially where hapū (indigenous sub-tribal groupings) are conflicted regarding the benefits and disadvantages of their own and others’ geothermal developments. These results have been supplemented with evaluations of the cumulative impacts of geothermal developments experienced by different parties, using integration techniques applied to the time history curve of the expert DST worldview-bias weighting plotted against the mauri0meter score. Cumulative impacts represent the change in resilience or potential of geothermal systems, which directly assists with the holistic interpretation of change from an Indigenous Peoples’ perspective. Keywords: decision support tool, holistic geothermal assessment, indigenous knowledge, mauri model decision-making framework
Procedia PDF Downloads 187
6531 A Life Cycle Assessment of Greenhouse Gas Emissions from Traditional and Climate-Smart Farming: A Case of Dhanusha District, Nepal
Authors: Arun Dhakal, Geoff Cockfield
Abstract:
This paper examines the emission potential of different farming practices that farmers have adopted in Dhanusha District of Nepal and the scope of these practices in climate change mitigation. Which practice is climate-smarter is the question that this paper aims to address through a life cycle assessment (LCA) of greenhouse gas (GHG) emissions. The LCA was performed to assess whether there is a difference in the emission potential of broadly two farming systems (agroforestry-based and traditional agriculture), but specifically four farming systems. The required data were collected through a household survey of 200 randomly selected households. The sources of emissions across the farming systems were paddy cultivation, livestock, chemical fertilizer, fossil fuels and biomass (fuel-wood and crop residue) burning. However, the amount of emission from these sources varied with the farming system adopted. Emissions from biomass burning appeared to be the highest, while fossil fuels caused the lowest emissions in all systems. The emissions decreased gradually from agriculture towards the highly integrated agroforestry-based farming system (HIS), indicating that integrating trees into a farming system not only sequesters more carbon but also helps in reducing emissions from the system. The annual emissions for the HIS, the medium-integrated agroforestry-based farming system (MIS), the less-integrated agroforestry-based farming system (LIS) and the subsistence agricultural system (SAS) were 6.67 t ha-1, 8.62 t ha-1, 10.75 t ha-1 and 17.85 t ha-1, respectively. In one agroforestry cycle, the HIS, MIS and LIS released 64%, 52% and 40% less GHG emissions than the SAS. Within the agroforestry-based farming systems, the HIS produced 25% and 50% less emissions than the MIS and LIS, respectively. Our finding suggests that a tree-based farming system is climate-smarter than traditional farming.
If the other two benefits (carbon sequestered within the farm and in the natural forest because of agroforestry) are also considered, a considerable amount of emissions is avoided by climate-smart farming. Some policy intervention is required to motivate farmers towards adopting such climate-friendly farming practices in developing countries. Keywords: life cycle assessment, greenhouse gas, climate change, farming systems, Nepal
Procedia PDF Downloads 619
6530 Use of Misoprostol in Pregnancy Termination in the Third Trimester: Oral versus Vaginal Route
Authors: Saimir Cenameri, Arjana Tereziu, Kastriot Dallaku
Abstract:
Introduction: Intra-uterine death is a common problem in obstetrical practice and can lead to complications if left to resolve spontaneously. The cervix is unprepared, making induction of labor difficult. Misoprostol, a synthetic prostaglandin E1 analogue, is inexpensive and valuable thanks to its ability to bring about changes in the cervix that lead to the induction of uterine contractions. Misoprostol is quickly absorbed when taken orally, resulting in high initial peak serum concentrations compared with the vaginal route. The vaginal misoprostol peak serum concentration is not as high and demonstrates a more gradual serum concentration decline. This is associated with many benefits for the patient: fast induction of labor, smaller doses, and fewer (dose-dependent) side effects. The most commonly used regimen has been 50 μg every 4 hours, with a high percentage of success and limited side effects. Objective: To evaluate the efficiency of oral and vaginal misoprostol in inducing labor, and to compare it with use not following a previously defined protocol. Methods: Participants in this study included patients at U.H.O.G. 'Koco Gliozheni', Tirana, from April 2004 to July 2006, presenting with an indication for induction of labor in the third trimester for pregnancy termination. A total of 37 patients were admitted for labor induction: 26 were randomly assigned to the protocol, oral or vaginal (10 vs. 16), and a control group (11) not subject to the protocol was created. Oral or vaginal misoprostol was administered at a dose of 50 μg/4 h, while the control group was treated individually by members of the medical staff. The main outcome of interest was the time from induction of labor to birth. The Kruskal-Wallis test was used to compare the average age, parity, woman's weight, gestational age, Bishop's score, size of the uterus and weight of the fetus between the four groups in the study.
The Fisher exact test was used to compare day-stay and causes across the four groups. The Mann-Whitney test was used to compare the time to expulsion and the number of doses between the oral and vaginal groups. For all statistical tests used, a value of P ≤ 0.05 was considered statistically significant. Results: The four groups were comparable with regard to woman's age and weight, parity, abortion indication, Bishop's score, fetal weight and gestational age. There was a significant difference in the percentage of deliveries within 24 hours. The average time from induction to birth per route (vaginal, oral, according to protocol and not according to the protocol) was, respectively, 10.43 h, 21.10 h, 15.77 h and 21.57 h. There was no difference in maternal complications between groups. Conclusions: Use of vaginal misoprostol for inducing labor in the third trimester for termination of pregnancy appears to be more effective than the oral route, and even more so than use not following previously approved protocols, where complications are greater and unjustified. Keywords: inducing labor, misoprostol, pregnancy termination, third trimester
Procedia PDF Downloads 185
6529 Analysis of Direct Current Motor in LabVIEW
Authors: E. Ramprasath, P. Manojkumar, P. Veena
Abstract:
DC motors, long known as the workhorse of industrial systems, were widely used in past centuries until the invention of the AC induction motor brought a huge revolution to industry. Since then, the use of DC machines has decreased due to factors such as reliability, robustness and complexity, as well as their losses. A new methodology is proposed to construct a DC motor through simulation in LabVIEW to get an idea of its real-time performance and whether a change in a parameter might bring a bigger improvement in losses and reliability. Keywords: analysis, characteristics, direct current motor, LabVIEW software, simulation
Procedia PDF Downloads 552
6528 Automatic Classification for the Degree of Disc Narrowing from X-Ray Images Using CNN
Authors: Kwangmin Joo
Abstract:
An automatic lumbar vertebra detection and classification method is proposed for evaluating the degree of disc narrowing. Prior to classification, deep-learning-based segmentation is applied to detect each individual lumbar vertebra: M-net is applied to segment the five lumbar vertebrae, and fine-tuned segmentation is employed to improve the accuracy of segmentation. Using the features extracted in the previous step, a clustering technique, k-means clustering, is applied to estimate the degree of disc space narrowing under a four-grade scoring system. As a preliminary study, the techniques proposed in this research could help build an automatic scoring system to diagnose the severity of disc narrowing from X-ray images. Keywords: disc space narrowing, degenerative disc disorders, deep learning based segmentation, clustering technique
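The grading step can be illustrated with a tiny k-means on a scalar feature. This is a hedged sketch only: the real system clusters features extracted from M-net segmentations, whereas here the "features" are synthetic disc-height ratios, and the four grade centres are invented for illustration.

```python
import numpy as np

def kmeans_1d(values, k=4, iters=50, seed=0):
    """Tiny Lloyd's k-means for scalar features (e.g. relative disc height)."""
    rng = np.random.default_rng(seed)
    centers = rng.choice(values, size=k, replace=False)
    for _ in range(iters):
        # Assign each sample to its nearest centre, then recompute centres
        labels = np.argmin(np.abs(values[:, None] - centers[None, :]), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = values[labels == j].mean()
    # Order cluster ids so grade 0 = widest disc space, grade 3 = narrowest
    order = np.argsort(-centers)
    grade = np.empty(k, dtype=int)
    grade[order] = np.arange(k)
    return grade[labels], np.sort(centers)[::-1]

# Synthetic disc-height ratios scattered around four invented grade centres
rng = np.random.default_rng(1)
heights = np.concatenate([rng.normal(m, 0.01, 30) for m in (0.40, 0.30, 0.20, 0.10)])
grades, centres = kmeans_1d(heights, k=4)
```

Mapping cluster ids onto an ordered grade scale (widest to narrowest) is what turns an unsupervised clustering into a usable four-grade score.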
Procedia PDF Downloads 125
6527 Machine Learning for Aiding Meningitis Diagnosis in Pediatric Patients
Authors: Karina Zaccari, Ernesto Cordeiro Marujo
Abstract:
This paper presents a Machine Learning (ML) approach to support meningitis diagnosis in patients at a children’s hospital in Sao Paulo, Brazil. The aim is to use ML techniques to reduce, as much as possible, the use of invasive procedures such as cerebrospinal fluid (CSF) collection. In this study, we focus on predicting the probability of meningitis given the results of blood and urine laboratory tests, together with the analysis of pain or other complaints from the patient. We tested a number of different ML algorithms, including: Adaptive Boosting (AdaBoost), Decision Tree, Gradient Boosting, K-Nearest Neighbors (KNN), Logistic Regression, Random Forest and Support Vector Machines (SVM). The Decision Tree algorithm performed best, with 94.56% and 96.18% accuracy on training and testing data, respectively. These results represent a significant aid to doctors in diagnosing meningitis as early as possible and in preventing expensive and painful procedures on some children. Keywords: machine learning, medical diagnosis, meningitis detection, pediatric research
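The model-comparison workflow described above can be sketched briefly. The study's patient data are not public, so the features below are synthetic stand-ins for lab results, and only two of the listed algorithms are shown; hyperparameters are assumptions, not the paper's settings.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
# Synthetic stand-ins for blood/urine lab features (the study's real
# features came from hospital records and are not reproduced here).
n = 400
X = rng.normal(size=(n, 6))
y = (X[:, 0] + 0.5 * X[:, 1] ** 2 > 1.0).astype(int)  # invented nonlinear rule

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

models = {
    "decision_tree": DecisionTreeClassifier(max_depth=5, random_state=0),
    "logistic_regression": LogisticRegression(max_iter=1000),
}
# Fit each candidate and score it on held-out data, as in the paper's comparison
scores = {name: m.fit(X_tr, y_tr).score(X_te, y_te) for name, m in models.items()}
```

Reporting both training and test accuracy, as the abstract does, is what guards against mistaking an overfit tree for a genuinely strong model.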
Procedia PDF Downloads 150
6526 Analysis of the Torque Required for Mixing LDPE with Natural Fibre and DCP
Authors: A. E. Delgado, W. Aperador
Abstract:
This study evaluated the effect of natural fibre concentration, as well as the effect of adding a crosslinking agent, on the torque when those components are mixed with low density polyethylene (LDPE). The natural fibre has a particle size of 0.8-1.2 mm and a moisture content of 0.17%. An internal mixer was used to measure the torque required to mix the polymer with the fibre, and the effect of the fibre content and crosslinking agent on the torque was determined. A change in the morphology of the mixes was observed using scanning electron microscopy (SEM). Keywords: WPC, DCP, LDPE, natural fibre, torque
Procedia PDF Downloads 419
6525 Mercury Detection in Two Fishes from the Persian Gulf
Authors: Zahra Khoshnood, Mehdi Kazaie, Sajedeh Neisi
Abstract:
In 2013, 24 fish samples were taken from two fishery regions in the north of the Persian Gulf near the Iranian coastline. The two species named were Yellowfin seabream (Acanthopagrus latus) and Longtail tuna (Thunnus tonggol). We analyzed the total Hg concentration of liver and muscle tissues with a mercury analyzer (model LECO AMA 254). The average concentration of total Hg in the edible muscle tissue of deep flounder was 18.92 µg.g-1 in Bandar-Abbas and 10.19 µg.g-1 in Bandar-Lengeh. The corresponding values for Oriental sole were 8.47 and 0.08 µg.g-1. The average concentration of Hg in the liver tissue of deep flounder was 25.49 µg.g-1 in Bandar-Abbas and 12.52 µg.g-1 in Bandar-Lengeh; the values for Oriental sole were 11.88 and 3.2 µg.g-1 in Bandar-Abbas and Bandar-Lengeh, respectively. Keywords: mercury, Acanthopagrus latus, Thunnus tonggol, Persian Gulf
Procedia PDF Downloads 603
6524 Chemometric QSRR Evaluation of Behavior of s-Triazine Pesticides in Liquid Chromatography
Authors: Lidija R. Jevrić, Sanja O. Podunavac-Kuzmanović, Strahinja Z. Kovačević
Abstract:
This study considers the selection of the most suitable in silico molecular descriptors for the characterization of s-triazine pesticides. Suitable topological, geometrical and physicochemical descriptors are used to establish quantitative structure-retention relationship (QSRR) models. The models were obtained using linear regression (LR) and multiple linear regression (MLR) analysis; in this paper, MLR models were established while avoiding multicollinearity among the selected molecular descriptors. The statistical quality of the established models was evaluated by standard and cross-validation statistical parameters. For detection of similarity or dissimilarity among the investigated s-triazine pesticides and their classification, principal component analysis (PCA) and hierarchical cluster analysis (HCA) were used and gave similar groupings. This study is financially supported by COST Action TD1305. Keywords: chemometrics, classification analysis, molecular descriptors, pesticides, regression analysis
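A minimal NumPy illustration of the MLR-with-multicollinearity-check workflow. The descriptor matrix and retention-like response below are synthetic stand-ins (the study's actual descriptors and retention data are not reproduced), and the VIF > 10 rule of thumb is a common chemometrics convention, not something the abstract states.

```python
import numpy as np

def fit_mlr(X, y):
    """Ordinary least squares with intercept: returns [b0, b1, ..., bm]."""
    A = np.column_stack([np.ones(len(X)), X])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef

def vif(X):
    """Variance inflation factors; VIF > 10 usually signals multicollinearity."""
    out = []
    for j in range(X.shape[1]):
        # Regress descriptor j on the remaining descriptors
        others = np.delete(X, j, axis=1)
        coef = fit_mlr(others, X[:, j])
        pred = np.column_stack([np.ones(len(X)), others]) @ coef
        ss_res = ((X[:, j] - pred) ** 2).sum()
        ss_tot = ((X[:, j] - X[:, j].mean()) ** 2).sum()
        r2 = 1.0 - ss_res / ss_tot
        out.append(1.0 / (1.0 - r2))
    return np.array(out)

rng = np.random.default_rng(2)
# Hypothetical descriptors for 20 s-triazines (synthetic numbers standing in
# for, e.g., a lipophilicity index, a surface-area term, a topological index)
X = rng.normal(size=(20, 3))
y = 0.8 * X[:, 0] - 0.3 * X[:, 1] + rng.normal(scale=0.1, size=20)
coef = fit_mlr(X, y)
vifs = vif(X)
```

Descriptors with high VIF would be dropped before refitting, which is one simple way to "avoid multicollinearity among the selected molecular descriptors" as the abstract requires.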
Procedia PDF Downloads 392
6523 Protocol for Dynamic Load Distributed Low Latency Web-Based Augmented Reality and Virtual Reality
Authors: Rohit T. P., Sahil Athrij, Sasi Gopalan
Abstract:
Currently, the content entertainment industry is dominated by mobile devices. As the trends slowly shift towards Augmented/Virtual Reality applications, the computational demands on these devices are increasing exponentially, and we are already reaching the limits of hardware optimizations. This paper proposes a software solution to this problem. By leveraging the capabilities of cloud computing, we can offload the work from mobile devices to dedicated rendering servers that are far more powerful, but this introduces the problem of latency. This paper introduces a protocol that can achieve a high-performance, low-latency Augmented/Virtual Reality experience. There are two parts to the protocol. 1) In-flight compression: The main cause of latency in the system is the time required to transmit the camera frame from client to server. The round-trip time is directly proportional to the amount of data transmitted, so it can be reduced by compressing the frames before sending. Using standard compression algorithms like JPEG results in only minor size reduction. Since the images to be compressed are consecutive camera frames, there won't be many changes between two consecutive images, so inter-frame compression is preferred. Inter-frame compression can be implemented efficiently using WebGL, but WebGL implementations limit the precision of floating point numbers to 16 bits on most devices. This can introduce noise to the image due to rounding errors, which will add up eventually. This can be solved using an improved inter-frame compression algorithm. The algorithm detects changes between frames and reuses unchanged pixels from the previous frame. This eliminates the need for floating point subtraction, thereby cutting down on noise. The change detection is also improved drastically by taking the weighted average difference of pixels instead of the absolute difference.
The kernel weights for this comparison can be fine-tuned to match the type of image to be compressed. 2) Dynamic load distribution: Conventional cloud computing architectures offload as much work as possible to the servers, but this approach increases bandwidth usage and server costs. The most optimal solution is obtained when the device utilizes 100% of its resources and the rest of the work is done by the server. The protocol balances the load between server and client by performing a fraction of the computation on the device, depending on the power of the device and network conditions. The protocol is responsible for dynamically partitioning the tasks. Special flags communicate the workload fraction between client and server and are updated at a constant interval of time (or frames). The whole protocol is designed to be client agnostic. Flags are available to the client for resetting the frame, indicating latency, switching mode, etc. The server can react to client-side changes on the fly and adapt accordingly by switching to different pipelines. The server is designed to spread the load effectively and thereby scale horizontally, which is achieved by isolating client connections into different processes. Keywords: 2D kernelling, augmented reality, cloud computing, dynamic load distribution, immersive experience, mobile computing, motion tracking, protocols, real-time systems, web-based augmented reality application
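The change-detection step described above can be sketched as follows. This is a minimal NumPy illustration of a kernel-weighted inter-frame difference that reuses unchanged pixels; the function names, the 3x3 box kernel, and the threshold value are illustrative assumptions of this sketch, not details of the protocol itself:

```python
import numpy as np

def change_mask(prev, curr, kernel, threshold):
    """Flag pixels whose kernel-weighted neighbourhood difference exceeds threshold."""
    diff = np.abs(curr.astype(np.int16) - prev.astype(np.int16)).astype(np.float64)
    kh, kw = kernel.shape
    ph, pw = kh // 2, kw // 2
    padded = np.pad(diff, ((ph, ph), (pw, pw)), mode="edge")
    h, w = diff.shape
    score = np.zeros_like(diff)
    for i in range(kh):
        for j in range(kw):
            # accumulate the weighted difference over each pixel's neighbourhood
            score += kernel[i, j] * padded[i:i + h, j:j + w]
    return score > threshold

def reuse_unchanged(prev, curr, kernel, threshold):
    """Keep previous-frame pixels wherever no change was detected."""
    mask = change_mask(prev, curr, kernel, threshold)
    return np.where(mask, curr, prev)

# illustrative usage: a flat frame with one changed region
prev = np.zeros((8, 8), dtype=np.uint8)
curr = prev.copy()
curr[2:5, 2:5] = 100
kernel = np.ones((3, 3)) / 9.0          # a simple averaging kernel; tunable
mask = change_mask(prev, curr, kernel, threshold=10)
out = reuse_unchanged(prev, curr, kernel, threshold=10)
```

Averaging over a neighbourhood, rather than thresholding the per-pixel absolute difference, is what makes the detection robust to isolated noisy pixels.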
Procedia PDF Downloads 72
6522 Evolution of Floating Photovoltaic System Technology and Future Prospect
Authors: Young-Kwan Choi, Han-Sang Jeong
Abstract:
A floating photovoltaic system is a technology that combines photovoltaic power generation with a floating structure. However, because floating structures have not previously been utilized in photovoltaic generation, there are no standardized criteria; the technology is developed and used separately by different installation bodies. This paper aims to discuss the evolution of floating photovoltaic system technology based on examples of floating photovoltaic systems installed in Korea. Keywords: floating photovoltaic system, floating PV installation, ocean floating photovoltaic system, tracking type floating photovoltaic system
Procedia PDF Downloads 560
6521 Assessing Carbon Stock and Sequestration of Reforestation Species on Old Mining Sites in Morocco Using the DNDC Model
Authors: Nabil Elkhatri, Mohamed Louay Metougui, Ngonidzashe Chirinda
Abstract:
Mining activities have left a legacy of degraded landscapes, prompting urgent efforts for ecological restoration. Reforestation holds promise as a potent tool to rehabilitate these old mining sites, with the potential to sequester carbon and contribute to climate change mitigation. This study focuses on evaluating the carbon stock and sequestration potential of reforestation species in the context of Morocco's mining areas, employing the DeNitrification-DeComposition (DNDC) model. The research is grounded in recognizing the need to connect theoretical models with practical implementation, ensuring that reforestation efforts are informed by accurate and context-specific data. Field data collection encompasses growth patterns, biomass accumulation, and carbon sequestration rates, establishing an empirical foundation for the study's analyses. By integrating the collected data with the DNDC model, the study aims to provide a comprehensive understanding of carbon dynamics within reforested ecosystems on old mining sites. The major findings reveal varying sequestration rates among different reforestation species, indicating the potential for species-specific optimization of reforestation strategies to enhance carbon capture. This research's significance lies in its potential to contribute to sustainable land management practices and climate change mitigation strategies. By quantifying the carbon stock and sequestration potential of reforestation species, the study serves as a valuable resource for policymakers, land managers, and practitioners involved in ecological restoration and carbon management. Ultimately, the study aligns with global objectives to rejuvenate degraded landscapes while addressing pressing climate challenges.Keywords: carbon stock, carbon sequestration, DNDC model, ecological restoration, mining sites, Morocco, reforestation, sustainable land management.
Procedia PDF Downloads 76
6520 Computer Aided Discrimination of Benign and Malignant Thyroid Nodules by Ultrasound Imaging
Authors: Akbar Gharbali, Ali Abbasian Ardekani, Afshin Mohammadi
Abstract:
Introduction: Thyroid nodules have an incidence of 33-68% in the general population, and more than 5-15% of these nodules are malignant. Early detection and treatment of thyroid nodules increase the cure rate and allow optimal treatment. Among medical imaging methods, ultrasound is the technique of choice for assessment of thyroid nodules. Confirming the diagnosis usually demands repeated fine-needle aspiration biopsy (FNAB), so current management carries morbidity and non-zero mortality. Objective: To explore the diagnostic potential of automatic texture analysis (TA) methods in differentiating benign and malignant thyroid nodules on ultrasound imaging, in order to support reliable diagnosis and monitoring of thyroid nodules in their early stages without the need for biopsy. Material and Methods: The thyroid ultrasound image database consisted of 70 patients (26 benign and 44 malignant) reported by a radiologist and proven by biopsy. Two slices per patient were loaded into MaZda software version 4.6 for automatic texture analysis. Regions of interest (ROIs) were defined within the abnormal part of the thyroid nodule ultrasound images. Gray levels within each ROI were normalized according to three schemes: N1, default or original gray levels; N2, dynamic intensity limited to µ +/- 3σ; and N3, intensity limited to the 1%-99% range. Up to 270 multiscale texture feature parameters per ROI were computed for each normalization scheme using the well-known statistical methods implemented in MaZda. From a statistical point of view, not all calculated texture feature parameters are useful for texture analysis, so the features were reduced to the 10 best and most effective per normalization scheme, based on the maximum Fisher coefficient and the minimum probability of classification error combined with average correlation coefficients (POE+ACC).
These features were analyzed in two standardization states (standard (S) and non-standard (NS)) with Principal Component Analysis (PCA), Linear Discriminant Analysis (LDA), and Non-Linear Discriminant Analysis (NDA). A 1-NN classifier was used to distinguish between benign and malignant tumors. Confusion matrix and receiver operating characteristic (ROC) curve analyses were used to formulate more reliable criteria for the performance of the employed texture analysis methods. Results: The results demonstrated the influence of the normalization schemes and reduction methods on the effectiveness of the obtained features as descriptors of discrimination power and on the classification results. The subset of features selected under 1%-99% normalization, POE+ACC reduction, and NDA texture analysis yielded high discrimination performance, with an area under the ROC curve (Az) of 0.9722 in distinguishing benign from malignant thyroid nodules, corresponding to a sensitivity of 94.45%, specificity of 100%, and accuracy of 97.14%. Conclusions: Our results indicate that computer-aided diagnosis is a reliable method and can provide useful information to help radiologists in the detection and classification of benign and malignant thyroid nodules. Keywords: ultrasound imaging, thyroid nodules, computer aided diagnosis, texture analysis, PCA, LDA, NDA
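The three gray-level normalization schemes described above (N1 original range, N2 limited to µ +/- 3σ, N3 limited to the 1%-99% percentile range) can be sketched as follows. This is a minimal NumPy illustration; the function name and the rescaling to [0, 1] are assumptions of this sketch, not the exact behaviour of the software used in the study:

```python
import numpy as np

def normalize_roi(roi, scheme="N1"):
    """Clip and rescale ROI gray levels to [0, 1] under one of three schemes."""
    roi = roi.astype(np.float64)
    if scheme == "N1":            # default: original gray-level range
        lo, hi = roi.min(), roi.max()
    elif scheme == "N2":          # dynamic intensity limited to mu +/- 3 sigma
        mu, sigma = roi.mean(), roi.std()
        lo, hi = mu - 3.0 * sigma, mu + 3.0 * sigma
    elif scheme == "N3":          # intensity limited to the 1%-99% percentile range
        lo, hi = np.percentile(roi, [1.0, 99.0])
    else:
        raise ValueError(f"unknown scheme: {scheme}")
    clipped = np.clip(roi, lo, hi)
    return (clipped - lo) / (hi - lo)

# illustrative usage on a synthetic gradient ROI
roi = np.arange(256, dtype=np.uint8).reshape(16, 16)
n1 = normalize_roi(roi, "N1")
n3 = normalize_roi(roi, "N3")
```

The point of N2 and N3 is that texture features become less sensitive to a few extreme gray levels: an outlier pixel is clipped away instead of stretching the whole intensity scale.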
Procedia PDF Downloads 279
6519 Novel Synthesis of Metal Oxide Nanoparticles from Type IV Deep Eutectic Solvents
Authors: Lorenzo Gontrani, Marilena Carbone, Domenica Tommasa Donia, Elvira Maria Bauer, Pietro Tagliatesta
Abstract:
One of the fields where DESs show remarkable added value is the synthesis of inorganic materials, in particular nanoparticles. In this field, the inherent and highly tunable nano-homogeneities of the DES structure give rise to a marked templating effect, a precious role that has led to the recent bloom of a vast number of studies exploiting these new synthesis media to prepare nanomaterials and composite structures of various kinds. In this contribution, the most recent developments in the field are reviewed, and some exciting examples of novel metal oxide nanoparticle syntheses using non-toxic type-IV deep eutectic solvents are described. The prepared materials possess nanometric dimensions and show flower-like shapes. The use of the prepared nanoparticles as fluorescent materials for the detection of various contaminants is under development. Keywords: metal deep eutectic solvents, nanoparticles, inorganic synthesis, type IV DES, lamellar
Procedia PDF Downloads 135
6518 Location Uncertainty – A Probabilistic Solution for Automatic Train Control
Authors: Monish Sengupta, Benjamin Heydecker, Daniel Woodland
Abstract:
New train control systems rely mainly on Automatic Train Protection (ATP) and Automatic Train Operation (ATO) to dynamically control speed and hence performance. The ATP and the ATO form the vital elements within the CBTC (Communication Based Train Control) and ERTMS (European Rail Traffic Management System) system architectures. Reliable and accurate measurement of train location, speed, and acceleration is vital to the operation of train control systems. In the past, all CBTC and ERTMS systems have deployed balises or equivalent devices to correct the uncertainty in train location. Typically, a CBTC train is allowed to miss only one balise on the track, after which the Automatic Train Protection (ATP) system applies the emergency brake to halt the service, because the location uncertainty, which grows within the train control system, cannot tolerate missing more than one balise. Balises contribute a significant amount towards wayside maintenance costs, and studies have shown that balises on the track also constrain future track layout changes and changes in speed profile. This paper investigates the causes of the location uncertainty that is currently experienced and considers whether it is possible to identify an effective filter to ascertain, in conjunction with appropriate sensors, more accurate speed, distance, and location for a CBTC-driven train without the need for any external balises. An appropriate sensor fusion algorithm and an intelligent sensor selection methodology will be deployed to ascertain the railway location and speed measurement at the highest precision. Similar techniques are already in use in aviation, satellite, submarine, and other navigation systems. Developing a model for speed control and the use of a Kalman filter is a key element of this research.
This paper summarizes the research undertaken and its significant findings, highlighting the potential for introducing alternative approaches to train positioning that would enable the removal of all trackside location correction balises, leading to a substantial reduction in maintenance and more flexibility in future track design. Keywords: ERTMS, CBTC, ATP, ATO
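A minimal sketch of the kind of Kalman filter mentioned above, assuming a one-dimensional constant-velocity model in which only position fixes are measured. The matrices and noise parameters here are generic textbook choices for illustration, not the ones developed in this research:

```python
import numpy as np

def track(measurements, dt=1.0, q=0.01, r=4.0):
    """Constant-velocity Kalman filter estimating [position, speed] from position fixes."""
    F = np.array([[1.0, dt], [0.0, 1.0]])                 # motion model
    H = np.array([[1.0, 0.0]])                            # only position is measured
    Q = q * np.array([[dt**4 / 4, dt**3 / 2],
                      [dt**3 / 2, dt**2]])                # process-noise covariance
    R = np.array([[r]])                                   # measurement-noise variance
    x = np.array([[measurements[0]], [0.0]])              # initial state: [pos, speed]
    P = np.eye(2) * 100.0                                 # large initial uncertainty
    history = []
    for z in measurements[1:]:
        x = F @ x                                         # predict state
        P = F @ P @ F.T + Q                               # predict covariance
        y = np.array([[z]]) - H @ x                       # innovation
        S = H @ P @ H.T + R                               # innovation covariance
        K = P @ H.T @ np.linalg.inv(S)                    # Kalman gain
        x = x + K @ y                                     # update state
        P = (np.eye(2) - K @ H) @ P                       # update covariance
        history.append((float(x[0, 0]), float(x[1, 0])))
    return history

# illustrative usage: a train moving at a steady 10 m/s, sampled once per second
fixes = [10.0 * k for k in range(60)]
est = track(fixes)
pos, speed = est[-1]
```

The filter's covariance P is the location uncertainty discussed in the abstract: between corrections it grows through Q, and each measurement shrinks it, which is exactly the role the trackside balises play today.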
Procedia PDF Downloads 410
6517 General Mathematical Framework for Analysis of Cattle Farm System
Authors: Krzysztof Pomorski
Abstract:
In this work, we present a universal mathematical framework for modeling a cattle farm system that can set out and validate various hypotheses to be tested against experimental data. The presented work is preliminary, but it is expected to be a valid tool for future deeper analysis that can result in a new class of prediction methods allowing early detection of cow diseases as well as assessment of cow performance. The presented work shall therefore have its meaning in agricultural modeling as well as in machine learning. It also opens the possibility of incorporating a certain class of biological models necessary for modeling cow behavior and farm performance, which might include the impact of the environment on the farm system. Particular attention is paid to the model of coupled oscillators, which is the basic building hypothesis from which a model showing certain periodic or quasiperiodic behavior can be constructed. Keywords: coupled ordinary differential equations, cattle farm system, numerical methods, stochastic differential equations
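The coupled-oscillator building block mentioned above can be illustrated with a minimal Kuramoto-type phase model, in which oscillators with different natural frequencies synchronize once the coupling is strong enough. The equations, initial phases, and parameters below are generic illustrative choices, not the farm model itself:

```python
import math

def simulate_kuramoto(omegas, coupling, steps=2000, dt=0.01):
    """Euler-integrate coupled phase oscillators:
    d(theta_i)/dt = omega_i + (K/N) * sum_j sin(theta_j - theta_i)."""
    n = len(omegas)
    theta = [0.3 * i for i in range(n)]          # staggered initial phases
    for _ in range(steps):
        dtheta = [
            omegas[i] + coupling / n * sum(math.sin(theta[j] - theta[i]) for j in range(n))
            for i in range(n)
        ]
        theta = [t + dt * d for t, d in zip(theta, dtheta)]
    return theta

def order_parameter(theta):
    """Kuramoto order parameter r in [0, 1]; r near 1 means synchrony."""
    n = len(theta)
    re = sum(math.cos(t) for t in theta) / n
    im = sum(math.sin(t) for t in theta) / n
    return math.hypot(re, im)

# illustrative usage: four oscillators with nearby natural frequencies
omegas = [1.0, 1.1, 0.9, 1.05]
r_strong = order_parameter(simulate_kuramoto(omegas, coupling=2.0))
r_zero = order_parameter(simulate_kuramoto(omegas, coupling=0.0))
```

Periodic or quasiperiodic collective behavior of the kind the abstract anticipates appears as the coupling strength is varied between these two regimes.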
Procedia PDF Downloads 145
6516 Artificial Intelligence Impact on Strategic Stability
Authors: Darius Jakimavicius
Abstract:
Artificial intelligence is the subject of intense debate in the international arena, identified both as a technological breakthrough and as a factor affecting strategic stability. Both the kinetic and non-kinetic development of AI and its application in the national strategies of the great powers may trigger a change in the security situation. Artificial intelligence is generally faster, more capable, and more efficient than humans, and there is a temptation to transfer decision-making and control responsibilities to it. An AI system that, once activated, can select and act on targets without further intervention by a human operator blurs the boundary between human and robot (machine) warfare, or perhaps merges the two. Artificial intelligence acts as a force multiplier that speeds up decision-making and reaction times on the battlefield. The role of humans is increasingly moving away from direct decision-making and from the command and control processes involving the use of force. It is worth noting that the autonomy and precision of AI systems make the process of maintaining strategic stability more complex. Deterrence theory is currently in a phase of development in which deterrence is undergoing further strain and crisis due to the complexity of the evolving models enabled by artificial intelligence. Based on the concept of strategic stability and deterrence theory, it is appropriate to develop further research on the development and impact of AI in order to assess AI from both a scientific and a technical perspective: to capture a new niche in the scientific literature and academic terminology, to clarify the conditions for deterrence, and to identify the potential uses, impacts, and possibly quantities of AI. The research problem is the impact of artificial intelligence developed by great powers on strategic stability.
This thesis seeks to assess the impact of AI on strategic stability and on the principles of deterrence, with human exclusion from the decision-making and control loop as a key axis. The interaction between AI and human actions and interests can determine fundamental changes in the great powers' defense and deterrence postures, and the development and application of AI-based great-power strategies can lead to a change in strategic stability. Keywords: artificial intelligence, strategic stability, deterrence theory, decision-making loop
Procedia PDF Downloads 41
6515 Anxiety Treatment: Comparing Outcomes by Different Types of Providers
Authors: Melissa K. Hord, Stephen P. Whiteside
Abstract:
With lifetime prevalence rates ranging from 6% to 15%, anxiety disorders are among the most common childhood mental health diagnoses. Anxiety disorders diagnosed in childhood generally show an unremitting course, lead to additional psychopathology, and interfere with social, emotional, and academic development. Effective evidence-based treatments include cognitive-behavioral therapy (CBT) and selective serotonin reuptake inhibitors (SSRIs). However, if anxious children receive any treatment, it is usually through primary care, typically consists of medication, and very rarely includes evidence-based psychotherapy. Despite the high prevalence of anxiety disorders, only two independent research labs have investigated long-term results of CBT treatment for all childhood anxiety disorders, and two for specific anxiety disorders. Generally, the studies indicate that the majority of youth maintain gains up to 7.4 years after treatment; these studies have not been replicated. In addition, little is known about the additional mental health care received by these patients in the years following anxiety treatment, which seems likely to influence both the maintenance of gains for anxiety symptoms and the development of additional psychopathology during the subsequent years. The original sample consisted of 335 children ages 7 to 17 years (mean 13.09, 53% female) diagnosed with an anxiety disorder in 2010. Medical record review included provider billing records for mental health appointments during the five years after anxiety treatment. The subsample for this study was classified into three groups: 64 children who received CBT in an anxiety disorders clinic, 56 who received treatment from a psychiatrist, and 10 who were seen in a primary care setting. Chi-square analyses revealed significant differences in mental health care utilization across the five years after treatment.
Youth receiving treatment in primary care averaged less than one appointment each year, and the appointments continued at the same rate across time. Children treated by a psychiatrist averaged approximately three appointments in the first two years and two in the subsequent three years. Importantly, youth treated in the anxiety clinic demonstrated a gradual decrease in mental health appointments across time. These nuanced differences will be presented in greater detail. The results of the current study have important implications for developing dissemination materials to help guide parents when they are selecting treatment for their children. By including all mental health appointments, this study recognizes that anxiety is often comorbid with additional diagnoses and that receiving evidence-based treatment may have long-term benefits associated with improvements in broader mental health. One important caveat might be that the acuity of mental health problems influenced the level of care sought by the patients included in this study; however, even taking this possibility into account, those seeking care in a primary care setting continued to require similar care at the end of the study, indicating that little improvement in symptoms was experienced. Keywords: anxiety, children, mental health, outcomes
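The kind of chi-square comparison of utilization across provider groups described above can be sketched as follows. Only the three group sizes (64, 56, 10) come from the abstract; the high/low utilization split in the usage example is entirely hypothetical, invented to make the arithmetic concrete:

```python
def chi_square_stat(table):
    """Pearson chi-square statistic for an r x c contingency table of counts."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    grand = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            expected = row_totals[i] * col_totals[j] / grand  # under independence
            stat += (observed - expected) ** 2 / expected
    return stat

# hypothetical counts of (high, low) utilization by provider group:
# CBT clinic (n=64), psychiatrist (n=56), primary care (n=10)
table = [[30, 34],
         [40, 16],
         [5, 5]]
stat = chi_square_stat(table)
# df = (3 - 1) * (2 - 1) = 2; critical value at alpha = 0.05 is 5.991
significant = stat > 5.991
```

With these made-up counts the statistic exceeds the df = 2 critical value, which is the shape of result the abstract reports for the real data.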
Procedia PDF Downloads 267
6514 The Social Ecology of Serratia entomophila: Pathogen of Costelytra giveni
Authors: C. Watson, T. Glare, M. O'Callaghan, M. Hurst
Abstract:
The endemic New Zealand grass grub (Costelytra giveni, Coleoptera: Scarabaeidae) is an economically significant grassland pest in New Zealand. Due to their impacts on production within the agricultural sector, one of New Zealand's primary industries, several methods are being used to either control or prevent the establishment of new grass grub populations in the pasture. One such method involves the use of a biopesticide based on the bacterium Serratia entomophila. This species is one of the causative agents of amber disease, a chronic disease of the larvae which results in death via septicaemia after approximately 2 to 3 months. The ability of S. entomophila to cause amber disease is dependent upon the presence of the amber disease associated plasmid (pADAP), which encodes the key virulence determinants required for the establishment and maintenance of the disease. Following the collapse of grass grub populations within the soil, resulting from either natural population build-up or application of the bacteria, non-pathogenic plasmid-free Serratia strains begin to predominate within the soil. Whilst the interactions between S. entomophila and grass grub larvae are well studied, less is known about the interactions between plasmid-bearing and plasmid-free strains, particularly the potential impact of these interactions upon the efficacy of an applied biopesticide. Using a range of constructed strains with antibiotic tags, in vitro (broth culture) and in vivo (soil and larvae) experiments were conducted using inoculants comprised of differing ratios of isogenic pathogenic and non-pathogenic Serratia strains, enabling the relative growth of pADAP+ and pADAP- strains under competition conditions to be assessed.
In nutrient-rich broth culture, the non-pathogenic pADAP- strain outgrew the pathogenic pADAP+ strain by day 3 when inoculated in equal quantities, and by day 5 when applied as the minority inoculant; however, there was an overall gradual decline in the number of viable bacteria of both strains over a 7-day period. Similar results were obtained in additional experiments using the same strains and continuous broth cultures re-inoculated at 24-hour intervals, although in these cultures, the viable cell count did not diminish over the 7-day period. When the same ratios were assessed in soil microcosms with limited available nutrients, the strains remained relatively stable over a 2-month period. Additionally, in vivo grass grub co-infection assays using the same ratios of tagged Serratia strains gave similar results to those observed in the soil, but there was also evidence of horizontal transfer of pADAP from the pathogenic to the non-pathogenic strain within the larval gut after a period of 4 days. Whilst the influence of competition is more apparent in broth cultures than within the soil or larvae, further testing is required to determine whether this competition between pathogenic and non-pathogenic Serratia strains has any influence on efficacy and disease progression, and how this may impact the ability of S. entomophila to cause amber disease within grass grub larvae when applied as a biopesticide. Keywords: biological control, entomopathogen, microbial ecology, New Zealand
Procedia PDF Downloads 156
6513 An Algorithm for Removal of Noise from X-Ray Images
Authors: Sajidullah Khan, Najeeb Ullah, Wang Yin Chai, Chai Soo See
Abstract:
In this paper, we propose an approach to remove impulse and Poisson noise from X-ray images. Many filters have been used for impulse noise removal from color and grayscale images, each with its own strengths and weaknesses, but X-ray images also contain Poisson noise, and unfortunately there is no intelligent filter that can detect both impulse and Poisson noise in X-ray images. Our proposed filter uses an upgraded layer discrimination approach to detect both impulse and Poisson noise corrupted pixels in X-ray images and then restores only those detected pixels with a simple, efficient, and reliable one-line equation. The proposed algorithm is very effective and much more efficient than existing filters used only for impulse noise removal. The proposed method uses a new, powerful, and efficient noise detection method to determine whether the pixel under observation is corrupted or noise-free. Results from computer simulations are used to demonstrate the pleasing performance of our proposed method. Keywords: X-ray image de-noising, impulse noise, Poisson noise, PRWF
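A generic illustration of the detect-then-restore strategy discussed above: flag only the pixels judged to be corrupted, then restore just those, leaving noise-free pixels untouched. This sketch uses a simple median-based detector for concreteness; it is not the authors' layer discrimination approach or the PRWF method:

```python
import numpy as np

def remove_impulse_noise(img, threshold=40):
    """Replace pixels that deviate strongly from their 3x3 neighbourhood median."""
    h, w = img.shape
    padded = np.pad(img.astype(np.int16), 1, mode="edge")
    # stack the nine shifted views so the median is taken over each neighbourhood
    neigh = np.stack([padded[i:i + h, j:j + w] for i in range(3) for j in range(3)])
    med = np.median(neigh, axis=0)
    noisy = np.abs(img.astype(np.int16) - med) > threshold   # detection step
    out = img.copy()
    out[noisy] = med[noisy].astype(img.dtype)                # restore only flagged pixels
    return out

# illustrative usage: a flat image with two impulse-corrupted pixels
img = np.full((8, 8), 100, dtype=np.uint8)
img[3, 3] = 255      # salt impulse
img[5, 5] = 0        # pepper impulse
clean = remove_impulse_noise(img)
```

Restoring only detected pixels is what distinguishes this family of filters from a plain median filter, which blurs every pixel whether corrupted or not.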
Procedia PDF Downloads 383
6512 Remote Patient Monitoring for Covid-19
Authors: Launcelot McGrath
Abstract:
Coronavirus disease 2019 (COVID-19) has spread rapidly around the world, resulting in high mortality rates and very large numbers of people requiring medical treatment in the ICU. Management of patient hospitalisation is a critical aspect of controlling this disease and reducing chaos in healthcare systems. Remote monitoring provides a solution to protect vulnerable and elderly high-risk patients. Continuous remote monitoring of oxygen saturation, respiratory rate, heart rate, temperature, and other vital signs provides medical systems with up-to-the-minute information about their patients' statuses. Remote monitoring also limits the spread of infection by reducing hospital overcrowding. This paper examines the potential of remote monitoring for COVID-19 to assist in the rapid identification of patients at risk, facilitate the detection of patient deterioration, and enable early interventions. Keywords: remote monitoring, patient care, oxygen saturation, COVID-19, hospital management
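A minimal sketch of threshold-based alerting on the vital signs listed above, of the kind a remote-monitoring backend might run on each incoming reading. The numeric limits here are illustrative placeholders only, not clinical thresholds or part of any scoring system described in the paper:

```python
def deterioration_flags(vitals):
    """Return the names of readings outside simplified illustrative limits.

    The limits below are placeholders for demonstration, NOT clinical values.
    """
    limits = {
        "spo2": (94, 100),        # oxygen saturation, %
        "resp_rate": (12, 20),    # breaths per minute
        "heart_rate": (50, 100),  # beats per minute
        "temp": (36.0, 38.0),     # degrees Celsius
    }
    flags = []
    for name, value in vitals.items():
        lo, hi = limits[name]
        if not lo <= value <= hi:
            flags.append(name)
    return flags

# illustrative usage: one reading suggesting deterioration, one normal reading
alert = deterioration_flags({"spo2": 91, "resp_rate": 24, "heart_rate": 80, "temp": 37.0})
ok = deterioration_flags({"spo2": 97, "resp_rate": 16, "heart_rate": 70, "temp": 36.8})
```

In practice such rules would feed an escalation workflow (notify a clinician, increase sampling frequency) rather than act as a diagnosis.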
Procedia PDF Downloads 108
6511 Local Governments Supporting Environmentally Sustainable Meals to Protect the Planet and People
Authors: Magdy Danial Riad
Abstract:
Introduction: The ability of our world to support the expanding population after 2050 is at risk due to the food system's global role in poor health, climate change, and resource depletion. Healthy, equitable, and sustainable food systems must be achieved from the point of production through consumption in order to meet several of the Sustainable Development Goal (SDG) targets. There is evidence that changing the local food environment can effectively change dietary habits in a community. The purpose of this article is to outline the policy initiatives taken by local governments to support environmentally friendly eating habits. Methods: Five databases were searched for peer-reviewed articles that described local government authorities' implementation of environmentally sustainable eating habits, were set in cities that had signed the Milan Urban Food Policy Pact, were published after 2015, were available in English, and described policy interventions. Data extraction was a two-step approach that started with extracting information from the included studies and ended with locating information unique to the policies in the grey literature. Results: 45 papers that described a variety of policy initiatives from low-, middle-, and high-income countries met the inclusion criteria. A variety of desired dietary behaviors were the focus of policy action, including reducing food waste, procuring food locally and in season, promoting breastfeeding, avoiding overconsumption, and consuming more plant-based meals and fewer animal-derived products. Conclusions: To achieve SDG targets, local governments are under pressure to implement evidence-based interventions. This study can help direct local governments toward evidence-based policy measures to improve regional food systems and support ecologically friendly eating habits. Keywords: meals, planet, poor health, eating habits
Procedia PDF Downloads 52