Search results for: Advanced Encryption Standard (AES)
6825 Dry Friction Fluctuations in Plain Journal Bearings
Authors: James Moran, Anusarn Permsuwan
Abstract:
This paper compares oscillations in the dry friction coefficient in different journal bearings. Measurements are made of the average and standard deviation of the coefficient of friction as a function of sliding velocity. The standard deviation of the friction coefficient changed dramatically with sliding velocity, and the magnitude and frequency of the oscillations were functions of the velocity. A numerical model was developed for the frictional oscillations, and there was good agreement between the model and the results. Five different materials were used as the sliding surfaces in the experiments: aluminum, bronze, mild steel, stainless steel, and nylon.
Keywords: Coulomb friction, dynamic friction, non-lubricated bearings, frictional oscillations
Procedia PDF Downloads 364
6824 Prognostic Value of Tumor Markers in Younger Patients with Breast Cancer
Authors: Lola T. Alimkhodjaeva, Lola T. Zakirova, Soniya S. Ziyavidenova
Abstract:
Background: Breast cancer occupies first place among cancers in women worldwide. It is urgent today to study the role of molecular markers capable of predicting the dynamics and outcome of the disease. The aim of this study is to define the prognostic value of estrogen receptor (ER) content, progesterone receptor (PgR) content, and HER-2/neu oncoprotein amplification by studying 3- and 5-year overall and relapse-free survival in 470 patients with primary operable and 280 patients with locally advanced breast cancer. Materials and methods: Results for 3- and 5-year overall and relapse-free survival, depending on ER and PgR content in primary operable patients, showed that for ER-positive (+) and PgR (+) tumors survival was 100 (96.2%) and 97.3 (94.6%), for ER-negative (-) and PgR (-) 69.2 (60.3%) and 65.4 (57.7%), for ER-positive (+) and PgR-negative (-) 87.4 (80.1%) and 81.5 (79.3%), and for ER-negative (-) and PgR-positive (+) 97.4 (93.4%) and 90.4 (88.5%), respectively. Survival also depended on the level of HER-2/neu expression. In HER-2/neu-negative patients the survival rates were 98.6 (94.7%) and 96.2 (92.3%). In the group with HER-2/neu (2+) expression these figures were 45.3 (44.3%) and 45.1 (40.2%), and in the group with HER-2/neu (3+) expression 41.2 (33.1%) and 34.3 (29.4%). For the combination of ER (-), PgR (-), and HER-2/neu (-) they were 27.2 (25.4%) and 19.5 (15.3%), respectively. In patients with locally advanced breast cancer, the 3- and 5-year OS and RFS for ER (+) and PgR (+) were 76.3 (69.3%) and 62.2 (61.4%), for ER (-) and PgR (-) 29.1 (23.7%) and 18.3 (12.6%), for ER (+) and PgR (-) 61.2 (47.2%) and 39.4 (25.6%), and for ER (-) and PgR (+) 54.3 (43.1%) and 41.3 (18.3%), respectively. The level of HER-2/neu expression also affected survival: in HER-2/neu-negative patients the survival rates were 74.1 (67.6%) and 65.1 (57.3%), with (2+) expression 20.4 (14.2%) and 8.6 (6.4%), and with (3+) expression 6.2 (3.1%) and 1.2 (1.5%), respectively. For the ER-, PgR-, and HER-2/neu-negative combination the figures were 22.1 (14.3%) and 8.4 (1.2%). Conclusion: The presence of steroid hormone receptors in breast tumor tissue in both primary operable and locally advanced disease, as well as the lack of HER-2/neu oncoprotein, correlates with the highest rates of 3- and 5-year overall and relapse-free survival. The absence of steroid hormone receptors, as well as HER-2/neu overexpression, in malignant breast tissue significantly degrades 3- and 5-year overall and relapse-free survival. Tumors negative for ER, PgR, and HER-2/neu have the most unfavorable prognosis.
Keywords: breast cancer, estrogen receptor, oncoprotein, progesterone receptor
Procedia PDF Downloads 188
6823 Analysis of the IEEE 802.15.4 MAC Parameters to Achieve Lower Packet Loss Rates
Authors: Imen Bouazzi
Abstract:
The IEEE 802.15.4 standard utilizes the CSMA-CA mechanism to control nodes' access to the shared wireless communication medium. It has become a popular choice for various surveillance and control applications in wireless sensor networks (WSN). The benefit of this standard is evaluated in terms of the packet loss probability, which depends on the configuration of the IEEE 802.15.4 MAC parameters and the traffic load. Our aim is to evaluate the effects of various configurable MAC parameters on the performance of beaconless IEEE 802.15.4 networks under different traffic loads; static values of the IEEE 802.15.4 MAC parameters (macMinBE, macMaxCSMABackoffs, and macMaxFrameRetries) will be evaluated. For the performance analysis, we use the ns-2 [2] network simulator.
Keywords: WSN, packet loss, CSMA/CA, IEEE-802.15.4
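As a rough illustration of how these MAC parameters interact, the sketch below simulates the unslotted CSMA-CA backoff loop in Python; the channel-busy probability, the default parameter values, and the loss estimate are illustrative assumptions, not values from the paper or from ns-2.

```python
import random

def csma_ca_attempt(channel_busy_prob, macMinBE=3, macMaxBE=5, macMaxCSMABackoffs=4):
    """Simulate one unslotted CSMA-CA channel-access attempt (IEEE 802.15.4 style).

    Returns (success, backoff_periods_spent). channel_busy_prob is an assumed
    stand-in for the clear-channel assessment (CCA) failing under traffic load.
    """
    nb, be = 0, macMinBE                          # number of backoffs, backoff exponent
    waited = 0
    while True:
        waited += random.randint(0, 2**be - 1)    # random backoff delay
        if random.random() > channel_busy_prob:   # CCA finds the channel idle
            return True, waited
        nb += 1
        be = min(be + 1, macMaxBE)
        if nb > macMaxCSMABackoffs:               # give up: channel-access failure
            return False, waited

def packet_loss_rate(traffic_load, trials=10_000, **mac_params):
    """Estimate the channel-access failure rate for an assumed traffic load."""
    failures = sum(1 for _ in range(trials)
                   if not csma_ca_attempt(traffic_load, **mac_params)[0])
    return failures / trials

if __name__ == "__main__":
    for load in (0.2, 0.5, 0.8):
        print(load, packet_loss_rate(load, macMinBE=3, macMaxCSMABackoffs=4))
```

Larger macMinBE or macMaxCSMABackoffs values trade longer delays for fewer access failures, which is the kind of trade-off the study evaluates under different loads.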
Procedia PDF Downloads 339
6822 Management and Agreement Protocol in Computer Security
Authors: Abdulameer K. Hussain
Abstract:
In a cryptographic system, many activities are performed by the parties involved, and the most prominent of these is the process of agreement among the parties on how to carry out the system's tasks so that they are more secure, trusted, and reliable. The most common agreement among parties is a key agreement, alongside other types of agreements. Although there have been attempts to find other effective agreement methods, these methods remain limited to the traditional agreements. This paper presents different parameters to perform the task of agreement more effectively, including the key alternative, the agreement on the encryption method used, and the agreement to prevent denial of service. To manage and achieve these goals, this method proposes the existence of a control and monitoring entity that manages these agreements by collecting statistical information on the opinions of the authorized parties in the cryptographic system. These statistics help this entity take the proper decision about the agreement factors. This entity is called the Agreement Manager (AM).
Keywords: agreement parameters, key agreement, key exchange, security management
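The abstract does not give an implementation; purely as a sketch of how such an Agreement Manager might collect parties' opinions and decide on agreement parameters, a minimal Python outline could look like the following. The class name, the majority voting rule, and the option names are assumptions for illustration, not the authors' protocol.

```python
from collections import Counter
from dataclasses import dataclass, field

@dataclass
class AgreementManager:
    """Hypothetical sketch: collects parties' votes on agreement parameters
    (e.g. key-agreement scheme, cipher) and decides by simple majority."""
    votes: dict = field(default_factory=dict)     # parameter -> Counter of options

    def submit_opinion(self, party_id, parameter, option):
        self.votes.setdefault(parameter, Counter())[option] += 1

    def decide(self, parameter, quorum):
        tally = self.votes.get(parameter, Counter())
        if sum(tally.values()) < quorum:          # not enough parties responded yet
            return None
        option, _count = tally.most_common(1)[0]
        return option

am = AgreementManager()
for party, choice in [("A", "ECDH"), ("B", "ECDH"), ("C", "RSA")]:
    am.submit_opinion(party, "key_agreement", choice)
print(am.decide("key_agreement", quorum=3))       # -> 'ECDH'
```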
Procedia PDF Downloads 419
6821 Use of Natural Fibers in Landfill Leachate Treatment
Authors: Araujo J. F. Marina, Araujo F. Marcus Vinicius, Mulinari R. Daniella
Abstract:
Because the leachate resulting from waste decomposition in landfills has a polluting potential a hundred times greater than that of domestic sewage, it is considered an environmental problem that requires treatment before disposal. Seeking to improve this situation, this project proposes the treatment of landfill leachate using natural fibers combined with advanced oxidation processes. The selected natural fibers were palm, coconut, and banana fibers. These materials give sustainability to the project because, besides having adsorbent capacity, they are often discarded as waste. The study was conducted at laboratory scale. In the trials, the effluents were characterized in terms of Chemical Oxygen Demand (COD), turbidity, and color. The results indicate that the approach is technically promising: under strongly oxidative conditions, the use of certain natural fibers to reduce pollutants in the leachate achieved COD removals between 67.9% and 90.9%, turbidity removals between 88.0% and 99.7%, and color removals between 67.4% and 90.4%. The expectation is to continue evaluating the efficiency of combining other natural fibers with other landfill leachate treatment processes.
Keywords: landfill leachate, chemical treatment, natural fibers, advanced oxidation processes
Procedia PDF Downloads 355
6820 Evaluation of Corrosion Behaviour of Coatings Applied in a High-Strength Low Alloy Steel in Different Climatic Cabinets
Authors: Raquel Bayon, Ainara Lopez-Ortega, Elena Rodriguez, Amaya Igartua
Abstract:
Corrosion is one of the most concerning phenomena that accelerate material degradation in offshore applications. In order to avoid the premature failure of metallic materials in marine environments, organic coatings have been widely used due to their high corrosion resistance. Thermally sprayed metals have recently been used in offshore applications, whereas ceramic materials are less commonly employed due to their high cost. The protectiveness of the coatings can be evaluated and categorized into corrosivity categories in accordance with the ISO 12944-6 standard. According to this standard, coatings intended to work in marine environments require a C5-M category for components working out of the water or partially immersed in the splash zone, and an Im2 category for totally immersed components. The C5-M/Im2 high category corresponds to a durability of more than 20 years without maintenance, in accordance with the ISO 12944 and NORSOK M-501 standards. In this work, the corrosion behaviour of three potential coatings used in offshore applications has been evaluated. For this aim, the materials have been subjected to different environmental conditions in several climatic chambers (humidostatic, climatic, immersion, UV, and salt fog). The category of the coatings for each condition has been assigned in accordance with the previously mentioned standard.
Keywords: cabinet, coatings, corrosion, offshore
Procedia PDF Downloads 419
6819 3D Reconstruction of Human Body Based on Gender Classification
Authors: Jiahe Liu, Hongyang Yu, Feng Qian, Miao Luo
Abstract:
SMPL-X is a powerful parametric human body model that includes male, neutral, and female models, with significant gender differences among the three. During 3D human body reconstruction, the correct selection of the standard template is crucial for obtaining accurate results. To address this issue, we developed an efficient gender classification algorithm to automatically select the appropriate template for 3D human body reconstruction. The key to this gender classification algorithm is the precise analysis of human body features. Using the SMPL-X model, the algorithm can detect and identify gender features of the human body, thereby determining which standard template should be used. The accuracy of this algorithm makes the 3D reconstruction process more accurate and reliable, as it can adjust model parameters based on individual gender differences. SMPL-X and the related gender classification algorithm have brought important advances to the field of 3D human body reconstruction. By accurately selecting standard templates, they have improved the accuracy of reconstruction and have broad potential in various application fields. These technologies continue to drive the development of the 3D reconstruction field, providing more realistic and accurate human body models.
Keywords: gender classification, joint detection, SMPL-X, 3D reconstruction
Procedia PDF Downloads 68
6818 Using the Cluster Computing to Improve the Computational Speed of the Modular Exponentiation in RSA Cryptography System
Authors: Te-Jen Chang, Ping-Sheng Huang, Shan-Ten Cheng, Chih-Lin Lin, I-Hui Pan, Tsung-Hsien Lin
Abstract:
The RSA system is a great contribution to encryption and decryption. It is based on modular exponentiation and operates on very large numbers, and such operations are a heavy computational burden for the CPU. To increase the computational speed, in addition to improving the algorithms themselves (such as the binary method, the sliding window method, and the addition chain method), a cluster computer can be used. The cluster system is composed of laboratory computers on which MPICH2 is installed. The parallel procedures of the modular exponentiation are processed by combining the sliding window method with the addition chain method. This significantly reduces the computational time of modular exponentiations whose operands are longer than 512 bits and even longer than 1024 bits.
Keywords: cluster system, modular exponentiation, sliding window, addition chain
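For reference, a minimal single-node Python sketch of sliding-window modular exponentiation (the serial building block that the cluster parallelizes over MPICH2) is shown below; the window size, the interface, and the test values are illustrative assumptions rather than the authors' implementation.

```python
def sliding_window_pow(base, exponent, modulus, window=4):
    """Left-to-right sliding-window modular exponentiation.

    Precomputes base**k mod modulus for odd k below 2**window, then scans the
    exponent bits, squaring once per bit and multiplying once per window,
    which saves multiplications compared with the plain binary method.
    """
    if modulus == 1:
        return 0
    base %= modulus
    base_sq = (base * base) % modulus
    odd_powers = {1: base}                      # base^1, base^3, ..., base^(2^window - 1)
    for i in range(3, 1 << window, 2):
        odd_powers[i] = (odd_powers[i - 2] * base_sq) % modulus

    bits = bin(exponent)[2:]
    result, i = 1, 0
    while i < len(bits):
        if bits[i] == '0':
            result = (result * result) % modulus
            i += 1
        else:
            # Take the longest window (at most `window` bits) ending in a 1-bit.
            j = min(i + window, len(bits))
            while bits[j - 1] == '0':
                j -= 1
            for _ in range(j - i):
                result = (result * result) % modulus
            result = (result * odd_powers[int(bits[i:j], 2)]) % modulus
            i = j
    return result

# Quick sanity check against Python's built-in pow (assumed test values):
assert sliding_window_pow(0xBEEF, 0x10001, 2**521 - 1) == pow(0xBEEF, 0x10001, 2**521 - 1)
```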
Procedia PDF Downloads 519
6817 Market Solvency Capital Requirement Minimization: How Non-linear Solvers Provide Portfolios Complying with Solvency II Regulation
Authors: Abraham Castellanos, Christophe Durville, Sophie Echenim
Abstract:
In this article, a portfolio optimization problem is solved in a Solvency II context: it illustrates how advanced optimization techniques can help tackle complex operational pain points around the monitoring, control, and stability of the Solvency Capital Requirement (SCR). The market SCR of a portfolio is calculated as a combination of SCR sub-modules. These sub-modules are the results of stress tests on interest rate, equity, property, credit, and FX factors, as well as concentration on counterparties. The market SCR is non-convex and non-differentiable, which does not make it a natural candidate for an optimization criterion. In the SCR formulation, correlations between sub-modules are fixed, whereas risk-driven portfolio allocation is usually driven by the dynamics of the actual correlations. Implementing a portfolio construction approach that is efficient from both a regulatory and an economic standpoint is not straightforward. Moreover, the challenge for insurance portfolio managers is not only to achieve a minimal SCR to reduce non-invested capital but also to ensure the stability of the SCR. Some optimizations have already been performed in the literature by simplifying the standard formula into a quadratic function, but to our knowledge, this is the first time that the standard formula of the market SCR is used directly in an optimization problem. Two solvers are combined: a bundle algorithm for convex non-differentiable problems, and a BFGS (Broyden-Fletcher-Goldfarb-Shanno)-SQP (Sequential Quadratic Programming) algorithm to cope with non-convex cases. A market SCR minimization is then performed with historical data. This approach results in a significant reduction of the capital requirement compared to a classical Markowitz approach based on historical volatility. A comparative analysis of different optimization models (equi-risk-contribution portfolio, minimum-volatility portfolio, and minimum value-at-risk portfolio) is performed, and the impact of these strategies on risk measures, including the market SCR and its sub-modules, is evaluated. A lack of diversification of the market SCR is observed, especially for equities. This was expected, since the market SCR strongly penalizes this type of financial instrument. It is shown that this direct effect of the regulation can be attenuated by implementing constraints in the optimization process or by minimizing the market SCR together with the historical volatility, proving the interest of a portfolio construction approach that can incorporate such features. The present results are further explained by the market SCR modelling.
Keywords: financial risk, numerical optimization, portfolio management, solvency capital requirement
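For context, the Solvency II standard formula aggregates the sub-module SCRs through a fixed correlation matrix, roughly as SCR_market = sqrt(sum_ij Corr_ij * SCR_i * SCR_j); a small Python sketch of this aggregation step is given below. The sub-module figures and correlation values are placeholders for illustration, not the regulatory matrix or the data used in the paper.

```python
import numpy as np

def aggregate_market_scr(scr_submodules, corr):
    """Square-root aggregation rule of the standard formula:
    SCR_market = sqrt(s^T * Corr * s), with s the vector of sub-module SCRs."""
    s = np.asarray(scr_submodules, dtype=float)
    return float(np.sqrt(s @ corr @ s))

# Placeholder sub-modules: interest rate, equity, property, spread, currency, concentration
scr = [10.0, 40.0, 5.0, 8.0, 6.0, 3.0]
corr = np.array([            # illustrative correlations only, not the regulatory matrix
    [1.00, 0.50, 0.50, 0.50, 0.25, 0.00],
    [0.50, 1.00, 0.75, 0.75, 0.25, 0.00],
    [0.50, 0.75, 1.00, 0.50, 0.25, 0.00],
    [0.50, 0.75, 0.50, 1.00, 0.25, 0.00],
    [0.25, 0.25, 0.25, 0.25, 1.00, 0.00],
    [0.00, 0.00, 0.00, 0.00, 0.00, 1.00],
])
print(aggregate_market_scr(scr, corr))   # the aggregated quantity the solvers seek to minimize
```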
Procedia PDF Downloads 116
6816 High Performance Field Programmable Gate Array-Based Stochastic Low-Density Parity-Check Decoder Design for IEEE 802.3an Standard
Authors: Ghania Zerari, Abderrezak Guessoum, Rachid Beguenane
Abstract:
This paper introduces a high-performance architecture for a fully parallel stochastic Low-Density Parity-Check (LDPC) decoder implemented on a field programmable gate array (FPGA). The new approach is designed to decrease the decoding latency and to reduce the FPGA logic utilisation. To accomplish the targeted logic utilisation reduction, the routing of the proposed sub-variable node (VN) internal memory is designed to use one slice of distributed RAM. Furthermore, a VN initialization using the channel input probability is performed to enhance decoder convergence, without extra resources and without integrating the output saturated counters. The Xilinx FPGA implementation of the IEEE 802.3an standard LDPC code shows that the proposed decoding approach attains high performance along with a reduction of FPGA logic utilisation.
Keywords: low-density parity-check (LDPC) decoder, stochastic decoding, field programmable gate array (FPGA), IEEE 802.3an standard
Procedia PDF Downloads 295
6815 An Experimental Investigation of the Effect of Control Algorithm on the Energy Consumption and Temperature Distribution of a Household Refrigerator
Authors: G. Peker, Tolga N. Aynur, E. Tinar
Abstract:
In order to determine the energy consumption level and cooling characteristics of a domestic refrigerator controlled with various cooling system algorithms, a side-by-side (SBS) refrigerator was tested in a temperature- and humidity-controlled chamber. Two different control algorithms, a so-called drop-in algorithm and a frequency-controlled variable-capacity compressor algorithm, were tested on the same refrigerator. The refrigerator's cooling characteristics were investigated for both cases, and the results were compared with each other. The most important comparison parameters between the two algorithms were temperature distribution, energy consumption, evaporation and condensation temperatures, and refrigerator run times. Standard energy consumption tests were carried out on the same appliance and resulted in almost the same energy consumption levels, with a difference of 1.5%. With these two different control algorithms, the power consumption profile of the refrigerator was found to be similar. Following the associated energy measurement standard, the temperature values of the test packages were measured to be slightly higher for the frequency-controlled algorithm than for the drop-in algorithm. This paper contains the details of this experimental study conducted with different cooling control algorithms and compares the findings under the same standard conditions.
Keywords: control algorithm, cooling, energy consumption, refrigerator
Procedia PDF Downloads 370
6814 Photoleap: An AI-Powered Photo Editing App with Advanced Features and User Satisfaction Analysis
Authors: Joud Basyouni, Rama Zagzoog, Mashael Al Faleh, Jana Alireza
Abstract:
AI is changing many fields and speeding up tasks that used to take a long time. Photo editing used to take too long, but many AI-powered apps now make photo editing, automatic effects, and animations much easier than manual editing apps with no AI. The mobile app Photoleap edits photos and creates digital art using AI. Editing photos with text prompts is also becoming standard these days with the help of apps like Photoleap. Users can now change backgrounds, add animations, turn text into images, and create scenes with AI. This project report discusses the photo editing app's history and popularity. Photoleap resembles Photoshop, Canva, Photos, and Pixlr. The report includes survey questions used to assess Photoleap user satisfaction, describes Photoleap's features and functions with screenshots, and notes that Photoleap uses AI well. Charts and graphs show Photoleap user ratings and comments from the survey. This project found that most Photoleap users liked how well the app worked, how it was designed, and how easy it was to use. People liked editing photos and adding backgrounds, and users can create stunning photo animations. A few users disliked the app's animations, AI art, and photo effects. The project report discusses the app's pros and cons and offers improvements.
Keywords: artificial intelligence, photoleap, images, background, photo editing
Procedia PDF Downloads 58
6813 The Gold Standard Treatment Plan for Vitiligo: A Review on Conventional and Updated Treatment Methods
Authors: Kritin K. Verma, Brian L. Ransdell
Abstract:
White patches are a symptom of vitiligo, a chronic autoimmune dermatological condition that causes a loss of pigmentation in the skin. Vitiligo can affect self-esteem and quality of life while also promoting the development of other autoimmune diseases. Current treatments exist in both allopathy and homeopathy; some have been found to be toxic, whereas others have been helpful. Allopathy offers several treatment options to improve vitiligo, such as phototherapy, skin-lightening preparations, immunosuppressive drugs, combined modality therapy, and steroid medications. This presentation will review the FDA-approved topical cream Opzelura, a JAK inhibitor, and its effects on limiting vitiligo progression. Meanwhile, other non-conventional methods, such as Arsenic Sulphuratum Flavum used in homeopathy, will be debunked based on the current literature. Most treatments still serve to arrest progression and induce skin repigmentation, and treatment plans may differ between patients depending on the location of depigmentation on the skin. Since there is no gold standard plan for treating patients with vitiligo, the oral presentation will review all topical and systemic pharmacological therapies that fight depigmentation of the skin and categorize their validity through a systematic review of the literature. Since treatment plans are limited in nature, all treatment methods will be mentioned, and an attempt will be made to propose a gold standard treatment process for these patients.
Keywords: vitiligo, phototherapy, immunosuppressive drugs, skin lightening preparations, combined modality therapy, arsenic sulphuratum flavum, homeopathy, allopathy, golden standard, Opzelura
Procedia PDF Downloads 85
6812 A 5G Architecture Based to Dynamic Vehicular Clustering Enhancing VoD Services Over Vehicular Ad hoc Networks
Authors: Lamaa Sellami, Bechir Alaya
Abstract:
Nowadays, video-on-demand (VoD) applications are becoming one of the main trends driving vehicular network users. In this paper, considering the unpredictable vehicle density, the unexpected acceleration or deceleration of the different cars in the vehicular traffic load, and the limited radio range of the employed communication scheme, we introduce the Dynamic Vehicular Clustering (DVC) algorithm as a new scheme for video streaming systems over VANETs. The proposed algorithm takes advantage of the concept of small cells and the introduction of wireless backhauls, inspired by the features and performance of the Long Term Evolution (LTE)-Advanced network. The proposed clustering algorithm considers multiple characteristics, such as the vehicle's position and acceleration, to reduce latency and packet loss. Each cluster is therefore treated as a small cell containing vehicular nodes and an access point that is elected according to particular criteria.
Keywords: video-on-demand, vehicular ad-hoc network, mobility, vehicular traffic load, small cell, wireless backhaul, LTE-advanced, latency, packet loss
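The abstract does not detail the clustering rules; purely as an illustration of position- and acceleration-based clustering with cluster-head election, a rough Python sketch could look like the following. The thresholds, the grouping rule, and the head-election criterion are all assumptions for illustration, not the DVC algorithm itself.

```python
from dataclasses import dataclass

@dataclass
class Vehicle:
    vid: int
    position: float      # metres along the road
    speed: float         # m/s
    acceleration: float  # m/s^2

def dynamic_vehicular_clusters(vehicles, radio_range=300.0, max_accel_gap=1.5):
    """Group vehicles that stay within radio range of a cluster's first member
    and whose accelerations are close, so the cluster is likely to stay stable;
    then elect the steadiest vehicle nearest the centroid as cluster head."""
    clusters = []
    for v in sorted(vehicles, key=lambda v: v.position):
        placed = False
        for cluster in clusters:
            ref = cluster[0]
            if (abs(v.position - ref.position) <= radio_range
                    and abs(v.acceleration - ref.acceleration) <= max_accel_gap):
                cluster.append(v)
                placed = True
                break
        if not placed:
            clusters.append([v])
    heads = []
    for cluster in clusters:
        centroid = sum(v.position for v in cluster) / len(cluster)
        heads.append(min(cluster, key=lambda v: (abs(v.acceleration), abs(v.position - centroid))))
    return clusters, heads

fleet = [Vehicle(1, 10.0, 25.0, 0.2), Vehicle(2, 120.0, 24.0, 0.4), Vehicle(3, 900.0, 30.0, -2.0)]
clusters, heads = dynamic_vehicular_clusters(fleet)
print([[v.vid for v in c] for c in clusters], [h.vid for h in heads])   # [[1, 2], [3]] [1, 3]
```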
Procedia PDF Downloads 139
6811 Frequency of Nosocomial Infections in a Tertiary Hospital in Isfahan, Iran
Authors: Zahra Tolou-Ghamari
Abstract:
Objective: Health care-associated infections with multiresistant pathogens are rising globally. It is well known that nosocomial infections increase hospital stay, morbidity, mortality, and disability. Therefore, the aim of this study was to define the occurrence of nosocomial infections in a tertiary hospital in Isfahan, Iran. Materials and Methods: The data were extracted from the official database of hospital nosocomial infection records, which included 9152 rows. For each patient, the reported infections were coded by number (UTI-SUTI: code 55, VAE-PVAP: code 56, BSI-LCBI: code 19, SSI-DIP: code 14, and so on). For continuous variables, the mean ± standard deviation was used; for categorical variables, frequencies were used. Results: The study population was 5542 patients, comprising males (n=3282) and females (n=2260). With a minimum of 15 and a maximum of 99, the mean age in 5313 patients was 58.5 ± 19.1 years. Most of the reported nosocomial infections (77%) were associated with ages 30-80 years. The sites of nosocomial infection (87% of cases) were: VAE-PVAP, 27.3%; VAE-IVAC, 7.7%; UTI-SUTI, 29.5%; BSI-LCBI, 12.9%; SSI-DIP, 9.5%; and other individual infections, 13%, with the main pathogens being Klebsiella pneumoniae, Acinetobacter baumannii, and Staphylococcus. Conclusions: For an efficient surveillance system, an antibiotic-use control policy covering monotherapy and polypharmacy, in addition to advanced infection control programs at regional and national levels in Iran, is recommended.
Keywords: infection, nosocomial, ventilator, blood stream, Isfahan, Iran
Procedia PDF Downloads 77
6810 Assessment of Five Photoplethysmographic Methods for Estimating Heart Rate Variability
Authors: Akshay B. Pawar, Rohit Y. Parasnis
Abstract:
Heart Rate Variability (HRV) is a widely used indicator of the regulation between the autonomic nervous system (ANS) and the cardiovascular system. Besides being non-invasive, it also has the potential to predict mortality in cases involving critical injuries. The gold standard method for determining HRV is based on the analysis of RR interval time series extracted from ECG signals. However, because it is much more convenient to obtain photoplethysmographic (PPG) signals than ECG signals (which require the attachment of several electrodes to the body), many researchers have used pulse cycle intervals instead of RR intervals to estimate HRV and have compared this method with the gold standard technique. Though most of their observations indicate a strong correlation between the two methods, recent studies show that in healthy subjects, except for a few parameters, the pulse-based method cannot be a surrogate for the standard RR-interval-based method. Moreover, the former tends to overestimate short-term variability in heart rate. This calls for improvements in, or alternatives to, the pulse-cycle interval method. In this study, besides the systolic peak-peak interval method (PP method) that has been studied several times, four recent PPG-based techniques, namely the first derivative peak-peak interval method (P1D method), the second derivative peak-peak interval method (P2D method), the valley-valley interval method (VV method), and the tangent-intersection interval method (TI method), were compared with the gold standard technique. ECG and PPG signals were obtained from 10 young and healthy adults (both males and females) seated in the armchair position. In order to de-noise these signals and eliminate baseline drift, they were passed through digital filters. After filtering, the following HRV parameters were computed from PPG using each of the five methods and also from ECG using the gold standard method: time-domain parameters (SDNN, pNN50, and RMSSD) and frequency-domain parameters (very low-frequency power (VLF), low-frequency power (LF), high-frequency power (HF), and total power (TP)). In addition, Poincaré plots were plotted and their SD1/SD2 ratios determined. The resulting sets of parameters were compared with those yielded by the standard method using measures of statistical correlation (correlation coefficient) as well as statistical agreement (Bland-Altman plots). From the viewpoint of correlation, our results show that the best PPG-based methods for the determination of most parameters and Poincaré plots are the P2D method (more than 93% correlation with the standard method) and the PP method (mean correlation: 88%), whereas the TI, VV, and P1D methods perform poorly (less than 70% correlation in most cases). However, our evaluation of statistical agreement using Bland-Altman plots shows that none of the five techniques agrees satisfactorily with the gold standard method as far as time-domain parameters are concerned. In conclusion, excellent statistical correlation implies that certain PPG-based methods provide a good amount of information on the pattern of heart rate variation, whereas poor statistical agreement implies that PPG cannot completely replace ECG in the determination of HRV.
Keywords: photoplethysmography, heart rate variability, correlation coefficient, Bland-Altman plot
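As a reference for the time-domain parameters mentioned above, a minimal Python sketch of how SDNN, RMSSD, and pNN50 are typically computed from a series of interbeat intervals (RR intervals from ECG, or pulse-to-pulse intervals from PPG) is given below; the sample interval values are placeholders, not data from the study.

```python
import numpy as np

def time_domain_hrv(intervals_ms):
    """Compute standard time-domain HRV parameters from interbeat intervals (ms).

    SDNN  : standard deviation of all intervals
    RMSSD : root mean square of successive interval differences
    pNN50 : percentage of successive differences larger than 50 ms
    """
    ibi = np.asarray(intervals_ms, dtype=float)
    diffs = np.diff(ibi)
    return {
        "SDNN": float(np.std(ibi, ddof=1)),
        "RMSSD": float(np.sqrt(np.mean(diffs ** 2))),
        "pNN50": float(100.0 * np.sum(np.abs(diffs) > 50.0) / len(diffs)),
    }

# Placeholder interbeat intervals in milliseconds:
rr = [812, 798, 824, 840, 795, 810, 870, 805, 790, 815]
print(time_domain_hrv(rr))
```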
Procedia PDF Downloads 322
6809 New Advanced Medical Software Technology Challenges and Evolution of the Regulatory Framework in Expert Software, Artificial Intelligence, and Machine Learning
Authors: Umamaheswari Shanmugam, Silvia Ronchi, Radu Vornicu
Abstract:
Software, artificial intelligence, and machine learning can improve healthcare through innovative and advanced technologies that are able to use the large amount and variety of data generated during healthcare services every day. As we read in the news, over 500 machine learning or other artificial intelligence medical devices have now received FDA clearance or approval, the first ones even preceding the year 2000. One of the big advantages of these new technologies is the ability to gain experience and knowledge from real-world use and to continuously improve their performance. Healthcare systems and institutions can benefit greatly because the use of advanced technologies improves both the efficiency and the efficacy of healthcare. Software defined as a medical device is stand-alone software intended to be used for one or more of the following specific medical purposes: diagnosis, prevention, monitoring, prediction, prognosis, treatment, or alleviation of a disease or other health conditions; replacing or modifying any part of a physiological or pathological process; or managing the information received from in vitro specimens derived from the human body, without achieving its principal intended action by pharmacological, immunological, or metabolic means. Software qualified as a medical device must comply with the general safety and performance requirements applicable to medical devices. These requirements are necessary to ensure high performance and quality and also to protect patients' safety. The evolution and continuous improvement of software used in healthcare must take into consideration the increase in regulatory requirements, which are becoming more complex in each market. The gap between these advanced technologies and the new regulations is the biggest challenge for medical device manufacturers. Regulatory requirements can be considered a market barrier, as they can delay or obstruct device approval, but they are necessary to ensure performance, quality, and safety; at the same time, they can be a business opportunity if the manufacturer is able to define the appropriate regulatory strategy in advance. This abstract provides an overview of the current regulatory framework, the evolution of international requirements, and the standards applicable to medical device software in potential markets all over the world.
Keywords: artificial intelligence, machine learning, SaMD, regulatory, clinical evaluation, classification, international requirements, MDR, 510k, PMA, IMDRF, cyber security, health care systems
Procedia PDF Downloads 88
6808 Performance of Total Vector Error of an Estimated Phasor within Local Area Networks
Authors: Ahmed Abdolkhalig, Rastko Zivanovic
Abstract:
This paper evaluates the Total Vector Error of an estimated phasor, as defined in the IEEE C37.118 standard, under different medium-access schemes in Local Area Networks (LANs). Three different LAN models (CSMA/CD, CSMA/AMP, and Switched Ethernet) are evaluated. The Total Vector Error of the estimated phasor has been evaluated for the effect of the number of nodes under the standardized network bandwidth values defined in the IEC 61850-9-2 communication standard (i.e., 0.1, 1, and 10 Gbps).
Keywords: phasor, local area network, total vector error, IEEE C37.118, IEC 61850
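For reference, IEEE C37.118 defines the Total Vector Error as the magnitude of the difference between the estimated and reference phasors, normalized by the reference magnitude: TVE = |X_estimated - X_reference| / |X_reference|. A small Python sketch with placeholder phasor values (not data from the paper) is shown below.

```python
import cmath

def total_vector_error(estimated, reference):
    """Total Vector Error per IEEE C37.118, usually reported as a percentage.
    Both arguments are complex phasors (real + j*imag)."""
    return abs(estimated - reference) / abs(reference)

# Placeholder phasors: a reference of 1.0 at 0 degrees and an estimate with
# a +0.5% magnitude error and a +0.3 degree phase error (illustrative values).
ref = cmath.rect(1.0, 0.0)
est = cmath.rect(1.005, cmath.pi / 180 * 0.3)
print(f"TVE = {100 * total_vector_error(est, ref):.3f} %")   # compare with the 1% compliance limit
```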
Procedia PDF Downloads 309
6807 Modeling and Simulation of InAs/GaAs and GaSb/GaAs Quantum Dot Solar Cells in SILVACO TCAD
Authors: Fethi Benyettou, Abdelkader Aissat, M. A. Benammar
Abstract:
In this work, we use the Silvaco TCAD software for modeling and simulation of a standard GaAs solar cell and of InAs/GaAs and GaSb/GaAs p-i-n quantum dot solar cells. When comparing 20-layer InAs/GaAs and GaSb/GaAs quantum dot solar cells with the standard GaAs solar cell, the simulated conversion efficiency increased from 16.48% to 22.6% and from 16.48% to 22.42%, respectively. In addition, the absorption edge for low-energy photons extended from 900 nm to 1200 nm.
Keywords: SILVACO TCAD, quantum dot, simulation, materials engineering
Procedia PDF Downloads 500
6806 Early Identification and Early Intervention: Pre and Post Diagnostic Tests in Mathematics Courses
Authors: Kailash Ghimire, Manoj Thapa
Abstract:
This study focuses on the early identification of deficiencies in prerequisite areas among students enrolled in College Algebra and Calculus I classes. The students were given pre-diagnostic tests on the first day of class, before they were provided with the syllabus. The tests consist of prerequisite, uniform, and advanced content outlined by the University System of Georgia (USG). The results show that 48% of students in College Algebra lack prerequisite skills, while 52% of Calculus I students lack prerequisite skills; interestingly, these students had prior exposure to the uniform and advanced content. The study is still in progress, and this paper contains the outcomes from Fall 2017 and Spring 2018. The paper also discusses the early interventions used in these classes (meeting two days versus three days a week, and students' self-assessment using exam wrappers) and their effectiveness on students' learning. A result of this study shows an improvement in Drop, Fail, and Withdraw (DFW) rates of 7%-10% compared to previous semesters.
Keywords: student at risk, diagnostic tests, identification, intervention, normalization gain, validity of tests
Procedia PDF Downloads 207
6805 Determination of Cadmium and Lead in Sewage Sludge from the Middle Region (Misrata, Msallata and Tarhünah Cities) of Libya
Authors: J. A. Mayouf, Q. A. Najim, H. S. Al-Bayati
Abstract:
The concentrations of cadmium and lead in sewage sludge samples were determined by atomic absorption spectrometry. Samples of sewage sludge were obtained from three sewage treatment plants located in the middle region of Libya (the cities of Misrata, Msallata, and Tarhünah). The results show that the mean cadmium levels for all regions range from 81 to 123.4 ppm; these values are higher than the international standard limits, which allow no more than 50 ppm (dry weight) in the USA, Egypt, and the EU countries. The lead concentrations range from 8.0 to 189.2 ppm, and all values are within the standard limits, which range between 275 and 613 ppm.
Keywords: cadmium, lead, sewage, spectrometry
Procedia PDF Downloads 362
6804 A CFD Analysis of Flow through a High-Pressure Natural Gas Pipeline with an Undeformed and Deformed Orifice Plate
Authors: R. Kiš, M. Malcho, M. Janovcová
Abstract:
This work presents a numerical analysis, using CFD methods, of natural gas flowing through a high-pressure pipeline and an orifice plate. The paper contains CFD calculations for the flow of natural gas in a pipe with two different orifice-plate geometries. One of them has a standard, undeformed geometry, and the other is deformed by the action of the pressure differential. The velocity profiles and pressure fields of the gas in both models are used to show the behaviour of natural gas in the pipeline and the differences between the two cases. The entire research is aimed at eliminating inaccuracies that may appear in natural gas flow measurement in the high-pressure pipelines of the gas industry and that are currently not covered by the relevant standard.
Keywords: orifice plate, high-pressure pipeline, natural gas, CFD analysis
Procedia PDF Downloads 377
6803 A Review of Recent Studies on Advanced Technologies for Water Treatment
Authors: Deniz Sahin
Abstract:
Concern about the presence of heavy metal contamination in our water supplies has steadily grown over the last few years. A number of specialized technologies, including precipitation, coagulation/flocculation, ion exchange, cementation, and electrochemical operations, have been developed for the removal of heavy metals from wastewater. However, these technologies have many limitations in application, such as high cost and low separation efficiency. Recently, numerous approaches have been investigated to overcome these difficulties, and membrane filtration, advanced oxidation processes (AOPs), UV irradiation, etc., are sufficiently developed to be considered as alternative treatments. Many factors come into play when selecting a wastewater treatment technology, such as the type of wastewater, operating conditions, and economics. This study describes the various treatment technologies employed for heavy metal removal. The advantages and disadvantages of these technologies are also compared to highlight their current limitations and future research needs. For example, we investigated the applicability of ultrafiltration technology for removing heavy metal ions (e.g., Cu(II), Pb(II), Cd(II), Zn(II)) from synthetic wastewater solutions. Results showed that complete removal of the metal ions could be achieved.
Keywords: heavy metal, treatment methodologies, water, water treatment
Procedia PDF Downloads 168
6802 Review Paper on an Algorithm Enhancing Privacy and Security in Online Meeting Platforms Using a Secured Encryption
Authors: Tonderai Muchenje, Mkhatshwa Phethile
Abstract:
Humans know that communication with one another is necessary. There are many ways to communicate with each other, and during unexpected natural disasters and outbreaks of epidemics and pandemics, the need for online meeting platforms is considered most important. The development of the telecommunication sector has also played an important role. The COVID-19 pandemic and the resulting new normal led to an overwhelming production of online meeting platforms to cope with the situation. This software was initially used mainly for business communication, but the COVID-19 pandemic rapidly changed the situation. At present, these virtual meeting applications are used not only for informal meetings with friends and relatives but also for formal meetings in the business and education (university) sectors. In this article, an attempt has been made to list useful, secure ways of using online meeting platforms.
Keywords: virtual background, zoom, secure online algorithm, RingCentral, Pexip, TeamViewer, Microsoft Teams
Procedia PDF Downloads 113
6801 Micro-Scale Digital Image Correlation-Driven Finite Element Simulations of Deformation and Damage Initiation in Advanced High Strength Steels
Authors: Asim Alsharif, Christophe Pinna, Hassan Ghadbeigi
Abstract:
The development of the next generation of advanced high strength steels (AHSS) used in the automotive industry requires a better understanding of local deformation and damage development at the scale of their microstructures. This work is focused on dual-phase DP1000 steel and involves micro-mechanical tensile testing inside a scanning electron microscope (SEM) combined with digital image correlation (DIC) to quantify the heterogeneity of deformation in both ferrite and martensite and its evolution up to fracture. Natural features of the microstructure are used for the correlation, which is carried out using LaVision DaVis software. Strain localization is observed in both phases, with tensile strain values up to 130% and 110% recorded in ferrite and martensite, respectively, just before final fracture. Damage initiation sites have been observed in martensite during deformation but could not be correlated to local strain values. A finite element (FE) model of the microstructure has then been developed using Abaqus to map stress distributions over representative areas of the microstructure by forcing the model to deform as in the experiment, using DIC-measured displacement maps as boundary conditions. A MATLAB code has been developed to automatically mesh the microstructure from SEM images and to map displacement vectors from DIC onto the FE mesh. Results show a correlation of damage initiation at the ferrite-martensite interface with local principal stress values of about 1700 MPa in the martensite phase. Damage in ferrite is now being investigated, and results are expected to bring new insight into damage development in DP steels.
Keywords: advanced high strength steels, digital image correlation, finite element modelling, micro-mechanical testing
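The paper's DIC-to-mesh mapping is done in MATLAB; as a language-agnostic illustration of the idea, the Python sketch below interpolates a DIC displacement field onto the boundary nodes of an FE mesh so the values can be prescribed as boundary conditions. The array shapes, the interpolation choices, and the synthetic data are assumptions, not the authors' code.

```python
import numpy as np
from scipy.interpolate import griddata

def dic_displacements_to_bc(dic_points, dic_disp, boundary_nodes):
    """Interpolate DIC-measured displacement vectors onto FE boundary nodes.

    dic_points     : (N, 2) DIC subset centre coordinates (x, y)
    dic_disp       : (N, 2) measured displacements (ux, uy)
    boundary_nodes : (M, 2) FE mesh boundary-node coordinates

    Returns an (M, 2) array of displacements to prescribe as boundary conditions.
    """
    ux = griddata(dic_points, dic_disp[:, 0], boundary_nodes, method="linear")
    uy = griddata(dic_points, dic_disp[:, 1], boundary_nodes, method="linear")
    # Fall back to nearest-neighbour where a node lies outside the DIC data hull.
    nux = griddata(dic_points, dic_disp[:, 0], boundary_nodes, method="nearest")
    nuy = griddata(dic_points, dic_disp[:, 1], boundary_nodes, method="nearest")
    bc = np.column_stack([ux, uy])
    nearest = np.column_stack([nux, nuy])
    mask = np.isnan(bc)
    bc[mask] = nearest[mask]
    return bc

# Tiny synthetic example (placeholder values, not experimental data):
pts = np.random.rand(200, 2) * 50.0                              # DIC grid, micrometres
disp = np.column_stack([0.02 * pts[:, 0], 0.01 * pts[:, 1]])     # smooth displacement field
nodes = np.array([[0.0, 0.0], [50.0, 0.0], [50.0, 50.0], [0.0, 50.0]])
print(dic_displacements_to_bc(pts, disp, nodes))
```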
Procedia PDF Downloads 144
6800 Biomarkers for Rectal Adenocarcinoma Identified by Lipidomic and Bioinformatic
Authors: Patricia O. Carvalho, Marcia C. F. Messias, Laura Credidio, Carlos A. R. Martinez
Abstract:
A lipidomic strategy can provide important information regarding cancer pathogenesis mechanisms and could reveal new biomarkers to enable early diagnosis of rectal adenocarcinoma (RAC). This study set out to evaluate lipoperoxidation biomarkers and the lipidomic signature by gas chromatography (GC) and electrospray ionization-qToF-mass spectrometry (ESI-qToF-MS), combined with multivariate data analysis, in plasma from 23 RAC patients (early- or advanced-stage cancer) and 18 healthy controls. The most abundant ions identified in the RAC patients were those of phosphatidylcholine (PC) and phosphatidylethanolamine (PE), while those of lysophosphatidylcholine (LPC), identified as LPC (16:1), LPC (18:1), and LPC (18:2), were down-regulated. The LPC plasmalogen containing palmitoleic acid (LPC (P-16:1)), which had the highest VIP score, tended to be lower in the cancer patients. Malondialdehyde (MDA) plasma levels were higher in patients with advanced cancer (stages III/IV) than in the early-stage groups and the healthy group (p<0.05). No differences in F2-isoprostane levels were observed between these groups. This study shows that the reduction in plasma levels of LPC plasmalogens associated with an increase in MDA levels may indicate increased oxidative stress in these patients, and it identifies the metabolite LPC (P-16:1) as a new biomarker for RAC.
Keywords: biomarkers, lipidomic, plasmalogen, rectal adenocarcinoma
Procedia PDF Downloads 228
6799 Accounting for Cryptocurrency: Urgent Need for an Accounting Standard
Authors: Fatima Ali Abbass, Hassan Ibrahim Rkein
Abstract:
The number of entities worldwide that currently accept digital currency as payment is increasing; however, digital currency is still not widely accepted as a medium of exchange, nor does it represent legal tender. At the same time, this means that accounting for cryptocurrency as cash (currency) is not possible under IAS 7 and IAS 32, and cryptocurrency also cannot be accounted for as a financial asset at fair value through profit or loss under IFRS 9. Therefore, this paper studies the possible means of accounting for cryptocurrency, since, as of today, there is not yet an accounting standard that deals with it. The request for a specific accounting standard is increasingly coming from top accounting firms and from professional accounting bodies. This study uses a mixture of qualitative and quantitative analysis in its quest to explore the best possible way to account for cryptocurrency. Interviews and surveys were conducted targeting accounting professionals. This study highlighted the deficiencies in the current way of accounting for cryptocurrency as an intangible asset with an indefinite life. The deficiency is well illustrated by the fact that the asset is then subject to impairment, whereby under GAAP only depreciation in the value of the intangible asset is recognized, while appreciation in the value of the asset is ignored; this prohibits the reporting entity from showing the true value of the cryptocurrency asset. This research highlights the gap that arises from using accounting standards that are not specific to cryptocurrency and confirms that there is an urgent need to call upon the accounting standard setters (IASB and FASB) to issue accounting standards specifically for cryptocurrency.
Keywords: cryptocurrency, accounting, IFRS, GAAP, classification, measurement
Procedia PDF Downloads 95
6798 Exploring Tweet Geolocation: Leveraging Large Language Models for Post-Hoc Explanations
Authors: Sarra Hasni, Sami Faiz
Abstract:
In recent years, location prediction on social networks has gained significant attention, with short and unstructured texts like tweets posing additional challenges. Advanced geolocation models have been proposed, increasing the need to explain their predictions. In this paper, we provide explanations for a geolocation black-box model using LIME and SHAP, two state-of-the-art XAI (eXplainable Artificial Intelligence) methods. We extend our evaluations to Large Language Models (LLMs) as post-hoc explainers for tweet geolocation. Our preliminary results show that LLMs outperform LIME and SHAP by generating more accurate explanations. Additionally, we demonstrate that prompts with examples and meta-prompts containing phonetic spelling rules improve the interpretability of these models, even with informal input data. This approach highlights the potential of advanced prompt engineering techniques to enhance the effectiveness of black-box models in geolocation tasks on social networks.
Keywords: large language model, post hoc explainer, prompt engineering, local explanation, tweet geolocation
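The abstract does not reproduce its prompts; purely as an illustration of the few-shot prompting idea it describes, a sketch in Python might look like the following. The example tweets, the predicted labels, the phonetic-spelling instruction, and all wording are invented for illustration and are not the authors' prompts.

```python
def build_explanation_prompt(tweet, predicted_city, examples):
    """Assemble a few-shot prompt asking an LLM to explain, post hoc, which
    tokens support a black-box geolocation prediction (wording is illustrative)."""
    shots = "\n\n".join(
        f"Tweet: {t}\nPredicted location: {loc}\nExplanation: {expl}"
        for t, loc, expl in examples
    )
    meta = ("You explain geolocation predictions. Informal spellings may encode "
            "local pronunciation; treat phonetic variants of place names as evidence.")
    return (f"{meta}\n\n{shots}\n\n"
            f"Tweet: {tweet}\nPredicted location: {predicted_city}\nExplanation:")

# Invented example data for illustration only:
examples = [("goin dahntahn for the game tonight", "Pittsburgh",
             "'dahntahn' is a phonetic spelling of 'downtown' typical of Pittsburgh speech.")]
print(build_explanation_prompt("stuck on the T again smh", "Boston", examples))
```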
Procedia PDF Downloads 24
6797 Exploring the Biocompatibility and Performance of Metals and Ceramics as Biomaterials, A Comprehensive Study for Advanced Medical Applications
Authors: Ala Abobakr Abdulhafidh Al-Dubai
Abstract:
Biomaterials, specifically metals and ceramics, are indispensable components in medical science, shaping the landscape of implantology and prosthetics. This study delves into the intricate interplay between these materials and biological systems, aiming to scrutinize their suitability, performance, and biocompatibility. A multi-faceted range of methodologies was employed to comprehensively characterize these biomaterials. Advanced material characterization techniques were paramount in this research, with scanning electron microscopy providing intricate insights into surface morphology and X-ray diffraction unraveling the crystalline structures. These analyses were complemented by in vitro assessments, which gauged the biological response of cells to the metals and ceramics, shedding light on their potential applications within the human body. A key facet of our investigation was a comparative study evaluating the corrosion resistance and osseointegration potential of both metals and ceramics. Through a series of experiments, we sought to understand how these biomaterials interact with physiological environments, paving the way for informed decisions in medical applications.
Keywords: metals, ceramics, biomaterials, biocompatibility, osseointegration
Procedia PDF Downloads 67
6796 Effects of Air Pollution on Dew Water: A Case Study of Ado-Ekiti, Nigeria
Authors: M. Sanmi Awopetu, Olugbenga Aribisala, Olabisi O. Ologuntoye, S. Olumuyi Akindele
Abstract:
Human existence and its environment are increasingly threatened by air pollution, occasioned mainly by human activities coupled with natural ones. The Earth is getting warmer, the ozone layer is being depleted, and acid rain is being experienced, all as a result of air pollution. This study seeks to investigate the effect of air pollution on dew water. Thirty-one (31) samples of dew water were collected at four locations in Ado-Ekiti, Ekiti State, Nigeria. Analytical studies of the dew water samples were carried out to determine the pH, Total Dissolved Solids (TDS), and Electrical Conductivity (EC) in order to determine whether the dew water is polluted or not. There is no documented world standard for dew water quality; however, the standards for normal rainwater (pH between 5.0 and 5.6) and acid rain (pH between 4.0 and 4.4) were adopted for this study. The pH of the dew water samples collected and analyzed in Olokun, Ado-Ekiti ranged between 5.5 and 7.9, while the other samples fell within this range. In the Government Reserved Area (GRA), Ajilosun, and the EKSU school area, the pH ranged between 6.4 and 7.9, while the EC fell between 0.0 and 0.9 mS/cm, which shows that the observed zones are polluted. Everyone has a role to play in reducing the pollutants released into the atmosphere, and there is a need to develop an international standard for dew water quality.
Keywords: dew, air pollution, total dissolved solids, electrical conductivity, Ado-Ekiti
Procedia PDF Downloads 192