Search results for: mean bias error
1313 A Novel Image Steganography Method Based on Mandelbrot Fractal
Authors: Adnan H. M. Al-Helali, Hamza A. Ali
Abstract:
With the growth of censorship and pervasive monitoring on the Internet, steganography arises as a new means of achieving secret communication. Steganography is the art and science of embedding information within electronic media used by common applications and systems. Generally, hiding multimedia information within images changes some of their properties, which may introduce slight degradation or unusual characteristics. This paper presents a new image steganography approach for hiding multimedia information (images, text, and audio) using a generated Mandelbrot fractal image as a cover. The proposed technique has been extensively tested with different images. The results show that the method is a very secure means of hiding and retrieving steganographic information. Experimental results demonstrate an effective improvement in the values of the Peak Signal to Noise Ratio (PSNR), Mean Square Error (MSE), Normalized Cross Correlation (NCC), and Image Fidelity (IF) over previous techniques.
Keywords: fractal image, information hiding, Mandelbrot set fractal, steganography
Procedia PDF Downloads 618
1312 Reliability Analysis of Geometric Performance of Onboard Satellite Sensors: A Study on Location Accuracy
Authors: Ch. Sridevi, A. Chalapathi Rao, P. Srinivasulu
Abstract:
The location accuracy of data products is a critical parameter in assessing the geometric performance of satellite sensors. This study focuses on the reliability analysis of onboard sensors to evaluate their performance in terms of location accuracy over time. The analysis utilizes field failure data and employs the Weibull distribution to determine the reliability and, in turn, to understand the improvements or degradations over a period of time. The analysis begins by scrutinizing the location accuracy error, which is the root mean square (RMS) error of the differences between ground control point coordinates observed on the product and on the map, and identifying the failure data with reference to time. A significant challenge in this study is to thoroughly analyze the possibility of an infant mortality phase in the data. To address this, the Weibull distribution is utilized to determine whether the data exhibit an infant stage or have transitioned into the operational phase; the shape parameter beta plays a crucial role in identifying this stage. Determining the exact start of the operational phase and the end of the infant stage poses another challenge, as it is crucial to eliminate residual infant mortality or wear-out from the model, since either can significantly increase the total failure rate. To address this, an approach utilizing the well-established statistical Laplace test is applied to infer the behavior of the sensors and to accurately ascertain the duration of the different phases in the lifetime and the time required for stabilization. This approach also helps in understanding whether the bathtub curve model, which accounts for the different phases in the lifetime of a product, is appropriate for the data, and whether the thresholds for the infant period and wear-out phase are accurately estimated, by validating the data in the individual phases with Weibull distribution curve-fitting analysis.
Once the operational phase is determined, reliability is assessed using Weibull analysis. This analysis not only provides insights into the reliability of individual sensors with regard to location accuracy over the required period of time but also establishes a model that can be applied to automate similar analyses for various sensors and parameters using field failure data. Furthermore, the identification of the best-performing sensor through this analysis serves as a benchmark for future missions and designs, ensuring continuous improvement in sensor performance and reliability. Overall, this study provides a methodology to accurately determine the duration of the different phases in the life data of individual sensors. It enables an assessment of the time required for stabilization and provides insights into the reliability during the operational phase and the commencement of the wear-out phase. By employing this methodology, designers can make informed decisions regarding sensor performance with regard to location accuracy, contributing to enhanced accuracy in satellite-based applications.
Keywords: bathtub curve, geometric performance, Laplace test, location accuracy, reliability analysis, Weibull analysis
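The shape-parameter reasoning above can be sketched numerically. The snippet below fits a Weibull distribution to synthetic failure times via a linearized probability plot; the data, parameter values, and function names are illustrative assumptions, not the study's sensor records:

```python
import numpy as np

def weibull_plot_fit(failure_times):
    """Estimate Weibull shape (beta) and scale (eta) from failure times via a
    linearized probability plot: ln(-ln(1-F)) = beta*ln(t) - beta*ln(eta)."""
    t = np.sort(np.asarray(failure_times, dtype=float))
    n = len(t)
    ranks = np.arange(1, n + 1)
    F = (ranks - 0.3) / (n + 0.4)            # Bernard's median-rank estimate
    x = np.log(t)
    y = np.log(-np.log(1.0 - F))
    beta, intercept = np.polyfit(x, y, 1)    # slope is the shape parameter
    eta = np.exp(-intercept / beta)
    return beta, eta

# Synthetic failure data standing in for the field failure records.
rng = np.random.default_rng(0)
sample = 120.0 * rng.weibull(2.5, size=1000)
beta_hat, eta_hat = weibull_plot_fit(sample)
```

A fitted beta below 1 would point to residual infant mortality, beta near 1 to a constant failure rate, and beta above 1 to wear-out, which is the interpretation the bathtub-curve reasoning above relies on.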
1311 Understanding and Improving Neural Network Weight Initialization
Authors: Diego Aguirre, Olac Fuentes
Abstract:
In this paper, we present a taxonomy of weight initialization schemes used in deep learning. We survey the most representative techniques in each class and compare them in terms of overhead cost, convergence rate, and applicability. We also introduce a new weight initialization scheme. In this technique, we perform an initial feedforward pass through the network using an initialization mini-batch. Using statistics obtained from this pass, we initialize the weights of the network so that the following properties are met: 1) weight matrices are orthogonal; 2) ReLU layers produce a predetermined number of non-zero activations; 3) the output produced by each internal layer has unit variance; 4) weights in the last layer are chosen to minimize the error on the initial mini-batch. We evaluate our method on three popular architectures, and faster convergence rates are achieved on the MNIST, CIFAR-10/100, and ImageNet datasets when compared to state-of-the-art initialization techniques.
Keywords: deep learning, image classification, supervised learning, weight initialization
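A minimal sketch of a data-dependent initialization in this spirit is shown below: orthogonal weights are rescaled on an initialization mini-batch until each layer's pre-activation output has near-unit variance (similar in flavor to LSUV-style schemes). The layer sizes, batch shape, and function names are assumptions, not the authors' exact procedure:

```python
import numpy as np

def data_dependent_init(layer_sizes, x_batch, rng, tol=0.05, max_iter=10):
    """Sketch: orthogonal weights (requires n_in >= n_out) rescaled on an
    initialization mini-batch toward unit pre-activation variance."""
    weights = []
    h = x_batch
    for n_in, n_out in zip(layer_sizes[:-1], layer_sizes[1:]):
        w, _ = np.linalg.qr(rng.standard_normal((n_in, n_out)))  # orthogonal columns
        for _ in range(max_iter):
            v = (h @ w).var()
            if abs(v - 1.0) < tol:
                break
            w /= np.sqrt(v)              # rescaling sets the batch variance to 1
        weights.append(w)
        h = np.maximum(h @ w, 0.0)       # ReLU activations feed the next layer
    return weights

rng = np.random.default_rng(1)
x = rng.standard_normal((256, 32))       # initialization mini-batch (assumed shape)
ws = data_dependent_init([32, 32, 16], x, rng)
```

Because rescaling a linear layer by 1/sqrt(v) scales its output variance by exactly 1/v, the inner loop converges in at most two passes per layer on the same mini-batch.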
1310 Re-Evaluation of Field X Located in Northern Lake Albert Basin to Refine the Structural Interpretation
Authors: Calorine Twebaze, Jesca Balinga
Abstract:
Field X is located on the eastern shores of Lake Albert, Uganda, on the rift flank, where the gross sedimentary fill is typically less than 2,000 m. The field was discovered in 2006 and encountered about 20.4 m of net pay across three (3) stratigraphic intervals within the discovery well. The field covers an area of 3 km², with the structural configuration comprising a 3-way dip-closed hanging wall anticline that seals against the basement to the southeast along the bounding fault. Field X had been mapped on reprocessed 3D seismic data, which was originally acquired in 2007 and reprocessed in 2013. The seismic data quality is good across the field, and the reprocessing work reduced the uncertainty in the location of the bounding fault and enhanced the lateral continuity of reservoir reflectors. The current study was a re-evaluation of Field X to refine the fault interpretation and understand the structural uncertainties associated with the field. The seismic data and three (3) well datasets were used during the study. The evaluation followed standard workflows using Petrel software and structural attribute analysis, spanning seismic-to-well ties, structural interpretation, and structural uncertainty analysis. Analysis of the well ties generated for the 3 wells provided a geophysical interpretation that was consistent with the geological picks. The generated time-depth curves showed a general increase in velocity with burial depth; however, the separation in curve trends observed below 1,100 m was mainly attributed to minimal lateral variation in velocity between the wells. In addition to attribute analysis, three velocity modeling approaches were evaluated: the time-depth curve, V0 + kZ, and average velocity methods. The generated models were calibrated at well locations using well tops to obtain the best velocity model for Field X.
The time-depth method resulted in more reliable depth surfaces, with good structural coherence between the TWT and depth maps and minimal error at well locations of 2 to 5 m. Both the NNE-SSW rift border fault and the minor faults in the existing interpretation were re-evaluated. The new interpretation, however, delineated an E-W trending fault in the northern part of the field that had not been interpreted before. The fault was interpreted at all stratigraphic levels; it thus propagates from the basement to the surface and is an active fault today. It was also noted that the field as a whole is only lightly faulted, with more faults in its deeper part. The major structural uncertainties defined included: 1) the time horizons, owing to reduced data quality, especially in the deeper parts of the structure, where an error equal to one-third of the reflection time thickness was assumed; 2) check-shot analysis showed varying velocities within the wells and thus varying depth values for each well; and 3) the very few average velocity points available from the limited wells produced a pessimistic average velocity model.
Keywords: 3D seismic data interpretation, structural uncertainties, attribute analysis, velocity modelling approaches
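The V0 + kZ approach mentioned above has a closed-form depth conversion that can be sketched as follows; the velocity parameters here are placeholders, not the Field X calibration values:

```python
import math

def depth_from_twt(twt_s, v0, k):
    """Depth (m) from two-way time (s) under a linear velocity model
    v(z) = v0 + k*z, with v0 in m/s and k in 1/s (placeholder values)."""
    t_one_way = twt_s / 2.0
    return (v0 / k) * (math.exp(k * t_one_way) - 1.0)

def twt_from_depth(z, v0, k):
    """Inverse relation, useful when calibrating against well tops."""
    return 2.0 * math.log(1.0 + k * z / v0) / k

z = depth_from_twt(1.2, v0=1800.0, k=0.6)   # depth of a 1.2 s TWT horizon
```

Calibration in practice amounts to choosing v0 and k so that `depth_from_twt` reproduces the well-top depths within the 2 to 5 m error quoted above.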
1309 An Approach for Modeling CMOS Gates
Authors: Spyridon Nikolaidis
Abstract:
A modeling approach for CMOS gates is presented based on the use of the equivalent inverter. A new model for the inverter has been developed using a simplified transistor current model which incorporates nanoscale effects for planar technology. Parametric expressions for the output voltage are provided, as well as the values of the output and supply current, to be compatible with the CCS technology. The model is parametric according to the input signal slew, output load, transistor widths, supply voltage, temperature, and process. The transistor widths of the equivalent inverter are determined by HSPICE simulations, and parametric expressions are developed for them using a fitting procedure. Results for the NAND gate show that the proposed approach offers sufficient accuracy, with an average error in propagation delay of about 5%.
Keywords: CMOS gate modeling, inverter modeling, transistor current mode, timing model
1308 Sentiment Analysis of Consumers’ Perceptions on Social Media about the Main Mobile Providers in Jamaica
Authors: Sherrene Bogle, Verlia Bogle, Tyrone Anderson
Abstract:
In recent years, organizations have become increasingly interested in the possibility of analyzing social media as a means of gaining meaningful feedback about their products and services. The aspect-based sentiment analysis approach is used to predict the sentiment of Twitter datasets for Digicel and Lime, the main mobile companies in Jamaica, using supervised learning classification techniques. The results indicate an average of 82.2 percent accuracy in classifying tweets when comparing three separate classification algorithms against the purported baseline of 70 percent, and an average root mean squared error of 0.31. These results indicate that the analysis of sentiment on social media in order to gain customer feedback can be a viable solution for mobile companies looking to improve business performance.
Keywords: machine learning, sentiment analysis, social media, supervised learning
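As a rough illustration of supervised sentiment classification on short texts (a deliberately simplified stand-in for the aspect-based pipeline and the three classifiers used in the study), a tiny multinomial Naive Bayes can be sketched; the tweets, labels, and function names below are invented:

```python
from collections import Counter, defaultdict
import math

def train_nb(samples):
    """Count-based multinomial Naive Bayes training on (text, label) pairs."""
    class_counts = Counter()
    word_counts = defaultdict(Counter)
    vocab = set()
    for text, label in samples:
        class_counts[label] += 1
        for w in text.lower().split():
            word_counts[label][w] += 1
            vocab.add(w)
    return class_counts, word_counts, vocab

def predict_nb(model, text):
    """Pick the label with the highest log posterior, with Laplace smoothing."""
    class_counts, word_counts, vocab = model
    total = sum(class_counts.values())
    best, best_lp = None, -math.inf
    for label in class_counts:
        lp = math.log(class_counts[label] / total)
        denom = sum(word_counts[label].values()) + len(vocab)
        for w in text.lower().split():
            lp += math.log((word_counts[label][w] + 1) / denom)
        if lp > best_lp:
            best, best_lp = label, lp
    return best

tweets = [("great service love it", "pos"),
          ("signal dropped again terrible", "neg"),
          ("love the new data plan", "pos"),
          ("terrible customer service", "neg")]
model = train_nb(tweets)
```

Accuracy and RMSE figures like those reported above come from comparing such predictions against held-out human labels.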
1307 Uncertainties and Resilience: A Study of Pandemic Impact on the Pastoral-Nomadic Communities in India
Authors: Arati S. Kade, Iftikhar Hussain, Somnath Dadas
Abstract:
The paper studies resilience and uncertainties among nomadic-pastoral communities in India during large events such as pandemics and attempts to understand how, amid changing times and increased uncertainty, nomadic communities have historically shown their resilience. A review of the literature was performed concerning nomadism and development relations and conflicts, focusing on structural violence against nomadic communities from caste, class, and patriarchy as a framework, along with the role of the state. Philosophical views on the anti-nomad bias of political theories by Erik Ringmar, along with the decolonial approach of Linda Smith and the debrahmanization of Braj Ranjan Mani, were used to analyze the criminalization of nomads. Data were collected using in-depth telephone interviews and news reports published during the COVID-19 lockdown in India. Focusing on the historical context of the current crises, the paper leads to a discussion of how nomadic communities negotiated with sedentary society during the COVID-19 pandemic. The findings of the paper support the hypothesis that the COVID-19 pandemic, followed by lockdown, deeply impacted the pastoral production system, building on the continued cycle of marginalization by the state and caste society in India, while traditional knowledge stood the test of time. Be it developmental states or pandemics, nomadic communities have shown their resilience in a number of ways, such as keeping distance from sedentary society, using traditional medicine, and relying on traditional leadership.
Keywords: COVID-19, criminalization, India, nomadism, pandemic, pastoralism, resilience, traditional knowledge
1306 Measuring How Brightness Mediates Auditory Salience
Authors: Baptiste Bouvier
Abstract:
While we are constantly flooded with stimuli in daily life, attention allows us to select the ones we specifically process and ignore the others. Some salient stimuli may sometimes pass this filter independently of our will, in a "bottom-up" way. The role of the acoustic properties of the timbre of a sound on its salience, i.e., its ability to capture the attention of a listener, is still not well understood. We implemented a paradigm called the "additional singleton paradigm", in which participants have to discriminate targets according to their duration. This task is perturbed (higher error rates and longer response times) by the presence of an irrelevant additional sound, of which we can manipulate a feature of our choice at equal loudness. This allows us to highlight the influence of the timbre features of a sound stimulus on its salience at equal loudness. We have shown that a stimulus that is brighter than the others but not louder leads to an attentional capture phenomenon in this framework. This work opens the door to the study of the influence of any timbre feature on salience.
Keywords: attention, audition, bottom-up attention, psychoacoustics, salience, timbre
1305 Disrupting Microaggressions in the Academic Workplace: The Role of Bystanders
Authors: Tugba Metinyurt
Abstract:
Microaggressions are small, everyday verbal and behavioral slights that communicate derogatory messages to individuals on the basis of their group membership. They are often unintentional and not intended to do harm, and yet research has shown that their cumulative effect can be quite detrimental. The current pilot study focuses on the role of bystanders in disrupting gender microaggressions and on the potential barriers to challenging them in the academic workplace at the University of Massachusetts Lowell (UML). The participants in this study included 9 male and 20 female faculty members from different disciplines at UML. A Barriers to Intervening Questionnaire asks respondents 1) to rate barriers to intervening in situations described in three short vignettes and 2) to identify, through a response to an open-ended question, more general factors that make it more or less likely that UML faculty will intervene in microaggressions as bystanders. Responses to the questionnaire scales that ask about respondents’ own reactions to the vignettes indicated that faculty may hesitate to interrupt gender microaggressions to avoid being perceived as offensive, losing their relationships with their coworkers, and engaging in possible arguments. Responses to the open-ended question, which asked more generally about perceived barriers, revealed a few additional barriers: lack of interpersonal and institutional support, repercussions to self, personal orientation/personality, and privilege. Interestingly, participants tended to describe the obstacles presented in the questionnaire as unlikely to prevent them from intervening, yet the same barriers were suggested to be issues for others in the open-ended responses. Limitations and future directions are discussed. The barriers identified in this research can inform efforts to create bystander trainings that interrupt microaggressions in academic workplaces.
Keywords: academic workplace, bystander behavior, implicit bias, microaggressions
1304 Effect of Clinical Depression on Automatic Speaker Verification
Authors: Sheeraz Memon, Namunu C. Maddage, Margaret Lech, Nicholas Allen
Abstract:
The effect of a clinical environment on the accuracy of speaker verification was tested. The speaker verification tests were performed within homogeneous environments containing clinically depressed speakers only or non-depressed speakers only, as well as within mixed environments containing different mixtures of both clinically depressed and non-depressed speakers. The speaker verification framework used MFCC features and the GMM modeling and classification method. The speaker verification experiments within homogeneous environments showed a 5.1% increase in the equal error rate (EER) within the clinically depressed environment when compared to the non-depressed environment, indicating that clinical depression increases intra-speaker variability and makes the speaker verification task more challenging. Experiments with mixed environments indicated that increasing the percentage of depressed individuals within a mixed environment increases the speaker verification equal error rate.
Keywords: speaker verification, GMM, EM, clinical environment, clinical depression
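The equal error rate used above can be computed from genuine and impostor score distributions. The sketch below uses synthetic Gaussian scores rather than MFCC/GMM outputs, so the distributions and numbers are illustrative only:

```python
import numpy as np

def equal_error_rate(genuine, impostor):
    """EER sketch: sweep a decision threshold over all observed scores and
    find where false-accept and false-reject rates cross."""
    thresholds = np.sort(np.concatenate([genuine, impostor]))
    best_gap, eer = np.inf, None
    for thr in thresholds:
        far = np.mean(impostor >= thr)   # impostors wrongly accepted
        frr = np.mean(genuine < thr)     # genuine trials wrongly rejected
        if abs(far - frr) < best_gap:
            best_gap, eer = abs(far - frr), (far + frr) / 2.0
    return eer

rng = np.random.default_rng(2)
genuine = rng.normal(2.0, 1.0, 500)      # higher scores for true speakers
impostor = rng.normal(0.0, 1.0, 500)
eer = equal_error_rate(genuine, impostor)
```

A depressed-speech condition that widens the genuine-score distribution (more intra-speaker variability) pushes the two distributions together and raises the EER, which is the effect the experiments above quantify.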
1303 Influence of Chirp of High-Speed Laser Diodes and Fiber Dispersion on Performance of Non-Amplified 40-Gbps Optical Fiber Links
Authors: Ahmed Bakry, Moustafa Ahmed
Abstract:
We model and simulate the combined effect of fiber dispersion and frequency chirp of a directly modulated high-speed laser diode on the figures of merit of a non-amplified 40-Gbps optical fiber link. We consider both the return to zero (RZ) and non-return to zero (NRZ) patterns of the pseudorandom modulation bits. The performance of the fiber communication system is assessed by the fiber-length limitation due to the fiber dispersion. We study the influence of replacing standard single-mode fibers by non-zero dispersion-shifted fibers on the maximum fiber length and evaluate the associated power penalty. We introduce new dispersion tolerances for 1-dB power penalty of the RZ and NRZ 40-Gbps optical fiber links.
Keywords: bit error rate, dispersion, frequency chirp, fiber communications, semiconductor laser
1302 Numerical Modeling for Water Engineering and Obstacle Theory
Authors: Mounir Adal, Baalal Azeddine, Afifi Moulay Larbi
Abstract:
Numerical analysis is a branch of mathematics devoted to the development of iterative matrix calculation techniques. Its objective is to optimize operations so as to calculate and solve systems of equations of order n with savings in time and energy for the computers that are used to analyze big data by solving matrix equations. Furthermore, this scientific discipline produces results within a margin of approximation error, characterized by convergence rates. Thus, the results obtained from numerical analysis techniques run in software such as MATLAB or Simulink offer a preliminary diagnosis of the situation of the environment or space targets. With this, we can provide the technical procedures needed for engineering or scientific studies that are exploitable by water engineers.
Keywords: numerical analysis methods, obstacles solving, engineering, simulation, numerical modeling, iteration, computer, MATLAB, water, underground, velocity
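A classic example of the iterative matrix techniques described is Jacobi iteration for Ax = b; the system below is a toy diagonally dominant example rather than a hydraulic model:

```python
import numpy as np

def jacobi(A, b, tol=1e-10, max_iter=500):
    """Jacobi iteration for Ax = b; converges when A is diagonally dominant."""
    A = np.asarray(A, dtype=float)
    b = np.asarray(b, dtype=float)
    D = np.diag(A)                    # diagonal entries
    R = A - np.diagflat(D)            # off-diagonal remainder
    x = np.zeros_like(b)
    for _ in range(max_iter):
        x_new = (b - R @ x) / D       # one sweep of the iteration
        if np.linalg.norm(x_new - x, np.inf) < tol:
            return x_new
        x = x_new
    return x

A = [[10.0, 1.0, 2.0],
     [ 1.0, 8.0, 1.0],
     [ 2.0, 1.0, 9.0]]
b = [13.0, 10.0, 12.0]
x = jacobi(A, b)
```

The stopping tolerance is the "margin of approximation error" mentioned above: the iteration halts once successive iterates agree to within `tol`.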
1301 I Feel Pretty: Using Discretization to Unpack Gender Disparity in Musical Theatre - A Study of Leonard Bernstein’s West Side Story
Authors: Erin McKellar, Narelle Yeo
Abstract:
Gender disparity can be found in the representation of the female characters in Leonard Bernstein’s musical West Side Story. As a postmodern composer, Bernstein was open about his social activism, yet did not consider his compositional portrayal of female characters as part of that activism. Using discretization as an analysis tool, this thesis explores the melodic contours of male and female songs in West Side Story to show differences in complexity between male and female characterisation. The analysis explores the intervallic relationship between the vocal line and melodic color in relation to the accompaniment harmony, taking into consideration the use of consonance and dissonance. West Side Story is commonly known for its distinct use of the tritone motif and its inherent dissonance. It is evident when reviewing the findings of this study that there is a distinct disparity between male-led and female-led music. The male-led numbers consistently adhere to a dissonant aesthetic with the tritone motif implemented in all of the extracted songs. By contrast, the female songs remain consonant with simple intervallic movements. By examining the results of this study through the lens of Equality Feminism, this thesis finds that Bernstein has simplified the characterisations of the female leads. The thesis further proposes that without cognisant consideration of the compositional portrayal of women, the musical theatre will continue to reinforce gender stereotypes, as evident through this study of Bernstein’s West Side Story.
Keywords: music theatre, gender bias, composition, Leonard Bernstein
1300 Joint Discrete Hartley Transform-Clipping for Peak to Average Power Ratio Reduction in Orthogonal Frequency Division Multiplexing System
Authors: Selcuk Comlekci, Mohammed Aboajmaa
Abstract:
Orthogonal frequency division multiplexing (OFDM) is a promising technique for modern wireless communications systems due to its robustness against multipath environments. The high peak to average power ratio (PAPR) of the transmitted signal is one of the major drawbacks of an OFDM system: PAPR degrades the bit error rate (BER) performance and affects the linear characteristics of the high power amplifier (HPA). In this paper, we propose a DHT-Clipping reduction technique that reduces the high PAPR through the combination of the discrete Hartley transform (DHT) and clipping techniques. From the simulation results, we note that the DHT-Clipping technique offers better PAPR reduction than either DHT or clipping alone, and that it also introduces better BER performance than clipping.
Keywords: ISI, cyclic prefix, BER, PAPR, HPA, DHT, subcarrier
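The PAPR metric and the clipping operation can be sketched as follows. The subcarrier count, modulation, and clipping ratio are assumed values, and the DHT stage of the proposed combination is omitted for brevity:

```python
import numpy as np

def papr_db(x):
    """Peak-to-average power ratio of a complex baseband signal, in dB."""
    p = np.abs(x) ** 2
    return 10.0 * np.log10(p.max() / p.mean())

rng = np.random.default_rng(3)
n_sc = 64                                     # subcarriers (assumed value)
qam = (rng.choice([-1, 1], n_sc) + 1j * rng.choice([-1, 1], n_sc)) / np.sqrt(2)
ofdm = np.fft.ifft(qam) * np.sqrt(n_sc)       # time-domain OFDM symbol

# Amplitude clipping: limit the envelope to a chosen multiple of the RMS level.
clip_level = 1.4 * np.sqrt(np.mean(np.abs(ofdm) ** 2))
mag = np.abs(ofdm)
clipped = ofdm * np.minimum(1.0, clip_level / np.maximum(mag, 1e-12))
```

Clipping lowers the PAPR at the cost of in-band distortion, which is why the abstract pairs it with a DHT precoding stage to recover BER performance.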
1299 Dynamic Modeling of a Robot for Playing a Curved 3D Percussion Instrument Utilizing a Finite Element Method
Authors: Prakash Persad, Kelvin Loutan, Trichelle Seepersad
Abstract:
The finite element method is commonly used in the analysis of flexible manipulators to predict elastic displacements and to develop joint control schemes for reducing positioning error. In order to preserve simplicity, regular geometries and ideal joints and connections are assumed. This paper presents the dynamic FE analysis of a 4-degree-of-freedom open-chain manipulator intended for striking a percussion musical instrument with a curved 3D surface. This was done utilizing the new Multibody Dynamics Module in COMSOL, which is capable of modeling the elastic behavior of a body undergoing rigid-body-type motion.
Keywords: dynamic modeling, entertainment robots, finite element method, flexible robot manipulators, multibody dynamics, musical robots
1298 Fast Short-Term Electrical Load Forecasting under High Meteorological Variability with a Multiple Equation Time Series Approach
Authors: Charline David, Alexandre Blondin Massé, Arnaud Zinflou
Abstract:
In 2016, Clements, Hurn, and Li proposed a multiple equation time series approach for short-term load forecasting, reporting an average mean absolute percentage error (MAPE) of 1.36% on an 11-year dataset for the Queensland region in Australia. We present an adaptation of their model to the electrical power load consumption of the whole Quebec province in Canada. More precisely, we take into account two additional meteorological variables, cloudiness and wind speed, on top of temperature, as well as multiple meteorological measurements taken at different locations on the territory. We also consider other minor improvements. Our final model shows an average MAPE score of 1.79% over an 8-year dataset.
Keywords: short-term load forecasting, special days, time series, multiple equations, parallelization, clustering
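The MAPE score reported above is straightforward to compute; the load values below are invented for illustration and are not Quebec or Queensland data:

```python
def mape(actual, forecast):
    """Mean absolute percentage error, the score reported in the abstract."""
    return 100.0 * sum(abs((a - f) / a) for a, f in zip(actual, forecast)) / len(actual)

load_mw = [18200.0, 17650.0, 19400.0, 21050.0]    # illustrative hourly demand (MW)
predicted = [18000.0, 17900.0, 19100.0, 21500.0]
score = mape(load_mw, predicted)
```

Because each error is normalized by the actual load, MAPE lets forecasts for regions of very different size, such as Queensland and Quebec, be compared on the same scale.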
1297 Three-Dimensional Jet Refraction Simulation Using a Gradient Term Suppression and Filtering Method
Authors: Lican Wang, Rongqian Chen, Yancheng You, Ruofan Qiu
Abstract:
In jet engine, open-jet wind tunnel, and airframe applications, there widely exists a shear layer formed by the velocity and temperature gradients between the jet flow and the surrounding medium. The presence of the shear layer refracts and reflects the sound path, which consequently influences measurement results in the far field. To investigate and evaluate the shear layer effect, a gradient term suppression and filtering method is adopted to simulate sound propagation through a steady sheared flow in three dimensions. Two typical configurations are considered: one is an incompressible, cold jet flow in a wind tunnel, and the other is a compressible, hot jet flow in a turbofan engine. A linear microphone array is used numerically to localize the position of a given sound source. The localization error is presented and linearly fitted.
Keywords: aeroacoustic, linearized Euler equation, acoustic propagation, source localization
1296 Performance Evaluation of One and Two Dimensional Prime Codes for Optical Code Division Multiple Access Systems
Authors: Gurjit Kaur, Neena Gupta
Abstract:
In this paper, we have analyzed and compared the performance of various coding schemes. The basic 1D prime sequence codes are unique in only one dimension, i.e., time slots, whereas 2D coding techniques are unique not only in their time slots but also in their wavelengths. In this research, we have evaluated and compared the performance of 1D and 2D coding techniques constructed using the prime sequence coding pattern for an Optical Code Division Multiple Access (OCDMA) system on a single platform. The analysis shows that 2D prime codes support a smaller number of active users than 1D codes, but they have a larger code family and are the most secure codes compared to other codes. The performance of all these codes is analyzed on the basis of the number of active users supported at a Bit Error Rate (BER) of 10⁻⁹.
Keywords: CDMA, OCDMA, BER, OOC, PC, EPC, MPC, 2-D PC/PC, λc, λa
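A 1D prime sequence code family for a small prime can be sketched as follows. The construction used here is the standard prime-code recipe (a pulse in slot j*p + (i*j mod p) for each codeword i), which is an assumption about the exact variant evaluated in the paper:

```python
def prime_codes(p):
    """1D prime-sequence code sketch for a prime p: p codewords of length
    p**2 and weight p, with a '1' placed in slot j*p + (i*j mod p)."""
    codes = []
    for i in range(p):
        word = [0] * (p * p)
        for j in range(p):
            word[j * p + (i * j) % p] = 1
        codes.append(word)
    return codes

def cross_correlation(a, b):
    """Peak periodic correlation between two binary codewords."""
    n = len(a)
    return max(sum(a[k] * b[(k + s) % n] for k in range(n)) for s in range(n))

codes = prime_codes(5)
```

The family has p codewords with autocorrelation peak p and pairwise cross-correlation bounded by 2, which is the low-interference property that limits how many active users share the channel at a given BER.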
1295 An Estimating Equation for Survival Data with a Possibly Time-Varying Covariates under a Semiparametric Transformation Models
Authors: Yemane Hailu Fissuh, Zhongzhan Zhang
Abstract:
An estimating equation technique is an alternative to the widely used maximum likelihood methods, one which eases some of the complexity arising from the complex characteristics of time-varying covariates. When both time-varying covariates and left-truncation are considered in the model, maximum likelihood estimation procedures become much more burdensome and complex. To ease this complexity, in this study, modified estimating equations, which have received considerable attention from many researchers under the semiparametric transformation model, are proposed. The purpose of this article was to develop the modified estimating equation under a flexible and general class of semiparametric transformation models for left-truncated and right-censored survival data with time-varying covariates. Besides the commonly applied Cox proportional hazards model, such problems can also be analyzed with a general class of semiparametric transformation models to estimate the effect of treatment, given possibly time-varying covariates, on the survival time. The consistency and asymptotic properties of the estimators were derived via the expectation-maximization (EM) algorithm. The finite-sample performance of the estimators for the proposed model was illustrated via simulation studies and the Stanford heart transplant real data example. To sum up the study, the bias for covariates has been adjusted by estimating the density function for the truncation time variable. Then the effect of possibly time-varying covariates was evaluated in some special semiparametric transformation models.
Keywords: EM algorithm, estimating equation, semiparametric transformation models, time-to-event outcomes, time-varying covariate
1294 Comparative Analysis of Universal Filtered Multi Carrier and Filtered Orthogonal Frequency Division Multiplexing Systems for Wireless Communications
Authors: Raja Rajeswari K
Abstract:
Orthogonal Frequency Division Multiplexing (OFDM) is a multi-carrier transmission technique that has been used in implementing the majority of wireless applications, such as wireless network protocol standards (like IEEE 802.11a and IEEE 802.11n), telecommunications (like LTE and LTE-Advanced), and digital audio and video broadcast standards. In the latest research and development in the area of orthogonal frequency division multiplexing, Universal Filtered Multi-Carrier (UFMC) and Filtered OFDM (F-OFDM) have attracted much attention for wideband wireless communications. In this paper, UFMC and F-OFDM systems are implemented, and a comparative analysis is carried out in terms of M-ary QAM modulation schemes over a Dolph-Chebyshev filter and a rectangular window filter; the Bit Error Rate (BER) is estimated over a Rayleigh fading channel.
Keywords: UFMC, F-OFDM, BER, M-ary QAM
1293 Compliance of Systematic Reviews in Plastic Surgery with the PRISMA Statement: A Systematic Review
Authors: Seon-Young Lee, Harkiran Sagoo, Katherine Whitehurst, Georgina Wellstead, Alexander Fowler, Riaz Agha, Dennis Orgill
Abstract:
Introduction: Systematic reviews attempt to answer research questions by synthesising the data within primary papers. They are an increasingly important tool within evidence-based medicine, guiding clinical practice, future research and healthcare policy. We sought to determine the reporting quality of recent systematic reviews in plastic surgery. Methods: This systematic review was conducted in line with the Cochrane handbook, reported in line with the PRISMA statement and registered at the ResearchRegistry (UIN: reviewregistry18). MEDLINE and EMBASE databases were searched in 2013 and 2014 for systematic reviews published by five major plastic surgery journals. Screening, identification and data extraction were performed independently by two teams. Results: From an initial set of 163 articles, 79 met the inclusion criteria. The median PRISMA score was 16 out of 27 items (59.3%; range 6-26, 95% CI 14-17). Compliance with individual PRISMA items showed high variability. It was poorest for the items related to the use of a review protocol (item 5; 5%) and the presentation of data on the risk of bias of each study (item 19; 18%), and highest for the description of the rationale (item 3; 99%), the sources of funding and other support (item 27; 95%), and the structured summary in the abstract (item 2; 95%). Conclusion: The reporting quality of systematic reviews in plastic surgery requires improvement. ‘Hard-wiring’ of compliance through journal submission systems, as well as improved education, awareness and a cohesive strategy among all stakeholders, is called for.
Keywords: PRISMA, reporting quality, plastic surgery, systematic review, meta-analysis
Procedia PDF Downloads 294
1292 Design of Speedy, Scanty Adder for Lossy Application Using QCA
Authors: T. Angeline Priyanka, R. Ganesan
Abstract:
Recent trends in microelectronics technology have gradually changed the strategies used in very large scale integration (VLSI) circuits. Complementary Metal Oxide Semiconductor (CMOS) technology has been the industry standard for implementing VLSI devices for the past two decades, but scale-down issues mean that ultra-low dimensions have not yet been achieved. This has paved the way for Quantum Cellular Automata (QCA), one of many alternative technologies proposed as a solution to the fundamental limits that CMOS technology will impose in the years to come. In this brief, a new adder that offers high speed of operation while occupying less area is presented. This adder is designed especially for error-tolerant applications. In the proposed adder, the overall area (cell count) and simulation time are reduced by 88 and 73 percent respectively. Various results of the proposed adder are shown and described.
Keywords: quantum cellular automata, carry look ahead adder, ripple carry adder, lossy application, majority gate, crossover
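QCA circuits are built from three-input majority gates and inverters, as the keywords note. A common logic-level construction (a sketch of the textbook QCA full adder, not the specific speedy/scanty design of this paper) computes carry-out as one majority and the sum from three more:

```python
def maj(a, b, c):
    """Three-input majority gate, the primitive logic element in QCA."""
    return (a & b) | (b & c) | (a & c)

def qca_full_adder(a, b, cin):
    """Full adder expressed with majority gates and inverters (logic level
    only, not a cell-level QCA layout). Cout = MAJ(A, B, Cin);
    Sum = MAJ(NOT Cout, MAJ(A, B, NOT Cin), Cin)."""
    cout = maj(a, b, cin)
    s = maj(1 - cout, maj(a, b, 1 - cin), cin)
    return s, cout

# Exhaustively verify the majority-gate adder against binary addition.
for a in (0, 1):
    for b in (0, 1):
        for cin in (0, 1):
            s, cout = qca_full_adder(a, b, cin)
            assert 2 * cout + s == a + b + cin
print("all 8 input combinations verified")
```

The exhaustive check is the point: with only three inputs, correctness of the majority-gate decomposition can be proven by enumeration.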
Procedia PDF Downloads 556
1291 Analysis of Moving Loads on Bridges Using Surrogate Models
Authors: Susmita Panda, Arnab Banerjee, Ajinkya Baxy, Bappaditya Manna
Abstract:
The design of short- to medium-span high-speed bridges in critical locations is an essential aspect of vehicle-bridge interaction. Due to the dynamic interaction between the moving load and the bridge, mathematical models or finite element computations become time-consuming. Thus, to reduce the computational effort, a universal approximator using an artificial neural network (ANN) has been used to evaluate the dynamic response of the bridge. Data set generation and training of the surrogate models were conducted using the results obtained from mathematical modeling. Further, the robustness of the surrogate model has been investigated, showing an error percentage of less than 10% compared with conventional methods. Additionally, the dependency of the dynamic response of the bridge on various load and bridge parameters has been highlighted through a parametric study.
Keywords: artificial neural network, mode superposition method, moving load analysis, surrogate models
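The surrogate workflow above (generate data from the expensive reference model, fit a cheap approximator, check its error against the reference) can be sketched as follows. A least-squares quadratic stands in for the ANN, and the toy response function is an invented placeholder, not the paper's mode-superposition model:

```python
import math

# Toy "mathematical model": bridge response vs. a load parameter v.
# Placeholder only, not the paper's vehicle-bridge interaction model.
def reference_response(v):
    return 1.0 + 0.3 * v + 0.05 * v * v + 0.01 * math.sin(v)

# 1. Generate training data from the expensive reference model.
vs = [0.1 * i for i in range(51)]            # v in [0, 5]
ys = [reference_response(v) for v in vs]

# 2. Fit a quadratic surrogate y ~ c0 + c1*v + c2*v^2 (normal equations,
#    solved by Gaussian elimination on the 3x3 augmented system).
S = [sum(v ** k for v in vs) for k in range(5)]
T = [sum(y * v ** k for v, y in zip(vs, ys)) for k in range(3)]
A = [[S[0], S[1], S[2], T[0]],
     [S[1], S[2], S[3], T[1]],
     [S[2], S[3], S[4], T[2]]]
for i in range(3):
    A[i] = [x / A[i][i] for x in A[i]]       # normalise pivot row
    for j in range(3):
        if j != i:
            A[j] = [xj - A[j][i] * xi for xj, xi in zip(A[j], A[i])]
c = [A[0][3], A[1][3], A[2][3]]

# 3. Check the surrogate against the reference model, as the study does.
max_rel_err = max(abs(c[0] + c[1] * v + c[2] * v * v - y) / y
                  for v, y in zip(vs, ys))
print("max relative error:", max_rel_err)
```

For this smooth toy response the surrogate error sits far below the 10% threshold reported in the abstract; a real ANN surrogate would follow the same three steps with a richer approximator.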
Procedia PDF Downloads 100
1290 Frenotomy for Tongue Tie: The Unlikely Benefit of Massage
Authors: Kailas Bhandarkar, Talib Dar, Laura Karia, Manasvi Upadhyaya
Abstract:
Introduction: Frenotomy for tongue tie is commonly performed in breastfed infants who experience difficulty in latching after failed conservative management for tongue tie. However, there is no consensus on the routine use of massage following frenotomy. Our aim was to assess the efficacy of massage in preventing recurrence following frenotomy. Methods: The tongue tie service in our tertiary referral hospital consists of 5 consultants and a breastfeeding (BF) midwife. 3 consultants routinely advise massage post procedure. Babies are assessed by the midwife after the procedure and at a follow-up consultation after a week. After due ethical approval, data were collected by two staff members who were independent of the tongue tie service on a standardized questionnaire to avoid bias. The Fisher exact test was employed (p < 0.05 considered significant). Results: Six hundred and thirty-two babies attended the clinic from January 2018 to December 2018. Thirty-three of these were excluded as the procedure was not needed. Parents were contacted at a median of six months post-procedure (range 2-10 months). 282/599 were advised massage. 92/282 could be contacted. 40/92 adhered to the massage regimen. None of these had a recurrence. 52/92 (54%), although advised, did not perform massage. Reasons cited for lack of adherence to massage included difficulty in performing massage, conflicting advice given by other health care professionals involved in patient care such as paediatricians and general practice, and a lack of information on the internet. Overall, 4/599 (0.66%) had recurrences, and this difference was not statistically significant. Conclusion: In our experience, the rate of recurrence after frenotomy is low enough for us to conclude that there is no significant benefit of massage after frenotomy for tongue tie. We can also conclude that among parents who were advised massage, more than half failed to adhere to the advice.
Keywords: tongue tie, frenotomy, massage, recurrence
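The Fisher exact test used in the abstract has a simple closed form for a 2x2 table via the hypergeometric distribution. A one-sided sketch follows; the recurrence counts in the no-massage group are hypothetical, since the abstract only reports 0/40 recurrences with massage and 4/599 overall:

```python
from math import comb

def fisher_exact_one_sided(a, b, c, d):
    """One-sided Fisher exact test for the 2x2 table
        [[a, b],   e.g. recurrence / no recurrence, massage group
         [c, d]]   recurrence / no recurrence, no-massage group
    Returns P(X <= a), the lower hypergeometric tail under the null."""
    n1 = a + b          # size of the first group
    k = a + c           # total events across both groups
    n = a + b + c + d   # total sample size
    return sum(comb(k, x) * comb(n - k, n1 - x) / comb(n, n1)
               for x in range(0, a + 1))

# Hypothetical counts loosely based on the abstract: 0/40 recurrences with
# massage; assume 2 recurrences among the 52 contacted non-adherent parents.
p = fisher_exact_one_sided(0, 40, 2, 50)
print("one-sided p =", round(p, 3))   # not significant at p < 0.05
```

With event counts this small, the non-significant result is unsurprising; even a perfect 0/40 record cannot distinguish the groups at p < 0.05, which matches the abstract's conclusion.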
Procedia PDF Downloads 135
1289 Robotic Assisted vs Traditional Laparoscopic Partial Nephrectomy Peri-Operative Outcomes: A Comparative Single Surgeon Study
Authors: Gerard Bray, Derek Mao, Arya Bahadori, Sachinka Ranasinghe
Abstract:
The EAU currently recommends partial nephrectomy as the preferred management for localised cT1 renal tumours, irrespective of surgical approach. With the advent of robotic assisted partial nephrectomy, there is growing evidence that warm ischaemia time may be reduced compared to the traditional laparoscopic approach. There are still no clear differences between the two approaches with regard to other peri-operative and oncological outcomes. A current limitation in the field is the lack of single-surgeon series comparing the two approaches, as existing studies often include multiple operators of different experience levels. To the best of our knowledge, this study is the first single-surgeon series comparing peri-operative outcomes of robotic assisted and laparoscopic partial nephrectomy. The current study aims to reduce intra-operator bias while maintaining an adequate sample size to assess the differences in outcomes between the two approaches. We retrospectively compared patient demographics, peri-operative outcomes, and renal function derangements of all partial nephrectomies undertaken by a single surgeon with experience in both laparoscopic and robotic surgery. Warm ischaemia time, length of stay, and acute renal function deterioration were all significantly reduced with robotic partial nephrectomy compared to the laparoscopic approach. This study highlights the benefits of robotic partial nephrectomy. Further prospective studies with larger sample sizes would be valuable additions to the current literature.
Keywords: partial nephrectomy, robotic assisted partial nephrectomy, warm ischaemia time, peri-operative outcomes
Procedia PDF Downloads 141
1288 Estimation of Fuel Cost Function Characteristics Using Cuckoo Search
Authors: M. R. Al-Rashidi, K. M. El-Naggar, M. F. Al-Hajri
Abstract:
The fuel cost function describes the electric power generation-cost relationship in thermal plants and hence sheds light on economic aspects of the power industry. Different models have been proposed to describe this relationship, with the quadratic function model being the most popular. Parameters of the second-order fuel cost function are estimated in this paper using the cuckoo search algorithm. It is a new population-based meta-heuristic optimization technique that has been used in this study primarily as an accurate estimation tool. Its main features are flexibility, simplicity, and effectiveness when compared to other estimation techniques. The parameter estimation problem is formulated as an optimization problem with the goal of minimizing the error associated with the estimated parameters. A case study is considered in this paper to illustrate the promising potential of cuckoo search as a valuable estimation and optimization technique.
Keywords: cuckoo search, parameters estimation, fuel cost function, economic dispatch
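The estimation set-up can be sketched as follows. The quadratic fuel cost is F(P) = a + b*P + c*P², and the objective is the sum of squared errors against measurements. For brevity the sketch uses a population-based random search with Gaussian perturbations standing in for the Lévy flights of a full cuckoo search; the coefficient values and data are invented for illustration, not taken from the paper's case study:

```python
import random

random.seed(1)

# Synthetic measurements from a "true" quadratic cost F(P) = a + b*P + c*P^2.
# Coefficients are illustrative, not the paper's case-study values.
A_TRUE, B_TRUE, C_TRUE = 500.0, 5.3, 0.004
powers = list(range(100, 501, 25))                       # MW
costs = [A_TRUE + B_TRUE * p + C_TRUE * p * p for p in powers]

def sse(theta):
    """Sum of squared errors of a candidate (a, b, c) against the data."""
    a, b, c = theta
    return sum((a + b * p + c * p * p - f) ** 2 for p, f in zip(powers, costs))

def random_nest():
    return [random.uniform(0, 1000), random.uniform(0, 10),
            random.uniform(0, 0.01)]

# Population of candidate solutions ("nests"); Gaussian steps replace Levy
# flights, and a fraction of the worst nests is occasionally abandoned.
nests = [random_nest() for _ in range(15)]
scales = (50.0, 0.5, 0.0005)
init_best = min(sse(n) for n in nests)
for _ in range(3000):
    base = random.choice(nests)
    cand = [x + random.gauss(0, s) for x, s in zip(base, scales)]
    j = random.randrange(len(nests))
    if sse(cand) < sse(nests[j]):
        nests[j] = cand                                  # better egg wins
    if random.random() < 0.05:                           # abandon worst nest
        worst = max(range(len(nests)), key=lambda i: sse(nests[i]))
        nests[worst] = random_nest()

best = min(nests, key=sse)
print("estimated (a, b, c):", best, " SSE:", sse(best))
```

A real cuckoo search would add Lévy-distributed step lengths and a tuned abandonment probability, but the structure (candidate generation, greedy replacement, abandonment of poor nests) is the same.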
Procedia PDF Downloads 581
1287 Nanocomposites Based Micro/Nano Electro-Mechanical Systems for Energy Harvesters and Photodetectors
Authors: Radhamanohar Aepuru, R. V. Mangalaraja
Abstract:
Flexible electronic devices have drawn considerable interest and provide significant new insights for developing energy conversion and storage devices such as photodetectors and nanogenerators. Recently, self-powered electronic systems have attracted huge attention for next generation MEMS/NEMS devices that can operate independently by generating a built-in field without any need for an external bias voltage, and they have a wide variety of applications in the telecommunication, imaging, environmental and defence sectors. The basic physical processes involved in these devices are charge generation, separation, and charge flow across the electrodes. Many inorganic nanostructures have been explored to fabricate various optoelectronic and electromechanical devices. However, the interaction of nanostructures, their excited charge carrier dynamics, photoinduced charge separation, and fast carrier mobility are yet to be studied. The proposed research addresses one such area, with the aim of realizing self-powered electronic devices. In the present work, nanocomposites of inorganic nanostructures based on ZnO and metal halide perovskites, and polyvinylidene fluoride (PVDF) based nanocomposites, are realized for photodetectors and nanogenerators. Characterization of the inorganic nanostructures is carried out through steady-state optical absorption and luminescence spectroscopies as well as X-ray diffraction and high-resolution transmission electron microscopy (TEM) studies. The detailed carrier dynamics is investigated using various spectroscopic techniques. The developed composite nanostructures exhibit significant optical and electrical properties, which have wide potential applications in various MEMS/NEMS devices such as photodetectors and nanogenerators.
Keywords: dielectrics, nanocomposites, nanogenerators, photodetectors
Procedia PDF Downloads 129
1286 Improving Our Understanding of the in vivo Modelling of Psychotic Disorders
Authors: Zsanett Bahor, Cristina Nunes-Fonseca, Gillian L. Currie, Emily S. Sena, Lindsay D.G. Thomson, Malcolm R. Macleod
Abstract:
Psychosis is ranked as the third most disabling medical condition in the world by the World Health Organization. Despite a substantial amount of research in recent years, available treatments are not universally effective and have a wide range of adverse side effects. Since many clinical drug candidates are identified through in vivo modelling, a deeper understanding of these models, and their strengths and limitations, might help us understand the reasons for difficulties in psychosis drug development. To provide an unbiased summary of the preclinical psychosis literature, we performed a systematic electronic search of PubMed for publications modelling a psychotic disorder in vivo, identifying 14,721 relevant studies. Double screening of 11,000 publications from this dataset has so far established 2,403 animal studies of psychosis, with schizophrenia being the most commonly modelled disorder (95%). 61% of these models are induced using pharmacological agents. Across all models, only 56% of publications test a therapeutic treatment. We propose a systematic review of these studies to assess the prevalence of reporting of measures to reduce the risk of bias, and a meta-analysis to assess the internal and external validity of these animal models. Our findings are likely to be relevant to future preclinical studies of psychosis, as this generation of strong empirical evidence has the potential to identify weaknesses and areas for improvement and to make suggestions on refinement of experimental design. Such a detailed understanding of the data which inform what we think we know will help improve the current attrition rate between bench and bedside in psychosis research.
Keywords: animal models, psychosis, systematic review, schizophrenia
Procedia PDF Downloads 290
1285 Adsorption Studies of Methane on Zeolite NaX, LiX, KX at High Pressures
Authors: El Hadi Zouaoui, Djamel Nibou, Mohamed Haddouche, Wan Azlina Wan Ab Karim Ghani, Samira Amokrane
Abstract:
In this study, CH₄ adsorption isotherms on NaX (faujasite X) and on zeolites exchanged with Li⁺ (LiX) and K⁺ (KX) at different temperatures (298, 308, 323 and 353 K) have been investigated using a high-pressure (3 MPa, 30 bar) thermogravimetric analyser. The experimental results were then fitted with several isotherm models, namely Langmuir, Toth, and Marczewski-Jaroniec, followed by a calculation of the error coefficients between the experimental and theoretical results. It was found that the CH₄ adsorption isotherms are characterized by a strong increase in adsorption at low pressure and a tendency towards a high-pressure limit value Qₘₐₓ. The size and position of the exchanged cations, the spherical shape of methane, the specific surface area, and the pore volume were found to be the most important influencing parameters in this study. The Marczewski-Jaroniec and Toth models correlated well with the experimental data and gave the best results whatever the temperature and the material used.
Keywords: CH₄ adsorption, exchange cations, exchanged zeolite, isotherm study, NaX zeolite
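The isotherm models named above have simple closed forms. A minimal sketch follows, with illustrative parameter values (not the fitted CH₄/zeolite data), showing the Langmuir model and the Toth model, which reduces to Langmuir when its heterogeneity exponent t equals 1:

```python
def langmuir(P, qmax, b):
    """Langmuir isotherm Q = qmax*b*P / (1 + b*P): steep uptake at low
    pressure, saturating towards qmax at high pressure."""
    return qmax * b * P / (1 + b * P)

def toth(P, qmax, b, t):
    """Toth isotherm Q = qmax*b*P / (1 + (b*P)**t)**(1/t): adds a surface
    heterogeneity exponent t; t = 1 recovers the Langmuir form."""
    return qmax * b * P / (1 + (b * P) ** t) ** (1 / t)

# Illustrative values only (qmax in mmol/g, b in MPa^-1).
qmax, b = 6.0, 2.5
for P in (0.1, 0.5, 1.0, 3.0):   # up to the 3 MPa used in the study
    print(P, "MPa:", round(langmuir(P, qmax, b), 3),
          round(toth(P, qmax, b, 0.7), 3))
```

Printing both curves up to 3 MPa reproduces the qualitative behaviour reported: a sharp low-pressure rise and an approach to the limiting capacity Qₘₐₓ.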
Procedia PDF Downloads 249
1284 Impact of Weather Conditions on Generalized Frequency Division Multiplexing over Gamma Gamma Channel
Authors: Muhammad Sameer Ahmed, Piotr Remlein, Tansal Gucluoglu
Abstract:
Generalized frequency division multiplexing (GFDM) can be a good option for implementing free space optical communication systems. This technique has several strengths, e.g. good spectral efficiency, low peak-to-average power ratio (PAPR), adaptability and low co-channel interference. In this paper, the impact of weather conditions such as haze, rain and fog on GFDM over the gamma-gamma channel model is discussed. A trade-off between link distance and system performance under intense weather conditions is also analysed. The symbol error probability (SEP) of GFDM over the gamma-gamma turbulence channel is derived and verified with computer simulations.
Keywords: free space optics, generalized frequency division multiplexing, weather conditions, gamma gamma distribution
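The gamma-gamma turbulence channel used above models the received irradiance as the product of two independent unit-mean gamma variates (large- and small-scale eddies). A minimal sampling sketch follows; the turbulence parameters are illustrative assumptions, not values from the paper:

```python
import random

random.seed(42)

def gamma_gamma_sample(alpha, beta):
    """One draw of gamma-gamma irradiance: the product of two independent
    unit-mean gamma variates (gammavariate takes shape, then scale)."""
    x = random.gammavariate(alpha, 1 / alpha)   # large-scale fluctuations
    y = random.gammavariate(beta, 1 / beta)     # small-scale fluctuations
    return x * y

alpha, beta = 4.0, 2.0            # illustrative moderate-turbulence values
n = 50000
samples = [gamma_gamma_sample(alpha, beta) for _ in range(n)]
mean = sum(samples) / n
var = sum((s - mean) ** 2 for s in samples) / n
si = var / mean ** 2              # scintillation index
print("mean irradiance ~", round(mean, 3), " scintillation index ~", round(si, 3))
# Theory: mean = 1 and SI = 1/alpha + 1/beta + 1/(alpha*beta) = 0.875 here.
```

Weather severity enters through alpha and beta: stronger turbulence lowers both, raising the scintillation index and hence the SEP of the GFDM link.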
Procedia PDF Downloads 174