Search results for: time domain reflectometry (TDR)
17208 Assessment of Time-variant Work Stress for Human Error Prevention
Authors: Hyeon-Kyo Lim, Tong-Il Jang, Yong-Hee Lee
Abstract:
For an operator in a nuclear power plant, human error is one of the most dreaded factors that may result in unexpected accidents. The probability of human errors may be low, but the risk they pose would be unimaginably enormous. Thus, for accident prevention, it is indispensable to analyze the influence of any factors that may raise the possibility of human errors. Over the past decades, numerous research results have shown that the performance of human operators may vary over time due to many factors. Among them, stress is known to be an indirect factor that may cause human errors and result in mental illness. To date, many assessment tools have been developed to assess the stress level of human workers. However, it remains questionable whether they can be utilized for human performance anticipation, which is related to human error possibility, because they were mainly developed from the viewpoint of mental health rather than industrial safety. The stress level of a person may go up or down with work time. In that sense, if such tools are to be applicable in the safety domain, they should at least be able to assess the variation resulting from work time. Therefore, this study aimed to compare their applicability for safety purposes. More than 10 kinds of work stress tools were analyzed with reference to assessment items, assessment and analysis methods, and follow-up measures, which are known to be closely related to work stress. The results showed that most tools mainly placed their weights on some common organizational factors such as demands, supports, and relationships, in that sequence, and their weights were broadly similar. However, the tools failed to recommend practical solutions; instead, they merely advised setting up overall counterplans within a PDCA cycle or risk management activities, which would be far from practical human error prevention.
Thus, it was concluded that the application of stress assessment tools mainly developed for mental health seems impractical for safety purposes with respect to human performance anticipation, and that the development of a new assessment tool would be inevitable for anyone who wants to assess stress levels from the perspective of human performance variation and accident prevention. As a practical counterplan, this study therefore proposed a new scheme for assessing the work stress level of a human operator that may vary over work time, which is closely related to the possibility of human errors.
Keywords: human error, human performance, work stress, assessment tool, time-variant, accident prevention
Procedia PDF Downloads 672
17207 A Procedure for Post-Earthquake Damage Estimation Based on Detection of High-Frequency Transients
Authors: Aleksandar Zhelyazkov, Daniele Zonta, Helmut Wenzel, Peter Furtner
Abstract:
In the current research, structural health monitoring is considered for addressing the critical issue of post-earthquake damage detection. A non-standard approach for damage detection via acoustic emission is presented: acoustic emissions are monitored in the low frequency range (up to 120 Hz), and such emissions are termed high-frequency transients. Further, a damage indicator defined as the Time-Ratio Damage Indicator is introduced. The indicator relies on time-instance measurements of damage initiation and deformation peaks. Based on the time-instance measurements, a procedure for estimating the maximum drift ratio is proposed. Monitoring data are used from a shaking-table test of a full-scale reinforced concrete bridge pier. Damage of the experimental column is successfully detected, and the proposed damage indicator is calculated.
Keywords: acoustic emission, damage detection, shaking table test, structural health monitoring
Procedia PDF Downloads 231
17206 Reconsidering Taylor’s Law with Chaotic Population Dynamical Systems
Authors: Yuzuru Mitsui, Takashi Ikegami
Abstract:
The exponents of Taylor’s law in deterministic chaotic systems are computed, and their meanings are intensively discussed. Taylor’s law is the scaling relationship between the mean and variance (in both space and time) of population abundance, and this law is known to hold in a variety of ecological time series. The exponents found in the temporal Taylor’s law are different from those of the spatial Taylor’s law. The temporal Taylor’s law is calculated on time series from the same locations (or the same initial states) across different temporal phases, whereas for the spatial Taylor’s law, the mean and variance are calculated at the same temporal phase sampled from different places. Most previous studies were done with stochastic models, but we computed the temporal and spatial Taylor’s law in deterministic systems. The temporal Taylor’s law was evaluated using the same initial state, and the spatial Taylor’s law was evaluated using the ensemble average and variance. There were two main discoveries from this work. First, it is often stated that deterministic systems tend to have the value two for Taylor’s exponent; however, most of the exponents calculated here were not two. Second, we investigated the relationships between chaotic features, measured by the Lyapunov exponent, the correlation dimension, and other indexes, and Taylor’s exponents. No strong correlations were found; however, there is some relationship within the same model with different parameter values, and we discuss the meaning of those results at the end of this paper.
Keywords: chaos, density effect, population dynamics, Taylor’s law
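As a minimal illustration of the mean-variance scaling the abstract describes, the sketch below estimates a Taylor exponent from logistic-map time series; the map, the parameter range, and the series lengths are illustrative assumptions, not the authors' actual models:

```python
import math
import statistics

def logistic_series(r, x0, n, burn=200):
    # iterate the logistic map x -> r*x*(1-x), discarding a burn-in
    x = x0
    for _ in range(burn):
        x = r * x * (1 - x)
    out = []
    for _ in range(n):
        x = r * x * (1 - x)
        out.append(x)
    return out

def taylor_exponent(series_list):
    # slope b of log(variance) = log(a) + b*log(mean), by least squares
    pts = [(math.log(statistics.mean(s)), math.log(statistics.variance(s)))
           for s in series_list]
    mx = sum(x for x, _ in pts) / len(pts)
    my = sum(y for _, y in pts) / len(pts)
    num = sum((x - mx) * (y - my) for x, y in pts)
    den = sum((x - mx) ** 2 for x, _ in pts)
    return num / den

# ensemble of chaotic series at nearby parameter values (hypothetical setup)
series = [logistic_series(3.70 + 0.004 * i, 0.2, 500) for i in range(40)]
b = taylor_exponent(series)
```

Whether the fitted exponent comes out near two, as often claimed for deterministic systems, can then be checked directly on the ensemble.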
Procedia PDF Downloads 174
17205 Modeling and Prediction of Zinc Extraction Efficiency from Concentrate by Operating Condition and Using Artificial Neural Networks
Authors: S. Mousavian, D. Ashouri, F. Mousavian, V. Nikkhah Rashidabad, N. Ghazinia
Abstract:
pH, temperature, extraction time of each stage, agitation speed, and the delay time between stages affect the efficiency of zinc extraction from concentrate. In this research, the efficiency of zinc extraction was predicted as a function of the mentioned variables by artificial neural networks (ANN). ANNs with different layer configurations were employed, and the results show that the network with 8 neurons in the hidden layer has good agreement with the experimental data.
Keywords: zinc extraction, efficiency, neural networks, operating condition
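A hedged sketch of the network shape the abstract describes, one hidden layer with 8 neurons; the weights, input values, and activation here are placeholders, not the fitted model:

```python
import math
import random

def mlp_predict(inputs, w_hidden, b_hidden, w_out, b_out):
    # one hidden layer of 8 sigmoid neurons, linear output unit
    hidden = [1.0 / (1.0 + math.exp(-(sum(w * x for w, x in zip(ws, inputs)) + b)))
              for ws, b in zip(w_hidden, b_hidden)]
    return sum(w * h for w, h in zip(w_out, hidden)) + b_out

random.seed(0)
N_IN, N_HID = 5, 8  # inputs: pH, temperature, time, agitation speed, delay time
w_hidden = [[random.uniform(-1, 1) for _ in range(N_IN)] for _ in range(N_HID)]
b_hidden = [0.0] * N_HID
w_out = [random.uniform(-1, 1) for _ in range(N_HID)]
efficiency = mlp_predict([0.5] * N_IN, w_hidden, b_hidden, w_out, 0.0)
```

In practice the weights would be trained on the experimental data rather than drawn at random as here.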
Procedia PDF Downloads 545
17204 Numerical Simulation of Supersonic Gas Jet Flows and Acoustics Fields
Authors: Lei Zhang, Wen-jun Ruan, Hao Wang, Peng-Xin Wang
Abstract:
The jet noise source is generated by the rocket exhaust plume during rocket engine testing. A domain decomposition approach is applied to jet noise prediction in this paper. The aerodynamic noise coupling is based on splitting the problem into acoustic source generation and sound propagation in separate physical domains. Large Eddy Simulation (LES) is used to simulate the supersonic jet flow. Based on the simulated flow fields, the jet noise distribution of the sound pressure level is obtained by applying the Ffowcs Williams-Hawkings (FW-H) acoustics equation and the Fourier transform. The calculation results show that complex structures of expansion waves, compression waves, and the turbulent boundary layer can occur due to the strong interaction between the gas jet and the ambient air. In addition, the jet core region, the shock cells, and the sound pressure level of the gas jet increase with increasing nozzle size. Importantly, the numerical simulation results of the far-field sound are in good agreement with the experimental measurements in directivity.
Keywords: supersonic gas jet, Large Eddy Simulation (LES), acoustic noise, Ffowcs Williams-Hawkings (FW-H) equations, nozzle size
Procedia PDF Downloads 413
17203 Hatching Rhythm, Larval Release of the Rocky Intertidal Crab Leptoduis exaratus (Brachyura: Xanthidae) in Kuwait, Arabian Gulf
Authors: Zainab Al-Wazzan, Luis Gimenez, Lewis Le Vay, Manaf Behbehani
Abstract:
The hatching rhythm and larval release patterns of the rocky shore crab Leptoduis exaratus were investigated in relation to the tidal cycle, the time of day, and the lunar cycle. Ovigerous females were collected from rocky shores at six sites along the Kuwait coastline between April and July of 2014. The females were kept separately in aquaria under a natural photoperiod cycle, and the pattern of larval release was monitored in relation to local tidal and diel cycles. Larval release occurred mostly during the night and was highly synchronized with the neap tides that followed the full moon; at the end of the hatching period, significant larval release also occurred during spring tides. Time series analysis showed a highly significant autocorrelation with a periodicity peak of 14-15 days. The cross-correlation analysis between hatching and the daily low tide level suggests that larvae are released about a day before neap tide. Hatching during neap tides occurred early in the night, at the times of the expected ebb tide. During the spring tide period (late in the season), larval release occurred later in the night, at the ebb tides. The results of this study indicate a strong relationship between the tidal cycle, the time of day, and the hatching rhythm of L. exaratus. In addition, the results suggest that the water level in the intertidal zone also plays a very important role in determining the time of hatching. Hatching and larval release synchronize with the preferred larval environmental conditions to avoid exposing larvae to physiological or environmental stress during their early larval stages, and this timing is also an important factor in determining larval dispersal.
Keywords: brachyura, hatching rhythm, larvae, Kuwait
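The periodicity analysis mentioned above can be sketched with a plain sample autocorrelation function; the synthetic series below, with an assumed ~14.8-day semilunar cycle, is hypothetical and merely stands in for the daily release counts:

```python
import math

def autocorr(x, lag):
    # lag-normalized sample autocorrelation
    n = len(x)
    m = sum(x) / n
    num = sum((x[i] - m) * (x[i + lag] - m) for i in range(n - lag)) / (n - lag)
    den = sum((v - m) ** 2 for v in x) / n
    return num / den

# synthetic daily release counts with a ~14.8-day semilunar cycle (hypothetical)
series = [10 + 8 * math.sin(2 * math.pi * t / 14.8) for t in range(120)]
# search lags 8-24 days for the semilunar autocorrelation peak
peak_lag = max(range(8, 25), key=lambda k: autocorr(series, k))
```

On real counts the peak near 14-15 days would correspond to the reported spring-neap periodicity.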
Procedia PDF Downloads 677
17202 Design of Parity-Preserving Reversible Logic Signed Array Multipliers
Authors: Mojtaba Valinataj
Abstract:
Reversible logic, as a new favorable design domain, can be used in various fields, especially for creating quantum computers, because of its speed and negligible power consumption. However, its susceptibility to a variety of environmental effects may lead to incorrect results. In this paper, because of the importance of the multiplication operation in various computing systems, some novel reversible logic array multipliers are proposed with error detection capability by incorporating parity-preserving gates. The new designs are presented for the two main parts of array multipliers, partial product generation and multi-operand addition, by exploiting new arrangements of existing gates, which results in two signed parity-preserving array multipliers. The experimental results reveal that the best proposed 4×4 multiplier in this paper achieves 12%, 24%, and 26% enhancements in the number of constant inputs, number of required gates, and quantum cost, respectively, compared to the previous design. Moreover, the best proposed design is generalized for n×n multipliers, with general formulations to estimate the main reversible logic criteria as functions of the multiplier size.
Keywords: array multipliers, Baugh-Wooley method, error detection, parity-preserving gates, quantum computers, reversible logic
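For readers unfamiliar with parity preservation, the Fredkin (controlled-swap) gate is a standard conservative example: it preserves the number of 1s across its inputs and outputs, hence the parity. This is a generic illustration, not one of the gate arrangements proposed in the paper:

```python
from itertools import product

def fredkin(c, a, b):
    # controlled swap: if control c is 1, swap a and b; the number of 1s
    # is conserved, so input parity always equals output parity
    return (c, b, a) if c else (c, a, b)

# check conservation (and hence parity preservation) over all 8 input triples
ok = all(sum(fredkin(c, a, b)) == c + a + b
         for c, a, b in product((0, 1), repeat=3))
```

Because any single-bit fault flips the output parity, circuits built only from such gates allow the error detection capability the abstract refers to.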
Procedia PDF Downloads 259
17201 Modeling of Tsunami Propagation and Impact on West Vancouver Island, Canada
Authors: S. Chowdhury, A. Corlett
Abstract:
Large tsunamis strike the British Columbia coast every few hundred years. The Cascadia Subduction Zone, which extends along the Pacific coast from Vancouver Island to Northern California, is one of the most seismically active regions in Canada. Significant earthquakes have occurred in this region, including the 1700 Cascadia earthquake with an estimated magnitude of 9.2. Based on geological records, experts have predicted that a 'great earthquake' of a similar magnitude within this region may happen at any time. Such an earthquake is expected to generate a large tsunami that could impact the coastal communities on Vancouver Island. Since many of these communities are in remote locations, they are more likely to be vulnerable, as post-earthquake relief efforts would be impacted by damage to critical road infrastructure. To assess the coastal vulnerability of these communities, a hydrodynamic model has been developed using MIKE-21 software. We have considered a 500-year probabilistic earthquake design criterion, including subsidence, in this model. The bathymetry information was collected from the Canadian Hydrographic Service (CHS) and the National Oceanic and Atmospheric Administration (NOAA). An aerial survey of the communities was conducted using a Cessna-172 aircraft, and the information was then converted to generate a topographic digital elevation map. Both sets of survey information were incorporated into the model, whose domain size was about 1000 km x 1300 km. The model was calibrated with the tsunami that occurred off the west coast of Moresby Island on October 28, 2012. The water levels from the model were compared with two tide gauge stations close to Vancouver Island, and the model output indicates satisfactory results. For this study, the design water level was taken as the High Water Level plus the Sea Level Rise for the year 2100. The hourly wind speeds from eight directions were collected from different wind stations, and a 200-year return period wind speed was used in the model for storm events. The regional model was set for a 12 hr simulation period, which takes more than 16 hrs to complete one simulation using a double Xeon-E7 CPU computer plus a K-80 GPU. The boundary information for the local model was generated from the regional model. The local model was developed using a high resolution mesh to estimate coastal flooding for the communities. It was observed from this study that many communities will be affected by a Cascadia tsunami, and inundation maps were developed for those communities. The infrastructure inside the coastal inundation areas was identified. Coastal vulnerability planning and resilient design solutions will be implemented to significantly reduce the risk.
Keywords: tsunami, coastal flooding, coastal vulnerability, earthquake, Vancouver, wave propagation
Procedia PDF Downloads 131
17200 Human Skin Identification Using a Specific mRNA Marker at Different Storage Durations
Authors: Abla A. Ali, Heba A. Abd El Razik, Nadia A. Kotb, Amany A. Bayoumi, Laila A. Rashed
Abstract:
The detection of human skin through mRNA-based profiling is a very useful tool for forensic investigations. The aim of this study was the definitive identification of human skin at different time intervals using the mRNA marker late cornified envelope gene 1C. Ten middle-aged healthy volunteers of both sexes were recruited for this study. Skin samples, controlled with blood samples, were taken from the candidates to test for the presence of the targeted mRNA marker. Samples were kept in dry, dark conditions and tested at different time intervals (24 hours, one week, three weeks, and four weeks) for detection and relative quantification of the targeted marker by RT-PCR. The targeted marker could not be detected in blood samples. The targeted marker showed the highest mean value after 24 hours (11.90 ± 2.42) and the lowest mean value (7.56 ± 2.56) after three weeks; no marker could be detected at four weeks. This study verified the high specificity and sensitivity of the mRNA marker in skin at different storage times up to three weeks under the study conditions.
Keywords: human skin, late cornified envelope gene 1C, mRNA marker, time intervals
Procedia PDF Downloads 165
17199 Resource Constrained Time-Cost Trade-Off Analysis in Construction Project Planning and Control
Authors: Sangwon Han, Chengquan Jin
Abstract:
Time-cost trade-off (TCTO) is one of the most significant parts of construction project management. Despite its significance, current TCTO analysis, based on the Critical Path Method (CPM), does not consider resource constraints and accordingly sometimes generates an impractical and/or infeasible schedule in terms of resource availability. Therefore, resource constraints need to be considered when doing TCTO analysis. In this research, a genetic algorithm (GA) based optimization model is created in order to find the optimal schedule. This model is utilized to compare four distinct scenarios (i.e., 1) initial CPM, 2) TCTO without considering resource constraints, 3) resource allocation after TCTO, and 4) TCTO considering resource constraints) in terms of duration, cost, and resource utilization. The comparison results identify that 'TCTO considering resource constraints' generates the optimal schedule with respect to duration, cost, and resources. This verifies the need to consider resource constraints when doing TCTO analysis. It is expected that the proposed model will produce more feasible and optimal schedules.
Keywords: time-cost trade-off, genetic algorithms, critical path, resource availability
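A toy sketch of the GA idea: each chromosome chooses a crash level per activity, crashing shortens duration but raises cost and crew demand, and a resource cap marks over-committed schedules as infeasible. All durations, costs, and penalty weights below are invented for illustration and are unrelated to the authors' model:

```python
import random

random.seed(1)
DUR   = [8, 6, 10, 4]       # normal durations in days (hypothetical)
COST  = [100, 80, 150, 60]  # normal direct costs (hypothetical)
CREWS = [2, 2, 3, 1]        # crews needed at crash level 0
CAP   = 4                   # resource constraint: crew cap per activity

def fitness(genes):
    # each crash level saves one day but adds cost and one extra crew
    if any(cr + g > CAP for cr, g in zip(CREWS, genes)):
        return float('inf')                       # infeasible under the cap
    duration = sum(d - g for d, g in zip(DUR, genes))
    cost = sum(c + 30 * g for c, g in zip(COST, genes))
    return cost + 50 * duration                   # combined time-cost objective

def evolve(pop=30, gens=60):
    P = [[random.randint(0, 3) for _ in DUR] for _ in range(pop)]
    for _ in range(gens):
        P.sort(key=fitness)
        parents = P[:pop // 2]                    # elitist selection
        children = []
        while len(children) < pop - len(parents):
            a, b = random.sample(parents, 2)
            child = [random.choice(g) for g in zip(a, b)]   # uniform crossover
            if random.random() < 0.2:                       # point mutation
                child[random.randrange(len(child))] = random.randint(0, 3)
            children.append(child)
        P = parents + children
    return min(P, key=fitness)

best = evolve()
```

Encoding the resource cap directly in the fitness is what distinguishes scenario 4 from an unconstrained TCTO search.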
Procedia PDF Downloads 186
17198 Other-Generated Disclosure: A Challenge to Privacy on Social Network Sites
Authors: Tharntip Tawnie Chutikulrungsee, Oliver Kisalay Burmeister, Maumita Bhattacharya, Dragana Calic
Abstract:
Sharing on social network sites (SNSs) has rapidly emerged as a new social norm and has become a global phenomenon. Billions of users reveal not only their own information (self-disclosure) but also information about others (other-generated disclosure), resulting in a risk and a serious threat to both personal and informational privacy. Self-disclosure (SD) has been extensively researched in the literature, particularly regarding individual control and existing privacy management. However, far too little attention has been paid to other-generated disclosure (OGD), especially by insiders. OGD has a strong influence on self-presentation, self-image, and electronic word of mouth (eWOM). Moreover, OGD is more credible and less likely to be manipulated than SD, but to some extent it lacks privacy control and legal protection. This article examines OGD in depth, ranging from motivation to both online and offline impacts, based upon lived experiences of both ‘the disclosed’ and ‘the discloser’. Using purposive sampling, this phenomenological study involves an online survey and in-depth interviews. The findings report the influence of peer disclosure as well as users’ strategies to mitigate privacy issues. This article also calls attention to the challenge of OGD privacy and the inadequacies of the law related to privacy protection in the digital domain.
Keywords: Facebook, online privacy, other-generated disclosure, social network sites (SNSs)
Procedia PDF Downloads 251
17197 Multiple Linear Regression for Rapid Estimation of Subsurface Resistivity from Apparent Resistivity Measurements
Authors: Sabiu Bala Muhammad, Rosli Saad
Abstract:
Multiple linear regression (MLR) models for fast estimation of true subsurface resistivity from apparent resistivity field measurements are developed and assessed in this study. The parameters investigated were apparent resistivity (ρₐ), horizontal location (X), and depth (Z) of measurement as the independent variables, and true resistivity (ρₜ) as the dependent variable. To achieve linearity in both resistivity variables, the datasets were first transformed into the logarithmic domain, following diagnostic checks of the normality of the dependent variable and of heteroscedasticity to ensure accurate models. Four MLR models were developed based on hierarchical combinations of the independent variables. The generated MLR coefficients were applied to another dataset to estimate ρₜ values for validation. Contours of the estimated ρₜ values were plotted and compared to the observed data plots, at the same colour scale and blanking, for visual assessment. The accuracy of the models was assessed using the coefficient of determination (R²), standard error (SE), and weighted mean absolute percentage error (wMAPE). It is concluded that the MLR models can estimate ρₜ with a high level of accuracy.
Keywords: apparent resistivity, depth, horizontal location, multiple linear regression, true resistivity
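A simplified, single-predictor version of the log-domain fit can illustrate the transformation step (the study regresses on ρₐ, X, and Z together; the power-law data below are synthetic and hypothetical):

```python
import math

def fit_log_linear(rho_a, rho_t):
    # one-predictor sketch of the log-domain regression:
    # log10(rho_t) = a + b * log10(rho_a), fitted by least squares
    xs = [math.log10(v) for v in rho_a]
    ys = [math.log10(v) for v in rho_t]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    a = my - b * mx
    return a, b

def wmape(observed, estimated):
    # weighted mean absolute percentage error used to assess the fit
    return (sum(abs(o - e) for o, e in zip(observed, estimated))
            / sum(abs(o) for o in observed))

# synthetic power-law data (hypothetical): rho_t = 2 * rho_a ** 1.5
rho_a = [10.0, 20.0, 50.0, 100.0, 200.0]
rho_t = [2.0 * v ** 1.5 for v in rho_a]
a, b = fit_log_linear(rho_a, rho_t)
est = [10 ** (a + b * math.log10(v)) for v in rho_a]
```

Because the data follow an exact power law, the fit recovers the exponent 1.5 and intercept log10(2); field data would of course scatter around such a trend.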
Procedia PDF Downloads 276
17196 Effects of Feeding Time on Survival Rates, Growth Performance and Feeding Behavior of Juvenile Catfish
Authors: Abdullahi Ibrahim
Abstract:
The culture of Clarias gariepinus for fish production is becoming increasingly essential, as the fish contributes to food abundance, the nutritional benefit of family health, income generation, and employment opportunities. The effect of feeding frequency was investigated over a period of ten (10) weeks; the experiment was conducted to monitor the survival rates, growth performance, and feeding behavior of juvenile catfish. The experimental fish were randomly assigned to five treatment groups (i.e., with different feeding frequency intervals) of 100 fish each. Each treatment was replicated twice with 50 fish per replicate. All the groups were fed with floating fish feed (blue crown®). The five treatments (feeding frequencies) were: T1, once a day feeding during night hours only; T2, twice a day feeding during morning and night hours; T3, thrice a day feeding during morning, evening, and night hours; T4, four times a day feeding during morning, afternoon, evening, and night hours; and T5, five times a day feeding at four-hour intervals. There were significant differences (p < 0.05) among treatments. Feed intake and weight gain improved significantly (p < 0.05) in T4 and T3. The best effects of feeding time on weight gain, survival rate, and feed conversion ratio were obtained with three times a day feeding (T3) compared to the other treatments, especially those fed once and five times a day. This might be attributed to the high level of dissolved oxygen and lower stress. Feeding fish three times a day is therefore recommended for efficient catfish production to maximize profits, as feed represents more than 50% of aquaculture inputs, particularly in intensive farming systems.
Keywords: catfish, floating fish feed, dissolved oxygen, juvenile
Procedia PDF Downloads 155
17195 Application of Gamma Frailty Model in Survival of Liver Cirrhosis Patients
Authors: Elnaz Saeedi, Jamileh Abolaghasemi, Mohsen Nasiri Tousi, Saeedeh Khosravi
Abstract:
Goals and Objectives: A typical analysis of survival data involves the modeling of time-to-event data, such as the time till death. A frailty model is a random effect model for time-to-event data, where the random effect has a multiplicative influence on the baseline hazard function. This article aims to investigate the use of a gamma frailty model with concomitant variables in order to individualize the prognostic factors that influence the survival times of liver cirrhosis patients. Methods: During the one-year study period (May 2008-May 2009), data were used from the recorded information of patients with liver cirrhosis who were scheduled for liver transplantation and were followed up for at least seven years in Imam Khomeini Hospital in Iran. In order to determine the effective factors for cirrhotic patients’ survival in the presence of latent variables, the gamma frailty distribution was applied. Parametric models, such as the Exponential and Weibull distributions, were considered for survival time. Data analysis was performed using the R software, and an error level of 0.05 was considered for all tests. Results: 305 patients with liver cirrhosis, including 180 (59%) men and 125 (41%) women, were studied. The average age of the patients was 39.8 years. At the end of the study, 82 (26%) patients had died, among them 48 (58%) men and 34 (42%) women. The main cause of liver cirrhosis was found to be hepatitis B (23%), with cryptogenic cirrhosis (22.6%) identified as the second factor. Overall, mean survival over the 7-year follow-up was 28.44 months; for dead patients and for censored patients it was 19.33 and 31.79 months, respectively. Using progressive and regressive multi-parametric survival models, Exponential and Weibull models with the gamma frailty distribution were fitted to the cirrhosis data. In both models, factors including age, serum bilirubin, serum albumin, and encephalopathy had a significant effect on the survival time of cirrhotic patients.
Conclusion: To investigate the effective factors for the time of death of patients with liver cirrhosis in the presence of latent variables, a gamma frailty model with parametric distributions seems desirable.
Keywords: frailty model, latent variables, liver cirrhosis, parametric distribution
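The gamma frailty construction can be written down compactly: for a proportional hazard z·h0(t) with a mean-one gamma frailty z of variance θ, integrating z out gives the closed-form marginal survival S(t) = (1 + θ·H0(t))^(-1/θ). A sketch with a Weibull baseline (the parameter values are illustrative, not the fitted ones):

```python
import math

def weibull_cum_hazard(t, shape, scale):
    # baseline cumulative hazard H0(t) of a Weibull survival model
    return (t / scale) ** shape

def marginal_survival(t, shape, scale, theta):
    # gamma frailty z (mean 1, variance theta) multiplies the hazard;
    # integrating z out yields S(t) = (1 + theta * H0(t)) ** (-1/theta)
    return (1.0 + theta * weibull_cum_hazard(t, shape, scale)) ** (-1.0 / theta)

s = marginal_survival(2.0, 1.5, 3.0, 0.5)  # survival probability at t = 2
```

As θ approaches zero the expression recovers the ordinary Weibull survival exp(-H0(t)), so the frailty variance measures the extra heterogeneity attributed to latent variables.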
Procedia PDF Downloads 261
17194 Different Approaches to Teaching a Database Course to Undergraduate and Graduate Students
Authors: Samah Senbel
Abstract:
Database design is a fundamental part of the Computer Science and Information Technology curricula in any school, as well as in the study of management, business administration, and data analytics. In this study, we compare the performance of two groups of students studying the same database design and implementation course at Sacred Heart University in the fall of 2018. Both courses used the same textbook and were taught by the same professor, one for seven graduate students and one for 26 undergraduate students (juniors). The undergraduate students were around 20 years old with little work experience, while the graduate students averaged 35 years old and were all employed in computer-related or management-related jobs. The textbook used was 'Database Systems, Design, Implementation, and Management' by Coronel and Morris, and the course was designed to follow the textbook at roughly a chapter per week. The first six weeks covered the design aspect of a database, followed by a paper exam. The next six weeks covered the implementation aspect of the database using SQL, followed by a lab exam. Since the undergraduate students are on a 16-week semester, we spent the last three weeks of their course covering NoSQL; this part of the course was not included in this study. After the course was over, we analyzed the results of the two groups of students. An interesting discrepancy was observed: in the database design part of the course, the average grade of the graduate students was 92%, while that of the undergraduate students was 77% for the same exam. In the implementation part of the course, we observed the opposite: the average grade of the graduate students was 65%, while that of the undergraduate students was 73%. The overall grades were quite similar: the graduate average was 78% and the undergraduate average was 75%. Based on these results, we concluded that having both classes follow the same time schedule was not beneficial, and an adjustment was needed: the graduates could spend less time on design, and the undergraduates would benefit from more design time. In the fall of 2019, 30 students registered for the undergraduate course and 15 students registered for the graduate course. To test our conclusion, the undergraduates spent about 67% of the time (eight classes) on the design part of the course and 33% (four classes) on the implementation part, using the exact exams of the previous year. This resulted in an improvement in their average grade on the design part from 77% to 83%, and also in their implementation average grade from 73% to 79%. In conclusion, we recommend using two separate schedules for teaching the database design course. For undergraduate students, it is important to spend more time on the design part rather than the implementation part of the course, while for the older graduate students, we recommend spending more time on the implementation part, as that seems to be the part they struggle with, even though they have a higher understanding of the design component of databases.
Keywords: computer science education, database design, graduate and undergraduate students, pedagogy
Procedia PDF Downloads 121
17193 Concept, Design and Implementation of Power System Component Simulator Based on Thyristor Controlled Transformer and Power Converter
Authors: B. Kędra, R. Małkowski
Abstract:
This paper presents information on the Power System Component Simulator, a device designed for the LINTE^2 laboratory owned by Gdansk University of Technology in Poland. We first provide introductory information on the Power System Component Simulator and its capabilities. Then, the concept of the unit is presented: requirements for the unit are described, and the proposed and introduced functions are listed. Implementation details are given, and the hardware structure is presented and described. Information is provided about the communication interface used, the data maintenance and storage solution, and the Simulink Real-Time features used, together with a list and description of all measurements. The potential for modifications of the laboratory setup is evaluated. Lastly, the results of experiments performed using the Power System Component Simulator are presented, including simulation of under-frequency load shedding, frequency- and voltage-dependent characteristics of groups of load units, and time characteristics of a group of different load units in a chosen area.
Keywords: power converter, Simulink Real-Time, Matlab, load, tap controller
Procedia PDF Downloads 242
17192 Volatility Switching between Two Regimes
Authors: Josip Visković, Josip Arnerić, Ante Rozga
Abstract:
Based on the facts that volatility is time-varying in high frequency data and that periods of high volatility tend to cluster, the most successful and popular models for modelling time-varying volatility are GARCH type models. When financial returns exhibit sudden jumps due to structural breaks, standard GARCH models show high volatility persistence, i.e. integrated behaviour of the conditional variance. In such situations, models in which the parameters are allowed to change over time are more appropriate. This paper compares different GARCH models in terms of their ability to describe structural changes in returns caused by the financial crisis at the stock markets of six selected central and east European countries. The empirical analysis demonstrates that the Markov regime switching GARCH model resolves the problem of excessive persistence and outperforms uni-regime GARCH models in forecasting volatility when sudden switching occurs in response to the financial crisis.
Keywords: central and east European countries, financial crisis, Markov switching GARCH model, transition probabilities
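A minimal simulation sketch of the regime-switching idea: GARCH(1,1) variance dynamics whose parameters follow a two-state Markov chain. All parameter values below are invented for illustration and do not come from the paper's estimates:

```python
import math
import random

random.seed(7)
# regime -> (omega, alpha, beta) in var = omega + alpha*r^2 + beta*var
PARAMS = {0: (0.01, 0.05, 0.90),   # calm regime (hypothetical values)
          1: (0.20, 0.20, 0.75)}   # crisis regime (hypothetical values)
P_STAY = 0.98                      # probability of staying in the current regime

def simulate(n=1000):
    state, var, returns = 0, 0.1, []
    for _ in range(n):
        if random.random() > P_STAY:          # Markov switch between regimes
            state = 1 - state
        omega, alpha, beta = PARAMS[state]
        r = random.gauss(0.0, 1.0) * math.sqrt(var)
        returns.append(r)
        var = omega + alpha * r * r + beta * var   # GARCH(1,1) variance update
    return returns

rets = simulate()
```

Fitting a single-regime GARCH model to such data tends to push alpha + beta toward one, which is exactly the spurious persistence the Markov switching specification is meant to resolve.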
Procedia PDF Downloads 226
17191 Enhancing the Sensitivity of Antigen Based Sandwich ELISA for COVID-19 Diagnosis in Saliva Using Gold Conjugated Nanobodies
Authors: Manal Kamel, Sara Maher
Abstract:
Development of sensitive non-invasive tests for the detection of SARS-CoV-2 antigens is imperative to manage the extent of infection throughout the population, yet it is still challenging. Here, we designed and optimized a sandwich enzyme-linked immunosorbent assay (ELISA) for SARS-CoV-2 S1 antigen detection in saliva. Both saliva samples and nasopharyngeal swabs were collected from 170 PCR-confirmed positive and negative cases. Gold nanoparticles (AuNPs) were conjugated with S1 protein receptor binding domain (RBD) nanobodies. Recombinant S1 monoclonal antibodies (S1mAb) as the primary antibody and gold-conjugated nanobodies as the secondary antibody were employed in the sandwich ELISA. Our developed system was optimized to achieve 87.5% sensitivity and 100% specificity for saliva samples, compared to 89% and 100% for nasopharyngeal swabs, respectively. This means that saliva could be a suitable replacement for nasopharyngeal swabs. No cross-reaction was detected with other coronavirus antigens. These results revealed that our developed ELISA could be established as a new, reliable, sensitive, and non-invasive test for the diagnosis of SARS-CoV-2 infection, using easily collected saliva samples.
Keywords: COVID 19, diagnosis, ELISA, nanobodies
Procedia PDF Downloads 134
17190 An Automated System for the Detection of Citrus Greening Disease Based on Visual Descriptors
Authors: Sidra Naeem, Ayesha Naeem, Sahar Rahim, Nadia Nawaz Qadri
Abstract:
Citrus greening is a bacterial disease that causes considerable damage to citrus fruits worldwide. An efficient method for detecting this disease is needed to minimize production loss. This paper presents a pattern recognition system that comprises three stages for the detection of citrus greening from orange leaves: segmentation, feature extraction, and classification. Image segmentation is accomplished by adaptive thresholding. The feature extraction stage uses three visual descriptors: shape, color, and texture. As the shape feature we use the asymmetry index, as the color feature the histogram of the Cb component from the YCbCr domain, and as the texture feature the local binary pattern. Classification was done using support vector machines and k-nearest neighbors. The best performance of the system (accuracy = 88.02%, AUROC = 90.1%) was achieved with automatically segmented images. Our experiments validate that (1) segmentation is an imperative preprocessing step for computer-assisted diagnosis of citrus greening, and (2) the combination of shape, color, and texture features forms a complementary set for the identification of citrus greening disease.
Keywords: citrus greening, pattern recognition, feature extraction, classification
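The local binary pattern texture descriptor mentioned above can be sketched in a few lines; this is the basic 8-neighbour variant on a tiny array, not the paper's implementation:

```python
import numpy as np

def lbp_3x3(img):
    """Basic 8-neighbour local binary pattern for a 2-D grayscale array.
    Each neighbour >= centre contributes one bit; border pixels are
    skipped for simplicity."""
    # neighbour offsets in clockwise order starting at the top-left
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    h, w = img.shape
    out = np.zeros((h - 2, w - 2), dtype=np.uint8)
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            code = 0
            for bit, (di, dj) in enumerate(offsets):
                if img[i + di, j + dj] >= img[i, j]:
                    code |= 1 << bit
            out[i - 1, j - 1] = code
    return out

img = np.array([[10, 20, 30],
                [40, 50, 60],
                [70, 80, 90]], dtype=np.uint8)
codes = lbp_3x3(img)  # a single interior pixel -> one LBP code
```

A histogram of these codes over the leaf region would then serve as the texture feature vector for the classifier.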
Procedia PDF Downloads 184
17189 Use and Effects of Kanban Board from the Aspects of Brothers Furniture Limited
Authors: Kazi Rizvan, Yamin Rekhu
Abstract:
Due to high competitiveness in industries throughout the world, every industry is trying hard to utilize all its resources to keep productivity as high as possible. Many tools are used to ensure smoother flow of an operation, to balance tasks, to maintain proper schedules and sequences for tasks, and to reduce unproductive time. All of these tools are used to augment productivity within an industry. The Kanban board is one of them and among the most important tools of the lean production system. A Kanban board is a visual depiction of the status of tasks: it shows their actual status and conveys their progress and issues. Using a Kanban board, tasks can be distributed among workers, and operation targets can be represented to them visually. In this paper, an example of a Kanban board from Brothers Furniture Limited is presented: how the Kanban board system was implemented, how the board was designed, and how it was made easily perceivable for less literate or illiterate workers. The Kanban board was designed for the packing section of Brothers Furniture Limited. It was implemented to represent the task flow to the workers and to mitigate the time wasted while workers wondered which task to start after finishing one. The Kanban board comprised seven columns, including a column for comments where any problems that occurred while working on the tasks could be noted. The Kanban board was helpful for the workers because it showed the urgency of the tasks. It was also helpful for the store section, as they could see which products, and how many of them, could be delivered to the store at any given time. The Kanban board centralized all the information, which is why the workflow was paced up and idle time was minimized.
Although many workers were illiterate or less literate, the Kanban board was still explicable to them because the Kanban cards were colored. Since the significance of colors can be conveniently interpreted, colored cards helped a great deal in that matter. Hence, the illiterate or less literate workers did not have to spend time wondering about the significance of the cards. Even when workers were not told the significance of the colored cards, they could develop a feeling for their meaning, as colors can trigger anyone's mind to perceive a situation. As a result, the board made clear to the workers what it required them to do, when to do it, and what to do next. The Kanban board alleviated excessive time between tasks by setting a day plan for targeted tasks, and it also reduced time during tasks because the workers knew the forthcoming tasks for the day. Being very specific to the tasks, the Kanban board helped the workers become more focused and do their jobs with more precision. As a result, the Kanban board helped achieve an 8.75% increase in productivity over the level before it was implemented.
Keywords: color, Kanban board, lean tool, literacy, packing, productivity
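The board described above can be modelled as a simple data structure; this is a minimal sketch with hypothetical column names, since the seven columns used at Brothers Furniture Limited are not enumerated in the abstract:

```python
# Minimal Kanban board sketch: columns hold cards, and each card carries a
# colour so urgency can be read without literacy. Column names here are
# illustrative assumptions, not the factory's actual layout.
class KanbanBoard:
    def __init__(self, columns):
        self.columns = {name: [] for name in columns}

    def add_card(self, column, task, color):
        # colour encodes urgency (e.g. red = urgent)
        self.columns[column].append({"task": task, "color": color})

    def move_card(self, src, dst, task):
        card = next(c for c in self.columns[src] if c["task"] == task)
        self.columns[src].remove(card)
        self.columns[dst].append(card)

board = KanbanBoard(["To Do", "In Progress", "Done", "Comments"])
board.add_card("To Do", "pack order #12", "red")
board.move_card("To Do", "In Progress", "pack order #12")
```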
Procedia PDF Downloads 233
17188 Component Interface Formalization in Robotic Systems
Authors: Anton Hristozov, Eric Matson, Eric Dietz, Marcus Rogers
Abstract:
Components are heavily used in many software systems, including robotic systems. The growth in sophistication and diversity of new capabilities for robotic systems presents new challenges to their architectures. Their complexity is growing exponentially with the advent of AI, smart sensors, and the complex tasks they have to accomplish. Such complexity requires a more rigorous approach to the creation, use, and interoperability of software components. The issue is exacerbated because robotic systems are becoming more and more reliant on third-party components for certain functions. In order to achieve this kind of interoperability, including dynamic component replacement, we need a way to standardize their interfaces. A formal approach is needed to specify what the interface of a robotic software component should contain. This study analyzes the issue and presents a universal and generic approach to standardizing component interfaces for robotic systems. Our approach is inspired by well-established robotic architectures such as ROS, PX4, and ArduPilot. The study is also applicable to other software systems that share similar characteristics with robotic systems. We consider the use of JSON or Domain-Specific Languages (DSLs) developed with tools such as ANTLR, and automatic code and configuration-file generation for frameworks such as ROS and PX4. A case study with ROS 2 is presented as a proof of concept for the proposed methodology.
Keywords: CPS, robots, software architecture, interface, ROS, autopilot
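A JSON-based interface description as discussed above can be validated against a required field set; this is a minimal sketch, and the field names below are illustrative assumptions, not the schema proposed in the paper:

```python
import json

# Hypothetical required fields for a component interface description.
REQUIRED_FIELDS = {"name", "version", "inputs", "outputs"}

def validate_interface(spec_json):
    """Parse a JSON interface description and check required fields,
    raising ValueError when the spec is incomplete."""
    spec = json.loads(spec_json)
    missing = REQUIRED_FIELDS - spec.keys()
    if missing:
        raise ValueError(f"interface missing fields: {sorted(missing)}")
    return spec

spec = validate_interface(json.dumps({
    "name": "lidar_driver",
    "version": "1.0",
    "inputs": [],
    "outputs": [{"topic": "/scan", "type": "sensor_msgs/LaserScan"}],
}))
```

In a code-generation pipeline, a validated spec like this would feed templates that emit ROS 2 node boilerplate or PX4 configuration files.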
Procedia PDF Downloads 92
17187 Automatic Thresholding for Data Gap Detection for a Set of Sensors in Instrumented Buildings
Authors: Houda Najeh, Stéphane Ploix, Mahendra Pratap Singh, Karim Chabir, Mohamed Naceur Abdelkrim
Abstract:
Building systems are highly vulnerable to different kinds of faults and failures. In fact, various faults, failures, and human behaviors can affect building performance. This paper tackles the detection of unreliable sensors in buildings. Several literature surveys on diagnosis techniques for sensor grids in buildings have been published, but all of them treat only bias and outliers; occurrences of data gaps have not been given adequate attention in academia. The proposed methodology comprises automatic thresholding for data gap detection for a set of heterogeneous sensors in instrumented buildings. Sensor measurements are considered to be regular time series; in reality, however, sensor values are not uniformly sampled. The issue to solve is therefore: beyond which delay should a sensor be considered faulty? The use of time series is required for the detection of abnormal delays. The efficiency of the method is evaluated on measurements obtained from a real platform: an office at Grenoble Institute of Technology equipped with 30 sensors.
Keywords: building system, time series, diagnosis, outliers, delay, data gap
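The core idea of declaring a gap when the delay between consecutive samples exceeds a threshold can be sketched as follows; the fixed threshold here is illustrative, whereas the paper derives it automatically per sensor:

```python
# Flag data gaps in a non-uniformly sampled sensor series: a gap is the
# pair of timestamps bracketing a delay longer than max_delay.
def find_gaps(timestamps, max_delay):
    gaps = []
    for prev, cur in zip(timestamps, timestamps[1:]):
        if cur - prev > max_delay:
            gaps.append((prev, cur))
    return gaps

ts = [0, 60, 120, 600, 660]          # seconds; one 480 s hole
gaps = find_gaps(ts, max_delay=120)  # one gap between t=120 and t=600
```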
Procedia PDF Downloads 245
17186 Investigation of Some Flotation Parameters and the Role of Dispersants in the Flotation of Chalcopyrite
Authors: H. A. Taner, V. Önen
Abstract:
A suitable choice of flotation parameters and reagents has a strong effect on the effectiveness of the flotation process. The objective of this paper is to give an overview of the flotation of chalcopyrite under different conditions and with different dispersants. Flotation parameters such as grinding time, pH, and the type and dosage of dispersant were investigated. In order to understand the interaction of some dispersants, sodium silicate, sodium hexametaphosphate, and sodium polyphosphate were used. The optimum results were obtained at a pH of 11.5 and a grinding time of 10 minutes. A copper concentrate assaying 29.85% CuFeS2 was produced with 65.97% flotation recovery under optimum rougher flotation conditions with sodium silicate.
Keywords: chalcopyrite, dispersant, flotation, reagent
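Flotation recovery as quoted above is conventionally computed from assay grades with the standard two-product formula; the grades below are illustrative, not the paper's assay data:

```python
# Two-product flotation recovery: R = 100 * c * (f - t) / (f * (c - t)),
# where f, c, t are the feed, concentrate and tailings grades.
# Input grades here are made-up example values.
def flotation_recovery(feed_grade, conc_grade, tail_grade):
    return 100.0 * conc_grade * (feed_grade - tail_grade) / (
        feed_grade * (conc_grade - tail_grade))

R = flotation_recovery(feed_grade=2.0, conc_grade=29.85, tail_grade=0.7)
```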
Procedia PDF Downloads 182
17185 A Continuous Real-Time Analytic for Predicting Instability in Acute Care Rapid Response Team Activations
Authors: Ashwin Belle, Bryce Benson, Mark Salamango, Fadi Islim, Rodney Daniels, Kevin Ward
Abstract:
A reliable, real-time, non-invasive system that can identify patients at risk for hemodynamic instability is needed to aid clinicians in their efforts to anticipate patient deterioration and initiate early interventions. The purpose of this pilot study was to explore the clinical capability of a real-time analytic from a single lead of an electrocardiograph to correctly distinguish between rapid response team (RRT) activations due to hemodynamic (H-RRT) and non-hemodynamic (NH-RRT) causes, as well as to predict H-RRT cases with actionable lead times. The study consisted of a single-center, retrospective cohort of 21 patients with RRT activations from step-down and telemetry units. Through electronic health record review, and blinded to the analytic's output, clinicians categorized each patient into H-RRT and NH-RRT cases. The analytic output and the categorization were compared, and the prediction lead time prior to the RRT call was calculated. The analytic correctly distinguished between H-RRT and NH-RRT cases with 100% accuracy, demonstrating 100% positive and negative predictive values and 100% sensitivity and specificity. In H-RRT cases, the analytic detected hemodynamic deterioration with a median lead time of 9.5 hours prior to the RRT call (range 14 minutes to 52 hours). The study demonstrates that an electrocardiogram (ECG) based analytic has the potential to provide clinical decision and monitoring support, helping caregivers identify at-risk patients within a clinically relevant timeframe and allowing for increased vigilance and early interventional support to reduce the chances of continued patient deterioration.
Keywords: critical care, early warning systems, emergency medicine, heart rate variability, hemodynamic instability, rapid response team
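The lead-time statistic reported above is the median gap between the analytic's first alert and the RRT call. A minimal sketch with illustrative values (not the study's patient data):

```python
# Median of alert-to-RRT-call lead times; median is preferred over the
# mean because the reported range (14 min to 52 h) is heavily skewed.
def median(values):
    s = sorted(values)
    n = len(s)
    mid = n // 2
    return s[mid] if n % 2 else (s[mid - 1] + s[mid]) / 2

lead_times_hours = [0.25, 2, 5, 9.5, 14, 30, 52]  # illustrative gaps
m = median(lead_times_hours)
```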
Procedia PDF Downloads 143
17184 Modification of Rubber Swab Tool with Brush to Reduce Rubber Swab Fraction Fishing Time
Authors: T. R. Hidayat, G. Irawan, F. Kurniawan, E. H. I. Prasetya, Suharto, T. F. Ridwan, A. Pitoyo, A. Juniantoro, R. T. Hidayat
Abstract:
Swabbing is an activity to lift fluid from inside a well with the use of a sand line, either to determine fluid influx after perforation or to reduce the fluid level so as to obtain a difference between the formation pressure and the hydrostatic pressure in the well for underbalanced perforation. During swab activities, frequent problems occur with the rubber swab: it often breaks and becomes fish inside the well. Fishing for the rubber swab makes the rig operation take longer, delays the swab result data, and creates potential losses of well operation for the company. The average time needed to fish out the fractions of the rubber swab, plus the swab work itself, is 42 hours. The innovation made for this problem is a modification of the rubber swab tool: a series of brushes with a threaded connection is provided at the end of the tool to improve work safety, so that when the rubber swab breaks, the broken swab is lifted by the brush underneath, thereby reducing the time lost to rubber swab fishing. This tool has been applied, and it is proven that with this rubber swab tool modification, the rig operation becomes more efficient because rubber swab fishing is no longer carried out: the fish fractions of the rubber swab are lifted to the surface. This saves fuel cost, and well production potentials are obtained. The average time to do the swab work after the application of the modified tool is 8 hours.
Keywords: rubber swab, swab modification, brush, fishing rubber swab, saving cost
Procedia PDF Downloads 167
17183 General Purpose Graphic Processing Units Based Real Time Video Tracking System
Authors: Mallikarjuna Rao Gundavarapu, Ch. Mallikarjuna Rao, K. Anuradha Bai
Abstract:
Real-time video tracking is a challenging task for computing professionals. The performance of video tracking techniques is greatly affected by the background detection and elimination process. Local regions of the image frame contain vital information about background and foreground. However, pixel-level processing of local regions consumes a large amount of computational time and memory space with traditional approaches. In our approach, we have explored the concurrent computational ability of General Purpose Graphic Processing Units (GPGPU) to address this problem. The Gaussian Mixture Model (GMM) with adaptive weighted kernels is used for detecting the background. The weights of the kernel are influenced by local regions and are updated by inter-frame variations of these corresponding regions. The proposed system has been tested with GPU devices such as the GeForce GTX 280 and Quadro K2000. The results are encouraging, with a maximum speedup of 10X compared to the sequential approach.
Keywords: connected components, embrace threads, local weighted kernel, structuring elements
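The per-pixel background update that the GPU parallelises can be sketched in a much-simplified, CPU-only form; this single-Gaussian running estimate stands in for the full adaptive-weighted GMM, purely as an illustration:

```python
import numpy as np

# Simplified per-pixel background model in the spirit of GMM background
# subtraction: each pixel keeps a running mean and variance, and a pixel
# is foreground when it deviates by more than k standard deviations.
# The GPU version runs this update concurrently for every pixel.
def update_background(frame, mean, var, lr=0.05, k=2.5):
    fg = np.abs(frame - mean) > k * np.sqrt(var)   # foreground mask
    mean = (1 - lr) * mean + lr * frame            # running mean update
    var = (1 - lr) * var + lr * (frame - mean) ** 2
    return fg, mean, var

frame = np.array([[10.0, 200.0]])   # one row, two pixels
mean = np.array([[10.0, 10.0]])
var = np.array([[4.0, 4.0]])
fg, mean, var = update_background(frame, mean, var)
# first pixel matches the model; second is flagged as foreground
```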
Procedia PDF Downloads 440
17182 Removal of Vanadium from Industrial Effluents by Natural Ion Exchanger
Authors: Shashikant R. Kuchekar, Haribhau R. Aher, Priti M. Dhage
Abstract:
The removal of vanadium from aqueous solution using a natural exchanger was investigated. The effects of pH, contact time, and exchanger dose were studied at ambient temperature (25 °C ± 2 °C). The equilibrium process was described by the Langmuir isotherm model with an adsorption capacity for vanadium. The natural exchanger, i.e., Tamarindus indica seed powder, was treated with formaldehyde and sulphuric acid to increase the adsorptivity of metals. The maximum exchange level attained was 80.1% at pH 3 with an exchanger dose of 5 g and a contact time of 60 min. The method is applied to the removal of vanadium from industrial effluents.
Keywords: industrial effluent, natural ion exchange, Tamarindus indica, vanadium
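The Langmuir isotherm used to describe the equilibrium can be sketched as follows; the q_max and K values are illustrative, not the constants fitted in the study:

```python
# Langmuir isotherm: q = q_max * K * C / (1 + K * C), where q is the
# amount adsorbed at equilibrium concentration C, q_max the monolayer
# capacity and K the Langmuir constant. Parameter values are made up.
def langmuir(C, q_max, K):
    return q_max * K * C / (1 + K * C)

q = langmuir(C=5.0, q_max=12.0, K=0.8)  # uptake at C = 5 (e.g. mg/L)
```

At high concentration the expression saturates toward q_max, the monolayer capacity, which is the signature of Langmuir-type adsorption.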
Procedia PDF Downloads 251
17181 A Mathematical Model of Blood Perfusion Dependent Temperature Distribution in Transient Case in Human Dermal Region
Authors: Yogesh Shukla
Abstract:
Many attempts have been made to study the temperature distribution problem in human tissues under normal environmental and physiological conditions at constant arterial blood temperature, but very few have investigated temperature distribution in human tissues under different arterial blood temperatures. In view of the above, a finite element model has been developed for the unsteady temperature distribution in the dermal region of the human body. The model has been developed for the one-dimensional unsteady-state case. The variation of parameters like thermal conductivity, blood mass flow, and metabolic activity with respect to position and time has been incorporated in the model. Appropriate boundary conditions have been framed. The central difference approach has been used for the space variable, and the trapezoidal rule has been employed along the time variable. Numerical results have been obtained to study the relationship between temperature and time.
Keywords: rate of metabolism, blood mass flow rate, thermal conductivity, heat generation, finite element method
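The central-difference spatial discretisation with time stepping described above can be sketched for a generic 1-D bioheat-type equation dT/dt = a d²T/dx² + S, where S lumps blood perfusion and metabolic heat; the explicit scheme, coefficients, and grid below are illustrative assumptions, not the paper's model:

```python
import numpy as np

def step(T, a, S, dx, dt):
    """One explicit time step: central difference in space for d2T/dx2,
    interior nodes only; boundary nodes are held fixed (Dirichlet)."""
    T_new = T.copy()
    T_new[1:-1] = T[1:-1] + dt * (
        a * (T[2:] - 2 * T[1:-1] + T[:-2]) / dx**2 + S)
    return T_new

# Illustrative profile from skin surface (33 C) toward the body core (37 C);
# a*dt/dx**2 = 0.25 satisfies the explicit stability bound of 0.5.
T = np.array([33.0, 35.0, 36.0, 36.5, 37.0])
for _ in range(10):
    T = step(T, a=1e-3, S=0.0, dx=0.002, dt=0.001)
```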
Procedia PDF Downloads 353
17180 On the Dwindling Supply of the Observable Cosmic Microwave Background Radiation
Authors: Jia-Chao Wang
Abstract:
The cosmic microwave background radiation (CMB) freed during the recombination era can be considered a photon source of short duration: a one-time event that happened everywhere in the universe simultaneously. If space is divided into concentric shells centered at an observer's location, one can imagine that the CMB photons originating from nearby shells reach and pass the observer first, and those in shells farther away follow as time goes forward. In the Big Bang model, space expands rapidly in a time-dependent manner, as described by the scale factor. This expansion results in an event horizon coincident with one of the shells, and its radius can be calculated using cosmological calculators available online. Using Planck 2015 results, its value during the recombination era at cosmological time t = 0.379 million years (My) is calculated to be Revent = 56.95 million light-years (Mly). The event horizon sets a boundary beyond which the freed CMB photons will never reach the observer. The photons within the event horizon also exhibit a peculiar behavior. Calculated results show that the CMB observed today was freed in a shell located 41.8 Mly away (inside the boundary set by Revent) at t = 0.379 My. These photons traveled 13.8 billion years (Gy) to reach here. Similarly, the CMB reaching the observer at t = 1, 5, 10, 20, 40, 60, 80, 100 and 120 Gy is calculated to have originated at shells of R = 16.98, 29.96, 37.79, 46.47, 53.66, 55.91, 56.62, 56.85 and 56.92 Mly, respectively. The results show that as time goes by, the R value approaches Revent = 56.95 Mly but never exceeds it, consistent with the earlier statement that beyond Revent the freed CMB photons will never reach the observer. The difference Revent - R can be used as a measure of the remaining observable CMB photons. Its value becomes smaller and smaller as R approaches Revent, indicating a dwindling supply of the observable CMB radiation.
In this paper, detailed dwindling effects near the event horizon are analyzed with the help of online cosmological calculators based on the lambda cold dark matter (ΛCDM) model. It is demonstrated in the literature that if the CMB is assumed to be a blackbody at recombination (about 3000 K), then it will remain so over time under cosmological redshift and homogeneous expansion of space, but with the temperature lowered (2.725 K now). The present result suggests that the observable CMB photon density, besides changing with space expansion, can also be affected by the dwindling supply associated with the event horizon. This raises the question of whether the blackbody spectrum of the CMB at recombination can remain so over time. Being able to explain the blackbody nature of the observed CMB is an important part of the success of the Big Bang model. The present results cast some doubt on that and suggest that the model may have an additional challenge to deal with.
Keywords: blackbody of CMB, CMB radiation, dwindling supply of CMB, event horizon
Procedia PDF Downloads 119
17179 Bianchi Type-I Viscous Fluid Cosmological Models with Stiff Matter and Time Dependent Λ-Term
Authors: Rajendra Kumar Dubey
Abstract:
Einstein’s field equations with a variable cosmological term Λ are considered in the presence of a viscous fluid for the Bianchi type I space-time. Exact solutions of Einstein’s field equations are obtained by assuming the cosmological term Λ proportional to a power of the scale factor (R is the scale factor and m is a constant). We observed that the shear viscosity is found to be responsible for faster removal of the initial anisotropy in the universe. The physical significance of the cosmological models has also been discussed.
Keywords: Bianchi type I cosmological model, viscous fluid, cosmological constant Λ
Procedia PDF Downloads 528