Search results for: explanations for the probable causes of the errors
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1404

744 Ophthalmic Services Covered by Albasar International Foundation in Sudan

Authors: Mohammad Ibrahim

Abstract:

The study was conducted at Albasar International Foundation ophthalmic hospitals in Sudan to assess the burden and patterns of ophthalmic disorders in the sector. A review of the hospitals' records revealed that the total number of patients examined in the hospitals and in outreach camps conducted by the hospitals is 10,513,874, the total number of surgeries is 694,015, and the total number of pupils covered by the school program is 230,382. The organization operates with a strong management system, high standards, and quality-oriented, result-based planning. The study showed that ophthalmic problems in Sudan account for a large share of the disease burden and that temporary blindness disorders are prevalent, since the major cases and surgeries were cataract (57.8%), retinal problems (2.9%), glaucoma (2.4%), and orbit and oculo-plastic disorders (2.2%); other disorders include refractive errors, squint and strabismus, corneal, pediatric, and minor ophthalmic disorders.

Keywords: hospital and outreach ophthalmic services, largest coverage of ophthalmic services, nonprofit ophthalmic services, strong management system and standards

Procedia PDF Downloads 414
743 Evaluation of Redundancy Architectures Based on System on Chip Internal Interfaces for Future Unmanned Aerial Vehicles Flight Control Computer

Authors: Sebastian Hiergeist

Abstract:

It is a common view that Unmanned Aerial Vehicles (UAV) will increasingly migrate into civil airspace. This trend challenges UAV manufacturers in many ways, as it brings a host of new requirements and functional aspects. At the higher application levels these might be collision detection and avoidance and similar features, whereas all these functions only act as input for the flight control components of the aircraft. The flight control computer (FCC) is the central component when it comes to ensuring a continuously safe flight and landing. As these systems are flight-critical, they have to be built redundantly to be able to provide fail-operational behavior. Recent architectural approaches to FCCs used in UAV systems are often based on very simple microprocessors combined with proprietary Application-Specific Integrated Circuit (ASIC) or Field Programmable Gate Array (FPGA) extensions implementing the whole redundancy functionality. In the future, such simple microprocessors may no longer be available, as they are increasingly replaced by more sophisticated Systems on Chip (SoC). Since the avionics industry cannot provide enough market power to significantly influence the development of new semiconductor products, the use of solutions from other markets is almost inevitable. Products from the industrial market, developed according to IEC 61508, or automotive SoCs, developed according to ISO 26262, can be seen as candidates, as they have been designed for similar environments. Currently available SoCs from the industrial or automotive sector provide quite a broad selection of interfaces, e.g., Ethernet, SPI, or FlexRay, that might be considered for the implementation of a redundancy network. In this context, possible network architectures that could be established using the interfaces stated above shall be investigated.
Of importance here is the avoidance of any single point of failure, as well as proper segregation into distinct fault containment regions. The analysis is supported by guidelines on the reliability of data networks published by the aviation authorities (FAA and EASA). The main focus clearly lies on the achievable level of safety, but other aspects like performance and determinism also play an important role and are considered in the research. Due to the further increase in the design complexity of recent and future SoCs, the risk of design errors, which might lead to common-mode faults, also increases. Thus, in the context of this work, the aspect of dissimilarity is also considered to limit the effect of design errors. To achieve this, the work is limited to broadly available interfaces offered in products from the most common silicon manufacturers. The resulting work shall support the design of future UAV FCCs by providing a guideline for building up a redundancy network between SoCs using only on-board interfaces. To this end, the author provides a detailed usability analysis of the interfaces offered by recent SoC solutions, suggestions for possible redundancy architectures based on these interfaces, and an assessment of the most relevant characteristics of the suggested network architectures, e.g., safety and performance.

Keywords: redundancy, System-on-Chip, UAV, flight control computer (FCC)

Procedia PDF Downloads 221
742 Functioning of Public Distribution System and Calories Intake in the State of Maharashtra

Authors: Balasaheb Bansode, L. Ladusingh

Abstract:

The public distribution system (PDS) is an important component of food security. It is a massive welfare program undertaken by the Government of India and, since India is a federal state, implemented by the state governments, with multiple objectives such as eliminating hunger, reducing malnutrition, and making food consumption affordable. The program reaches the community level through various agencies of the government. The paper focuses on the accessibility of the PDS at the household level and on how the present policy framework results in exclusion and inclusion errors. It explores the sanctioned food grain quantity received at the household level under ration cards differentiated by income criteria, and it highlights the types of corruption in food distribution generated by the PDS. The data used are secondary, from NSSO round 68 conducted in 2012. Bivariate and multivariate techniques have been used to understand the working of the PDS and food consumption.

Keywords: calorie intake, entitled food quantity, poverty alleviation through PDS, targeting error

Procedia PDF Downloads 336
741 Response of Wheat and Lentil to Herbicides Applied in the Preceding Non-Puddled Transplanted Rainy Season Rice

Authors: Taslima Zahan

Abstract:

A field study was conducted in 2013-14 and 2014-15, following a bioassay technique, to determine the carryover effect of herbicides applied in rainy season rice on the growth and yield of two probable succeeding crops of rice, viz., wheat and lentil. Rice seedlings were transplanted in a strip-tilled non-puddled field, and five herbicides, namely pyrazosulfuron-ethyl, butachlor, orthosulfamuron, butachlor + propanil, and 2,4-D amine, were applied to the rice at their recommended rates and times as eight treatment combinations and compared with an untreated control. Residual effects of these rice herbicides on the succeeding wheat and lentil were examined by a micro-plot bioassay technique. The study revealed that the germination of wheat and lentil seeds was not affected by the residue of herbicides applied in the preceding rainy season rice. Shoot lengths of wheat and lentil seedlings in herbicide-treated plots also did not differ significantly from those in untreated control plots. Herbicide-treated wheat plots had leaf chlorophyll contents 1.8-14.0% higher than control plots on average, while for lentil the herbicide-treated plots showed only a negligible reduction in leaf chlorophyll content compared with control plots. Grain yields of wheat and lentil in herbicide-treated plots were higher than in control plots by 2.8-6.6% and 0.2-10.9%, respectively. Therefore, the two-year bioassay study indicates that the tested herbicides applied in rainy season rice under strip-tilled non-puddled conditions had no adverse residual effect on the growth and yield of the succeeding wheat and lentil.

Keywords: crop sensitivity, herbicide persistence, minimum tillage rice, yield improvement

Procedia PDF Downloads 161
740 Simulation and Synoptic Investigation of a Severe Dust Storm in Urmia Lake in the Middle East

Authors: Nasim Hossein Hamzeh, Karim Shukurov, Abbas Ranjbar Saadat Abadi, Alaa Mhawish, Christian Opp

Abstract:

Deserts are the main dust sources in the world. Recently dried lake beds have also caused environmental problems in their surrounding areas. In this study, Urmia Lake was the source of dust from April 24 to April 25, 2017. The local dust storm combined with another large-scale dust storm that had originated from Saudi Arabia and Iraq 1-2 days earlier. Synoptic investigation revealed that the severe dust storm was driven by a strong Black Sea cyclone and a low-pressure system over the Middle East and central Iraq, in conjunction with a high-pressure system, associated with a strong pressure gradient and a quasi-stationary long-wave trough over the east and south of the Mediterranean Sea. Based on HYSPLIT 72-hour backward and forward trajectories, the most probable dust transport routes to and from the Urmia Lake region were estimated. Using the concentration weighted trajectory (CWT) method based on 24-hour backward and 24-hour forward trajectories, the spatial distributions of potential sources of the PM10 observed in the Urmia Lake region on April 23-26, 2017 were mapped. The vertical profile of dust particles simulated with the WRF-Chem model using two dust schemes showed dust ascending up to 5 km from the lake. The dust scheme outputs also show that the modeled PM10 fluctuations occur about 12 hours earlier than the measured surface PM10 at five air pollution monitoring stations around Urmia Lake on April 23-26, 2017.
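
The CWT method mentioned above weights each grid cell by the receptor concentrations of the trajectories passing through it. As a minimal sketch (the grid, trajectory endpoints, and PM10 values below are hypothetical, not the study's data):

```python
# Illustrative sketch of the concentration weighted trajectory (CWT) method:
# CWT_ij = sum_l(c_l * tau_ijl) / sum_l(tau_ijl), where c_l is the receptor
# concentration of trajectory l and tau_ijl its endpoint count in cell (i, j).

def cwt(grid_shape, trajectories, concentrations):
    num = [[0.0] * grid_shape[1] for _ in range(grid_shape[0])]
    den = [[0.0] * grid_shape[1] for _ in range(grid_shape[0])]
    for endpoints, c in zip(trajectories, concentrations):
        for (i, j) in endpoints:
            num[i][j] += c
            den[i][j] += 1.0
    return [[num[i][j] / den[i][j] if den[i][j] else 0.0
             for j in range(grid_shape[1])]
            for i in range(grid_shape[0])]

# Two back trajectories with receptor PM10 of 80 and 20 ug/m3:
trajs = [[(0, 0), (0, 1)], [(0, 0), (1, 1)]]
field = cwt((2, 2), trajs, [80.0, 20.0])
print(field[0][0])  # cell visited by both trajectories -> 50.0
```

Cells crossed only by high-concentration trajectories retain high CWT values and thus flag potential source regions.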

Keywords: dust storm, synoptic investigation, WRF-Chem model, Urmia Lake, Lagrangian trajectory

Procedia PDF Downloads 214
739 Unveiling Special Policy Regime, Judgment, and Taylor Rules in Tunisia

Authors: Yosra Baaziz, Moez Labidi

Abstract:

Given the limited research on monetary policy rules in revolutionary countries, this paper challenges the suitability of the Taylor rule for characterizing the monetary policy behavior of the Central Bank of Tunisia (BCT), especially in turbulent times. More specifically, we investigate the possibility that the Taylor rule should be formulated as a threshold process and examine the validity of such a nonlinear Taylor rule as a robust rule for conducting monetary policy in Tunisia. Using quarterly data from 1998:Q4 to 2013:Q4 to analyze the movement of the BCT's nominal short-term interest rate, we find that the nonlinear Taylor rule improves its performance with the advent of special events, thus providing a better description of Tunisian interest rate setting. In particular, our results show that adopting an appropriate nonlinear approach reduces the errors by 150 basis points in 1999 and 2009, and by 60 basis points in 2011, relative to the linear approach.
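
The contrast between a linear and a threshold Taylor rule can be sketched as follows; the coefficients, inflation target, and threshold below are hypothetical illustrations, not the estimated BCT parameters:

```python
# Hedged sketch of a linear vs. threshold (nonlinear) Taylor rule.

def taylor_linear(inflation, output_gap, r_star=2.0, pi_star=3.0,
                  a=0.5, b=0.5):
    """Standard linear rule: i = r* + pi + a*(pi - pi*) + b*gap."""
    return r_star + inflation + a * (inflation - pi_star) + b * output_gap

def taylor_threshold(inflation, output_gap, threshold=3.0, **kw):
    """Threshold rule: the inflation-response coefficient switches regime
    when inflation crosses the threshold."""
    a = 1.5 if inflation > threshold else 0.25
    return taylor_linear(inflation, output_gap, a=a, **kw)

print(taylor_linear(5.0, -1.0))     # 7.5
print(taylor_threshold(5.0, -1.0))  # high-inflation regime: 9.5
```

In a threshold specification like this one, the rule reacts much more aggressively once inflation passes the regime boundary, which is what allows it to track policy better around special events.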

Keywords: policy rule, central bank, exchange rate, taylor rule, nonlinearity

Procedia PDF Downloads 296
738 Challenges and Recommendations for Medical Device Tracking and Traceability in Singapore: A Focus on Nursing Practices

Authors: Zhuang Yiwen

Abstract:

The paper examines the challenges facing the Singapore healthcare system in the tracking and traceability of medical devices. One of the major challenges identified is the lack of a standard coding system for medical devices, which makes it difficult to track them effectively. The paper suggests adopting the Unique Device Identifier (UDI) as a single standard for medical devices to improve tracking and reduce errors. The paper also explores the use of barcoding and image recognition to identify and document medical devices in nursing practice. In nursing practice, the use of barcodes for identifying medical devices is common. However, the information contained in these barcodes is often inconsistent, making it challenging to identify which segment contains the model identifier. The use of barcodes may be improved with UDI, but many subsidized accessories may still lack barcodes. The paper suggests that readiness for UDI and barcode standardization requires standardized information, fields, and logic in electronic medical record (EMR), operating theatre (OT), and billing systems, as well as barcode scanners that can read various formats and selectively parse barcode segments. Nursing workflow and data flow also need to be taken into account. The paper further explores the use of image recognition, specifically the Tesseract OCR engine, to identify and document implants in public hospitals, given the limitations of barcode scanning. The study found that this solution requires an implant information database and checking of the recognition output against that database. It also requires customization of the algorithm, cropping out objects that affect text recognition, and applying image adjustments. The solution entails additional resources and the cost of a mobile/hardware device, which may pose space constraints and must meet sterility criteria. Integration with the EMR is also necessary, and the solution requires changes to the user's workflow.
The paper suggests the long-term use of Systematized Nomenclature of Medicine Clinical Terms (SNOMED CT) as a supporting terminology to improve clinical documentation and data exchange in healthcare. SNOMED CT provides a standardized way of documenting and sharing clinical information with respect to procedure, patient, and device documentation, which can facilitate interoperability and data exchange. In conclusion, the paper highlights the challenges facing the Singapore healthcare system in the tracking and traceability of medical devices. It suggests UDI adoption and barcode standardization to improve tracking and reduce errors, and it explores the use of image recognition to identify and document medical devices in nursing practice. The paper emphasizes the importance of standardized information, fields, and logic in EMR, OT, and billing systems, as well as barcode scanners that can read various formats and selectively parse barcode segments. These recommendations could help the Singapore healthcare system improve the tracking and traceability of medical devices and ultimately enhance patient safety.
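
The "selectively parse barcode segments" requirement can be illustrated with a minimal parser for a GS1-format UDI string. This is a simplified sketch: it covers only four common Application Identifiers, and real barcodes carry more AIs and symbology-specific framing; the sample barcode content is invented:

```python
# Minimal sketch of splitting a GS1-format UDI barcode into its segments.
# AI -> (field name, fixed length, or None for variable-length fields,
# which are terminated by a GS (group separator) character or end of data).
GS1_AIS = {"01": ("gtin", 14), "17": ("expiry", 6),
           "10": ("lot", None), "21": ("serial", None)}
GS = "\x1d"

def parse_udi(data):
    fields, pos = {}, 0
    while pos < len(data):
        name, length = GS1_AIS[data[pos:pos + 2]]
        pos += 2
        if length is None:          # variable length: read to GS or end
            end = data.find(GS, pos)
            end = len(data) if end == -1 else end
            fields[name] = data[pos:end]
            pos = end + 1
        else:                       # fixed length
            fields[name] = data[pos:pos + length]
            pos += length
    return fields

udi = "010841234567890517271130" + "10LOT42" + GS + "21SN100"
print(parse_udi(udi))
# {'gtin': '08412345678905', 'expiry': '271130', 'lot': 'LOT42', 'serial': 'SN100'}
```

The point of the sketch is that a scanner integration needs exactly this kind of AI table and separator logic to find the model identifier (GTIN) reliably, regardless of which other segments a vendor packs into the barcode.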

Keywords: medical device tracking, unique device identifier, barcoding and image recognition, systematized nomenclature of medicine clinical terms

Procedia PDF Downloads 79
737 Assessment of Menus in a Selected Social Welfare Home with Regard to Nutritional Recommendations

Authors: E. Grochowska-Niedworok, K. Brukalo, B. Całyniuk, J. Piekorz, M. Kardas

Abstract:

The aim of the study was to assess the diets of residents of nursing homes. Ten-day menus provided by a social welfare home were entered into the computer program Diet 5 and analyzed with respect to protein, fats, carbohydrates, energy, vitamin D, and calcium. The resulting mean values of the 10-day menus were compared with the existing nutrition standards for the Polish population. The analysis of the menus showed that the average amount of energy supplied by the food is insufficient, while the carbohydrate supply is too high, at 257% of the norm. The average amounts of fats and proteins supplied with the food are adequate, at 85.2 g/day and 75.2 g/day, respectively. The calcium content of the diet is 513.9 mg/day, and the amount of vitamin D supplied in the 51-65 age group is 2.3 µg/day. The dietary errors that were demonstrated stem from the lack of detailed nutritional guidelines for nursing homes, and for state-owned care facilities in general.

Keywords: assessment of diet, essential nutrients, social welfare home, nutrition

Procedia PDF Downloads 152
736 Study of ANFIS and ARIMA Model for Weather Forecasting

Authors: Bandreddy Anand Babu, Srinivasa Rao Mandadi, C. Pradeep Reddy, N. Ramesh Babu

Abstract:

This paper briefly presents a comparative investigation of Auto-Regressive Integrated Moving Average (ARIMA) and Adaptive Network-Based Fuzzy Inference System (ANFIS) models for weather forecasting. The weather data are taken from the University of Waterloo and comprise relative humidity, ambient air temperature, barometric pressure, and wind direction. The performance of the ARIMA and ANFIS models is compared using the sum of average errors. The ANFIS modeling is carried out with the Fuzzy Logic Toolbox in MATLAB, while the ARIMA modeling is produced using the XLSTAT software.
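
As a minimal illustration of the autoregressive core of an ARIMA model, the AR(1) coefficient can be estimated by least squares; the synthetic series below is illustrative, not the Waterloo weather data used in the paper:

```python
# Fit phi in the AR(1) model x_t = phi * x_{t-1} + e_t by least squares:
# phi = sum(x_t * x_{t-1}) / sum(x_{t-1}^2).

def fit_ar1(series):
    num = sum(series[t] * series[t - 1] for t in range(1, len(series)))
    den = sum(series[t - 1] ** 2 for t in range(1, len(series)))
    return num / den

# Noiseless synthetic series generated with phi = 0.8:
x = [1.0]
for _ in range(50):
    x.append(0.8 * x[-1])
print(round(fit_ar1(x), 3))  # -> 0.8
```

Full ARIMA adds differencing (the "I") and moving-average error terms on top of this autoregressive component.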

Keywords: ARIMA, ANFIS, fuzzy logic toolbox, weather forecasting, MATLAB

Procedia PDF Downloads 420
735 Predictive Models of Ruin Probability in Retirement Withdrawal Strategies

Authors: Yuanjin Liu

Abstract:

Retirement withdrawal strategies are very important for minimizing the probability of ruin in retirement. The ruin probability is modeled as a function of initial withdrawal age, gender, asset allocation, inflation rate, and initial withdrawal rate. It is obtained by simulation based on the 2019 Social Security period life table, the IRS Required Minimum Distribution (RMD) worksheets, US historical bond and equity returns, and historical inflation rates. Several popular machine learning models are built: a generalized additive model, random forest, support vector machine, extreme gradient boosting, and an artificial neural network. Model validation and selection are based on test errors using hyperparameter tuning and a train-test split. The optimal model is recommended for retirees to monitor the ruin probability, and the optimal withdrawal strategy can be obtained from the optimal predictive model.
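
The simulation underlying such a model can be sketched with a bare-bones Monte Carlo ruin estimate; the return distribution, horizon, and withdrawal rates below are illustrative stand-ins, not the paper's calibrated mortality and market inputs:

```python
import random

# Hedged Monte Carlo sketch: estimate the probability that a fixed real
# withdrawal exhausts the portfolio within a given horizon.

def ruin_probability(initial=1.0, withdrawal_rate=0.04, years=30,
                     mean_ret=0.05, sd_ret=0.12, n_sims=10000, seed=1):
    rng = random.Random(seed)
    withdrawal = initial * withdrawal_rate   # fixed real annual withdrawal
    ruined = 0
    for _ in range(n_sims):
        balance = initial
        for _ in range(years):
            # withdraw at the start of the year, then apply a random return
            balance = (balance - withdrawal) * (1 + rng.gauss(mean_ret, sd_ret))
            if balance <= 0:
                ruined += 1
                break
    return ruined / n_sims

p = ruin_probability()
print(f"estimated 30-year ruin probability: {p:.3f}")
```

Running this grid over ages, allocations, and withdrawal rates produces the training data on which the predictive models in the abstract are fitted.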

Keywords: ruin probability, retirement withdrawal strategies, predictive models, optimal model

Procedia PDF Downloads 74
734 Partial Purification and Characterization of a Low Molecular Weight and Industrially Important Chitinase and a Chitin Deacetylase Enzyme from Streptomyces Chilikensis RC1830, a Novel Strain Isolated from Chilika Lake, India

Authors: Lopamudra Ray, Malla Padma, Dibya Bhol, Samir Ranjan Mishra, A. N. Panda, Gurdeep Rastogi, T. K. Adhya, Ajit Kumar Pattnaik, Mrutyunjay Suar, Vishakha Raina

Abstract:

Chilika Lake is the largest coastal estuarine brackish water lagoon in Asia, situated on the east coast of India, and is a designated Ramsar site. In the current study, several chitinolytic microorganisms were isolated and screened by the appearance of clearance zones on 0.5% colloidal chitin agar plates. A strain designated RC1830 displayed maximum colloidal chitin degradation, releasing 112 μmol/ml/min of N-acetyl-D-glucosamine (GlcNAc) in 48 h. The strain was taxonomically identified by a polyphasic approach based on a range of phenotypic and genotypic properties and was found to be a novel species, named Streptomyces chilikensis RC1830. The organism is halophilic (12% NaCl w/v) and alkaliphilic (pH 10) and is capable of hydrolyzing chitin, starch, cellulose, gelatin, casein, tributyrin, and Tween 80. Partial purification of the chitinase enzymes from RC1830 was performed by DEAE-Sephacel anion exchange chromatography, which revealed the presence of a very low molecular weight chitinase (10.5 kDa) that may be a chitobiosidase. The study thus reports a low molecular weight chitinase (10.5 kDa) and a chitin deacetylase from the novel Streptomyces strain RC1830 isolated from Chilika Lake; chitinases smaller than 20.5 kDa have not previously been reported from any other Streptomyces species. The enzymes were characterized with respect to optimum pH, temperature, substrate specificity, and temperature stability.

Keywords: chitinases, chitobiosidase, Chilika Lake, India

Procedia PDF Downloads 501
733 Blended Learning in a Mathematics Classroom: A Focus in Khan Academy

Authors: Sibawu Witness Siyepu

Abstract:

This study explores the effects of instructional design using blended learning on the learning of radian measures among engineering students. Blended learning is an education programme that combines online digital media with traditional classroom methods. It requires the physical presence of both lecturer and student in a mathematics computer laboratory. Blended learning provides an element of learner control over time, place, path, or pace. The focus was on the use of Khan Academy to supplement traditional classroom interactions. Khan Academy is a non-profit educational organisation created by educator Salman Khan with the goal of creating an accessible place for students to learn by watching videos in a computer-assisted environment. The researcher, who is also a lecturer in a mathematics support programme, collected data by instructing students to watch Khan Academy videos on radian measures and by supplying students with traditional classroom activities. The classroom activities entailed radian measure exercises extracted from the Internet. Students were given an opportunity to engage in class discussions, social interactions, and collaboration. These activities required students to write formative assessment tests. The purpose of the formative assessment tests was to find out about the students' understanding of radian measures, including the errors and misconceptions they displayed in their calculations. Identification of errors and misconceptions serves as a pointer to students' weaknesses and strengths in their learning of radian measures. At the end of data collection, semi-structured interviews were administered to a purposefully sampled group to explore their perceptions of, and feedback on, the use of the blended learning approach in the teaching and learning of radian measures. The study employed the Algebraic Insight Framework to analyse the data collected.
The Algebraic Insight Framework is a subset of symbol sense that allows a student to enter expressions into a computer-assisted system correctly and efficiently. This study offers students opportunities to enter topics and subtopics on radian measures into a computer through the lens of Khan Academy. Khan Academy demonstrates the procedures followed to reach the solutions of mathematical problems. The researcher performed the task of explaining mathematical concepts and facilitated the process of reinventing rules and formulae in the learning of radian measures. Lastly, activities that reinforce students' understanding of radians were distributed. Results showed that the study enthused the students in their learning of radian measures. Learning through videos prompted the students to ask questions, which brought clarity and sense-making to the classroom discussions. The data revealed that sense-making through the reinvention of rules and formulae assisted the students in enhancing their learning of radian measures. This study recommends that the use of Khan Academy in blended learning be introduced as a socialisation programme for all first-year students. This will prepare students who are unfamiliar with computers to become conversant with Khan Academy as a powerful tool in the learning of mathematics. Khan Academy is a key technological tool that is pivotal for the development of students' autonomy in the learning of mathematics and that promotes collaboration with lecturers and peers.

Keywords: algebraic insight framework, blended learning, Khan Academy, radian measures

Procedia PDF Downloads 311
732 Trinary Affinity—Mathematic Verification and Application (1): Construction of Formulas for the Composite and Prime Numbers

Authors: Liang Ming Zhong, Yu Zhong, Wen Zhong, Fei Fei Yin

Abstract:

Trinary affinity is a description of existence: every object exists as it is known and spoken of, in a system of 2 differences (denoted dif₁, dif₂) and 1 similarity (Sim), equivalently expressed as dif₁ / Sim / dif₂ and kn / 0 / tkn (kn = the known, tkn = the 'to be known', 0 = the zero point of knowing). These are mathematically verified and illustrated in this paper by the arrangement of all integers into 3 columns, where each number exists as a difference in relation to another number as another difference, with the 2 difs arbitrated by a third number as the Sim, resulting in a trinary affinity or trinity of 3 numbers, of which one is the known, another the 'to be known', and the third the zero (0) from which both the kn and tkn are measured and specified. Consequently, any number is horizontally specified as 3n, '3n - 1', or '3n + 1', and vertically as 'Cn + c', so that any number occurs at the intersection of its X and Y axes and is represented by its X and Y coordinates, like any point on Earth's surface by its latitude and longitude. Technically, i) primes are viewed and treated as progenitors, and composites as descending from them, forming families of composites, each capable of being measured and specified from its own zero, called in this paper the realistic zero (denoted 0r, as contrasted with the mathematic zero, 0m), which corresponds to the constant c and whose nature separates the composite and prime numbers; and ii) any number is considered as having a magnitude as well as a position, so that a number is verified as a prime first by referring to its descriptive formula and then by making sure that no composite number can possibly occur at its position, by dividing it with factors provided by the composite number formulas. The paper consists of 3 parts: 1) a brief explanation of the trinary affinity of things, 2) the 8 formulas that represent ALL the primes, and 3) families of composite numbers, each represented by a formula.
A composite number family is described as 3n + f₁·f₂. Since there are infinitely many composite number families, to verify the primality of a great probable prime we have to divide it by several or many an f₁ from a range of composite number formulas, a procedure that is as laborious as it is the surest way to verify a great number's primality. (Thus, it is possible to substitute planned division for trial division.)
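
The three-column arrangement described above can be sketched computationally: every integer falls into the 3n, 3n - 1, or 3n + 1 column, and every prime greater than 3 must lie in one of the two outer columns. The primality check by planned division is simplified here to ordinary trial division, purely as an illustration:

```python
# Illustrative sketch of the three-column (mod 3) classification of integers.

def column(m):
    """Return '3n', '3n-1' or '3n+1' for integer m."""
    return {0: "3n", 1: "3n+1", 2: "3n-1"}[m % 3]

def is_prime(m):
    """Plain trial division stand-in for the paper's planned division."""
    if m < 2:
        return False
    f = 2
    while f * f <= m:
        if m % f == 0:
            return False  # m belongs to a composite family with factor f
        f += 1
    return True

print(column(12), column(13), column(14))  # 3n 3n+1 3n-1
# No prime above 3 occupies the 3n column:
print(all(column(p) != "3n" for p in range(4, 500) if is_prime(p)))  # True
```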

Keywords: trinary affinity, difference, similarity, realistic zero

Procedia PDF Downloads 212
731 A Historical Analysis of The Concept of Equivalence from Different Theoretical Perspectives in Translation Studies

Authors: Amenador Kate Benedicta, Wang Zhiwei

Abstract:

Since the later part of the 20th century, the notion of equivalence has remained a central and critical concept in the development of translation theory. After decades of argument over word-for-word and free translation methods, scholars attempting to develop more systematic and efficient translation theories began to focus on fundamental translation concepts such as equivalence. Although the concept of equivalence has piqued the interest of many scholars, its definition, scope, and applicability have sparked contentious arguments within the discipline. As a result, several distinct theories and explanations of the concept of equivalence have been put forward over the last half-century. This study therefore explores and discusses the evolution of the critical concept of equivalence in translation studies through a bibliometric investigation of print and digital books and articles, analyzing different scholars' key contributions to, and the limitations of, equivalence from various theoretical perspectives. In the analysis, emphasis is placed on the innovations that each theory has brought to the comprehension of equivalence. The article begins by discussing the contributions of linguistically motivated theories to the notion of equivalence in translation, followed by functionalist-oriented contributions, before moving on to more recent advancements in translation studies on the concept. Because equivalence is such a broad notion, it is impossible to discuss each researcher in depth; as a result, the most well-known names and their theories of equivalence are compared and contrasted in this research. The study emphasizes the developmental progression in our comprehension of the equivalence concept and the equivalent effect. It concludes that the contributions of the various theoretical perspectives to the notion of equivalence complement one another and make up for each other's limitations.
The study also highlights how troublesome the concept of equivalence can become in identifying the nature of translation, and how central and unavoidable the concept is in every act of translation, despite its limitations. The significance of the study lies in its synthesis of the contributions and limitations of the various theories offered by scholars on the notion of equivalence, providing literature for both students and scholars in the field and insight for future theoretical development.

Keywords: equivalence, functionalist translation theories, linguistic translation approaches, translation theories, Skopos

Procedia PDF Downloads 113
730 Leverage Effect for Volatility with Generalized Laplace Error

Authors: Farrukh Javed, Krzysztof Podgórski

Abstract:

We propose a new model that accounts for the asymmetric response of volatility to positive ('good news') and negative ('bad news') shocks in economic time series, the so-called leverage effect. In the past, asymmetric powers of errors in conditionally heteroskedastic models have been used to capture this effect. Our model uses the gamma-difference representation of the generalized Laplace distributions, which efficiently models the asymmetry. It has one additional natural parameter, the shape, that is used instead of the power in asymmetric power models to capture the strength of a long-lasting effect of shocks. Some fundamental properties of the model are provided, including the formula for covariances and an explicit form for the conditional distribution of 'bad' and 'good' news processes given the past, a property that is important for the statistical fitting of the model. Relevant features of volatility models are illustrated using S&P 500 historical data.
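
The gamma-difference representation mentioned above can be sketched by simulation: a generalized asymmetric Laplace variable can be written as the difference of two independent gamma variables with a common shape and different scales, so the sign and size of the skew come from the scale imbalance. The parameter values below are illustrative, not fitted to data:

```python
import random

# Hedged sketch of the gamma-difference representation of a generalized
# asymmetric Laplace variable: Y = G+ - G-, with G+/- ~ Gamma(tau, scale).

def gen_asym_laplace(tau, scale_pos, scale_neg, rng):
    g_pos = rng.gammavariate(tau, scale_pos)  # 'good news' component
    g_neg = rng.gammavariate(tau, scale_neg)  # 'bad news' component
    return g_pos - g_neg

rng = random.Random(42)
draws = [gen_asym_laplace(tau=0.7, scale_pos=1.0, scale_neg=2.0, rng=rng)
         for _ in range(50000)]
mean = sum(draws) / len(draws)
# Theoretical mean is tau * (scale_pos - scale_neg) = 0.7 * (1 - 2) = -0.7
print(f"sample mean: {mean:.3f}")
```

Here the larger 'bad news' scale produces the negative skew that the leverage effect requires, while the shape parameter tau controls how heavy and persistent the shock contribution is.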

Keywords: heavy tails, volatility clustering, generalized asymmetric laplace distribution, leverage effect, conditional heteroskedasticity, asymmetric power volatility, GARCH models

Procedia PDF Downloads 386
729 Integrated Design of Froth Flotation Process in Sludge Oil Recovery Using Cavitation Nanobubbles for Increase the Efficiency and High Viscose Compatibility

Authors: Yolla Miranda, Marini Altyra, Karina Kalmapuspita Imas

Abstract:

Oily sludge wastes accumulate throughout upstream and downstream petroleum industry processes. Sludge still contains oil that can be used for energy. Recycling sludge is a way of handling it that reduces its toxicity, and recovering the remaining oil, around 20% of the sludge volume, is very probable. Froth flotation is a common chemical-unit method for separating fine solid particles from an aqueous suspension. The basic mechanism of froth flotation is the capture of oil droplets or small solids by air bubbles in an aqueous slurry, followed by their levitation and collection in a froth layer. The method is known for its modest energy requirements and ease of application, but low efficiency and the inability to treat high-viscosity feeds are the biggest problems of a froth flotation unit. This study presents a design that first manages the high viscosity of the sludge and then feeds the froth flotation unit, which includes a cavitation tube that reduces the bubbles to nanoscale particles. Recovery in flotation starts with the collision and adhesion of hydrophobic particles to air bubbles, followed by transport of the hydrophobic particle-bubble aggregates from the collection zone to the froth zone, drainage and enrichment of the froth, and finally overflow removal from the cell top. Effective particle separation by froth flotation relies on the efficient capture of hydrophobic particles by air bubbles in three steps, of which the most important is collision; decreasing the bubble size increases the collision rate and makes the process more efficient. The pre-treatment, froth flotation, and cavitation tube are integrated with each other, and the design shows the integrated unit and its process.

Keywords: sludge oil recovery, froth flotation, cavitation tube, nanobubbles, high viscosity

Procedia PDF Downloads 381
728 User Guidance for Effective Query Interpretation in Natural Language Interfaces to Ontologies

Authors: Aliyu Isah Agaie, Masrah Azrifah Azmi Murad, Nurfadhlina Mohd Sharef, Aida Mustapha

Abstract:

Natural language interfaces typically support a restricted language and have scopes and limitations that naïve users are unaware of, resulting in errors when those users attempt to retrieve information from ontologies. To overcome this challenge, an auto-suggest feature is introduced into the querying process, guiding users through query formulation with an interactive query construction system. Guiding users while leaving them an unconstrained (or almost unconstrained) way to query the ontology results in better interpretation of the query and ultimately leads to an effective search. The approach described in this paper is unobtrusive and guides users subtly, so that they can either select from the suggestion list or type in full. Users are not coerced into accepting system suggestions and can express themselves using fragments or full sentences.
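
The paper does not give implementation details; as a minimal illustration (the vocabulary terms below are hypothetical, not from the authors' ontology), a prefix-based auto-suggest over an ontology's term list can be sketched as:

```python
import bisect

class AutoSuggest:
    """Minimal prefix-based suggester over a sorted ontology vocabulary."""

    def __init__(self, terms):
        self.terms = sorted(t.lower() for t in terms)

    def suggest(self, prefix, limit=5):
        """Return up to `limit` vocabulary terms starting with `prefix`."""
        prefix = prefix.lower()
        i = bisect.bisect_left(self.terms, prefix)  # first term >= prefix
        out = []
        while i < len(self.terms) and self.terms[i].startswith(prefix) and len(out) < limit:
            out.append(self.terms[i])
            i += 1
        return out

vocab = ["author", "authored", "article", "conference", "journal", "publication"]
s = AutoSuggest(vocab)
print(s.suggest("auth"))  # ['author', 'authored']
```

In a full system the suggestion list would be derived from the ontology's classes, properties, and instances rather than a flat word list, but the interaction pattern is the same: the user may accept a suggestion or keep typing freely.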

Keywords: auto-suggest, expressiveness, habitability, natural language interface, query interpretation, user guidance

Procedia PDF Downloads 474
727 Development of Variable Order Block Multistep Method for Solving Ordinary Differential Equations

Authors: Mohamed Suleiman, Zarina Bibi Ibrahim, Nor Ain Azeany, Khairil Iskandar Othman

Abstract:

In this paper, a class of variable order fully implicit multistep Block Backward Differentiation Formulas (VOBBDF) using a uniform step size is developed for the numerical solution of stiff ordinary differential equations (ODEs). The code combines three multistep block methods of orders four, five, and six. Order selection is based on an approximation of the local errors against a specified tolerance. The methods are constructed to produce two approximate solutions simultaneously at each iteration in order to further increase efficiency. The proposed VOBBDF is validated through numerical results on some standard problems from the literature, and comparisons are made with the single-order Block Backward Differentiation Formula (BBDF). Numerical results show the advantage of using VOBBDF for solving stiff ODEs.
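
The authors' VOBBDF code is not reproduced here, but the core idea, selecting the working order from a local error estimate obtained by comparing solutions of adjacent orders, can be sketched on the linear test equation y' = λy (the steps below are the standard BDF1/BDF2 formulas, not the authors' order-four-to-six blocks):

```python
def backward_euler_step(lam, y, h):
    """BDF1 (backward Euler) step for y' = lam * y."""
    return y / (1.0 - h * lam)

def bdf2_step(lam, y_prev, y, h):
    """BDF2 step for y' = lam * y."""
    return (4.0 * y - y_prev) / (3.0 - 2.0 * h * lam)

def solve_variable_order(lam, y0, h, n_steps, tol=1e-8):
    """After a start-up step, estimate the local error at each step as the
    difference between the order-1 and order-2 results, and keep the
    higher-order result whenever that estimate exceeds the tolerance."""
    ys = [y0, backward_euler_step(lam, y0, h)]  # start-up with BDF1
    for _ in range(n_steps - 1):
        y1 = backward_euler_step(lam, ys[-1], h)
        y2 = bdf2_step(lam, ys[-2], ys[-1], h)
        ys.append(y2 if abs(y2 - y1) > tol else y1)
    return ys

# y' = -y, y(0) = 1: the numerical solution should track exp(-t).
ys = solve_variable_order(-1.0, 1.0, 0.01, 100)
```

The selection rule here is deliberately simplified; the paper's method additionally produces two solution points per block and moves between orders four, five, and six.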

Keywords: block backward differentiation formulas, uniform step size, ordinary differential equations

Procedia PDF Downloads 447
726 Neural Networks and Genetic Algorithms Approach for Word Correction and Prediction

Authors: Rodrigo S. Fonseca, Antônio C. P. Veiga

Abstract:

Aiming at helping people with movement limitations that make typing and communication difficult, there is a need for an assistive tool with a learning environment that optimizes text input by identifying errors and offering corrections and choices in the Portuguese language. This work presents an orthographic and grammatical system that can be incorporated into writing environments, improving and facilitating the use of an alphanumeric keyboard. A prototype built around a genetic algorithm performs the correction, while prediction, which can draw on the number and position of the inserted letters and even their placement in the sentence, is handled by a Long Short-Term Memory (LSTM) neural network that preserves the sequence of ideas. The prototype optimizes data entry as a component of assistive technology for textual formulation, detecting errors, seeking solutions, and informing the user of accurate predictions quickly and effectively through machine learning.
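
The paper's genetic-algorithm corrector and LSTM predictor are not reproduced here; as a much simpler stand-in for the correction step (the Portuguese vocabulary below is illustrative), candidate corrections can be ranked by Levenshtein edit distance:

```python
def levenshtein(a, b):
    """Edit distance (insertions, deletions, substitutions) via dynamic
    programming, keeping only one row of the DP table at a time."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,            # deletion
                           cur[j - 1] + 1,         # insertion
                           prev[j - 1] + (ca != cb)))  # substitution
        prev = cur
    return prev[-1]

def correct(word, vocabulary):
    """Return the vocabulary word closest to the typed word."""
    return min(vocabulary, key=lambda w: levenshtein(word, w))

vocab = ["casa", "carro", "cadeira", "caneta"]  # illustrative PT vocabulary
print(correct("cassa", vocab))  # "casa"
```

A learning-based system would additionally weight candidates by keyboard adjacency and sentence context, which is what the genetic algorithm and the LSTM contribute in the paper's prototype.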

Keywords: genetic algorithm, neural networks, word prediction, machine learning

Procedia PDF Downloads 195
725 Framework for Socio-Technical Issues in Requirements Engineering for Developing Resilient Machine Vision Systems Using Levels of Automation through the Lifecycle

Authors: Ryan Messina, Mehedi Hasan

Abstract:

This research examines the impact of using data to generate performance requirements for automated visual inspection with machine vision. The focus is on design: how projects can smooth the transfer of tacit knowledge from manual inspection to an algorithm. We propose a framework for specifying machine vision systems that uses varying levels of automation as contingency planning to reduce data-processing complexity. Using data helps extract tacit knowledge from those who can perform the manual task so that they can assist in designing the system; this means real data from the system is always referenced, which minimizes errors between the participating parties. We propose three indicators of whether a project is at high risk of failing to meet requirements related to accuracy and reliability. All systems tested achieved better integration into operations after applying the framework.

Keywords: automation, contingency planning, continuous engineering, control theory, machine vision, system requirements, system thinking

Procedia PDF Downloads 209
724 Alternator Fault Detection Using Wigner-Ville Distribution

Authors: Amin Ranjbar, Amir Arsalan Jalili Zolfaghari, Amir Abolfazl Suratgar, Mehrdad Khajavi

Abstract:

This paper describes a two-stage, learning-based fault detection procedure for alternators. The procedure distinguishes three machine conditions: a shorted brush, a high-impedance relay, and a healthy alternator. The fault detection algorithm uses the Wigner-Ville distribution as a feature extractor together with an appropriate classifier. In this work, an artificial neural network (ANN) and a support vector machine (SVM) were compared to determine the more suitable classifier, evaluated by the mean squared error criterion. The modules work together to detect possible faulty conditions of operating machines. To test the method's performance, a signal database was prepared by inducing the different conditions on a laboratory setup; implementing the method on this database yielded satisfactory results.
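
As an illustration of the feature-extraction stage (a generic sketch, not the authors' code), a minimal discrete Wigner-Ville distribution can be computed from an analytic signal; for a linear chirp its energy ridge follows the instantaneous frequency:

```python
import numpy as np

def wigner_ville(x):
    """Discrete Wigner-Ville distribution W[n, k] of an analytic signal x:
    the FFT over the lag m of the instantaneous autocorrelation
    x[n + m] * conj(x[n - m]), zero-padded near the signal edges."""
    N = len(x)
    W = np.zeros((N, N))
    for n in range(N):
        m_max = min(n, N - 1 - n)
        r = np.zeros(N, dtype=complex)
        for m in range(-m_max, m_max + 1):
            r[m % N] = x[n + m] * np.conj(x[n - m])
        W[n] = np.fft.fft(r).real  # r is conjugate-symmetric, so W is real
    return W

# Analytic linear chirp: instantaneous frequency 20 + 60*t cycles per unit.
t = np.arange(256) / 256.0
x = np.exp(2j * np.pi * (20 * t + 30 * t ** 2))
W = wigner_ville(x)
ridge = W.argmax(axis=1)  # frequency bin of maximum energy at each time
```

Mid-signal, the ridge sits near bin 100, which is twice the instantaneous frequency expressed in bins; the factor of two is a well-known property of this lag-sampled discrete WVD. For fault detection, slices or moments of W would then be fed to the ANN or SVM classifier.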

Keywords: alternator, artificial neural network, support vector machine, time-frequency analysis, Wigner-Ville distribution

Procedia PDF Downloads 374
723 Filtering and Reconstruction System for Grey-Level Forensic Images

Authors: Ahd Aljarf, Saad Amin

Abstract:

Images are an important source of information used as evidence during any investigation process, and their clarity and accuracy are essential. Images are vulnerable to losing blocks and to added noise, introduced either during alteration or when the image was first captured; a high-performance image processing system and its implementation are therefore very important from a forensic point of view. This paper focuses on improving the quality of forensic images. For different reasons, the packets that store image data can be affected, harmed, or even lost because of noise; sending an image through a wireless channel, for example, can cause loss of bits. Such errors generally degrade the visual display quality of forensic images. Two image problems are covered: noise and block loss. Information transmitted through any communication channel may be altered from its original state or lose important data due to channel noise; therefore, a system is introduced to improve the quality and clarity of forensic images.
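
The paper does not specify which filters are used; one standard remedy for impulse noise in grey-level images, offered here as a minimal sketch, is the median filter:

```python
import numpy as np

def median_filter(img, k=3):
    """k x k median filter: replaces each pixel by the median of its
    neighborhood, a standard remedy for impulse (salt-and-pepper) noise."""
    pad = k // 2
    padded = np.pad(img, pad, mode="edge")  # replicate edges at the border
    out = np.empty_like(img)
    h, w = img.shape
    for i in range(h):
        for j in range(w):
            out[i, j] = np.median(padded[i:i + k, j:j + k])
    return out

# An impulse ('salt') pixel on a flat grey patch is removed entirely:
img = np.full((8, 8), 128, dtype=np.uint8)
img[3, 4] = 255
clean = median_filter(img)
print(clean[3, 4])  # 128
```

Block loss is the harder of the two problems, since a lost block must be reconstructed from surrounding pixels (e.g., by interpolation) rather than merely smoothed.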

Keywords: image filtering, image reconstruction, image processing, forensic images

Procedia PDF Downloads 367
722 A Method of Effective Planning and Control of Industrial Facility Energy Consumption

Authors: Aleksandra Aleksandrovna Filimonova, Lev Sergeevich Kazarinov, Tatyana Aleksandrovna Barbasova

Abstract:

A method for the effective planning and control of industrial facility energy consumption is offered. The method makes it possible to optimally arrange the management and full control of complex production facilities according to the criterion of minimal technical and economic losses under forecasting control. It is based on constructing the power efficiency characteristics to a prescribed accuracy. The problem of optimally designing the forecasting model is solved on the basis of three criteria: maximizing the weighted sum of forecasting points that achieve the prescribed accuracy; solving the problem by standard principles when the statistical data are incomplete, via minimization of a regularized function; and minimizing the technical and economic losses due to forecasting errors.
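
The "minimization of a regularized function" mentioned above has a familiar concrete instance in ridge regression; the sketch below (with invented, illustrative numbers, not the authors' data) fits a linear power efficiency characteristic that remains stable even when the statistics are sparse:

```python
import numpy as np

# Illustrative data: production output x vs. energy consumption y.
x = np.array([10.0, 20.0, 30.0, 40.0, 50.0])
y = np.array([105.0, 198.0, 305.0, 395.0, 510.0])

# Ridge regression: minimize ||X w - y||^2 + alpha * ||w||^2.
# The alpha * I term regularizes the normal equations, keeping the fit
# well-conditioned with few or incomplete observations.
X = np.column_stack([np.ones_like(x), x])  # intercept + slope
alpha = 1.0
w = np.linalg.solve(X.T @ X + alpha * np.eye(2), X.T @ y)

forecast = w[0] + w[1] * 60.0  # predicted consumption at output level 60
```

In the paper's setting such a fitted characteristic would then feed the loss-minimizing forecasting control, with the weighting of points chosen by the first criterion.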

Keywords: energy consumption, energy efficiency, energy management system, forecasting model, power efficiency characteristics

Procedia PDF Downloads 393
721 Poor Proficiency of English Language among Tertiary Level Students in Bangladesh and Its Effect on Employability: An Investigation to Find Facts and Solutions

Authors: Tanvir Ahmed, Nahian Fyrose Fahim, Subrata Majumder, Sarker Kibria

Abstract:

English is widely recognized as the standard second language of the world. Possessing English proficiency is key to communicating effectively at a global level, especially for developing countries, as it ensures people access to education, business, and technology worldwide. Bangladesh is a developing country of about 160 million people. A notable number of students in Bangladesh are currently pursuing higher education at the tertiary level in more than 150 public and private universities, where English is the dominant medium of instruction and lectures. However, many students who completed their primary and secondary education in the Bangla medium find themselves in an awkward position when they suddenly face many unfamiliar requirements on entering university as freshmen, struggling through at least 18 courses to acquire proficiency in English. After obtaining a tertiary education certificate, students should have the opportunity to secure a sustainable position in the job market; unfortunately, many fail because of poor English proficiency. Our study covers students at public and private universities (N=150) as well as education experts (N=30) in Bangladesh. We prepared two sets of questionnaires based on a literature review of the subject, collected data, identified the causes, and arrived at probable solutions to these problems. After statistical analysis, the study suggests remedial measures that could be taken to increase students' proficiency in English and to ensure their employability.

Keywords: tertiary education, English language proficiency, employability, unemployment problems

Procedia PDF Downloads 106
720 Strategic Tools for Entrepreneurship: Model Proposal for Manufacturing Companies

Authors: Chiara Mansanta, Daniela Sani

Abstract:

This paper presents the further development and application of a standard methodology to boost innovation, using real case studies of manufacturing companies. The proposed methodology provides a viable solution for manufacturing companies that have to evaluate new business ideas. The study examines the concept of entrepreneurship and how a manager can use it to promote innovation inside a company. Starting from a literature study on entrepreneurship, the paper analyzes the role of the manager in supporting a company's development. The empirical part of the study is based on two manufacturing companies that used the proposed methodology to foster entrepreneurship through an alternative approach. The research demonstrated the need for companies to have a structured and well-defined methodology to achieve their goals. The purpose of this article is to understand the significance of business models inside companies and to explore how they affect business strategy and innovation management. The idea is to use business models to support entrepreneurs in their decision-making processes, reducing risks and avoiding errors.

Keywords: entrepreneurship, manufacturing companies, solution validation, strategic management

Procedia PDF Downloads 96
719 Nationalization of the Social Life in Argentina: Accumulation of Capital, State Intervention, Labor Market, and System of Rights in the Last Decades

Authors: Mauro Cristeche

Abstract:

This work begins with a very simple question: how does the State spend? Argentina is witnessing a process of growing nationalization of social life, so explanations for the phenomenon must be sought in the specific dynamics of the capitalist mode of production in Argentina and its transformations over recent decades. The question then becomes: what has happened in Argentina that could explain this phenomenon? Since the seventies, capital accumulation in Argentina has faced deep competitive problems. Until that moment, agrarian wealth had worked as a compensation mechanism, but it began to reach its limits. In the meantime, important demographic and structural changes took place. The strategy of the capitalist class came to rest on the cheapening of the labor force as its main source of compensation for this weakness. As a result, a tendency toward worsening living conditions and fragmentation of the working class developed, manifested in unemployment, underemployment, and, most notably, the fall in the purchasing power of wages. As a consequence, it is suggested, the role of the State grew stronger and public expenditure increased as a historical trend, because the State has to intervene to confront the contradictions and constant growth problems posed by the development of capitalism in Argentina. On the one hand, the State has to guarantee the process of buying the cheapened workforce and, at the same time, the reproduction of the working class. On the other hand, it has to help reproduce individual capitals while needing to 'attack' them in different ways. This is why the State is said to act as the general political representative of the national portion of total social capital. What will be studied is the dynamic of the Argentine State's intervention in the context of the national process of capital accumulation over recent decades.
This paper aims to show the main general causes that could explain the phenomenon of nationalization of social life and how it has affected the living conditions of the working class and the system of rights.

Keywords: Argentina, nationalization, public policies, rights, state

Procedia PDF Downloads 137
718 Analysis of Financial Time Series by Using Ornstein-Uhlenbeck Type Models

Authors: Md Al Masum Bhuiyan, Maria C. Mariani, Osei K. Tweneboah

Abstract:

In the present work, we develop a technique for estimating the volatility of financial time series by using stochastic differential equations. Taking daily closing prices from developed and emerging stock markets as the basis, we argue that incorporating stochastic volatility into the time-varying parameter estimation significantly improves forecasting performance under Maximum Likelihood Estimation. Using the technique, we observe the long-memory behavior of the data sets and obtain one-step-ahead predicted log-volatility with ±2 standard errors, even though the observed noise varies according to a Normal mixture distribution, because the financial data studied are not fully Gaussian. The Ornstein-Uhlenbeck process followed in this work also simulates the financial time series well, and the good convergence properties of the estimation algorithm make it suitable for large data sets.
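
As an illustration of the modeling idea (the parameters and seed below are invented for the sketch, not estimated from market data), an Ornstein-Uhlenbeck process can be simulated with its exact discretization and its parameters recovered by maximum likelihood, since the exact scheme is an AR(1) recursion:

```python
import numpy as np

rng = np.random.default_rng(0)

# Ornstein-Uhlenbeck process dX = theta*(mu - X) dt + sigma dW,
# simulated with its exact discretization (an AR(1) recursion).
theta, mu, sigma, dt, n = 2.0, 0.1, 0.3, 1.0 / 252, 50_000
a = np.exp(-theta * dt)
sd = sigma * np.sqrt((1.0 - a ** 2) / (2.0 * theta))
x = np.empty(n)
x[0] = mu
for t in range(1, n):
    x[t] = mu + (x[t - 1] - mu) * a + sd * rng.standard_normal()

# Maximum likelihood: the exact scheme is x[t] = c + b*x[t-1] + eps with
# Gaussian eps, so MLE reduces to least squares; then map (c, b) back.
X, Y = x[:-1], x[1:]
b = ((X - X.mean()) * (Y - Y.mean())).mean() / X.var()
c = Y.mean() - b * X.mean()
theta_hat = -np.log(b) / dt   # recovered mean-reversion speed
mu_hat = c / (1.0 - b)        # recovered long-run mean
```

In a stochastic volatility setting, x would be the latent log-volatility rather than an observed series, and the likelihood would be evaluated through a filter; the AR(1) mapping above is the simplest fully observed case.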

Keywords: financial time series, maximum likelihood estimation, Ornstein-Uhlenbeck type models, stochastic volatility model

Procedia PDF Downloads 242
717 Pigging Operation in Two-Phase Flow Pipeline: Empirical and Simulation

Authors: Behnaz Jamshidi, Seyed Hassan Hashemabadi

Abstract:

The main objective of this study is to investigate the pigging operation of a two-phase flow pipeline and to compare empirical and simulation results for the 108 km long, 0.7934 m (32-inch) diameter sea line of Phase 1 of the South Pars Gas Complex, located in the south of Iran. The pigging time, pig velocity, accumulated slug volume, and slug catcher pressure were calculated and closely monitored as the key parameters. The simulation was performed with the OLGA dynamic simulation software, and the results were compared with and validated against empirical data from real operation. The relative errors between empirical data and simulation were 3% for pigging time and 9% for accumulated slug volume, respectively. The simulated pig velocity and the changes in slug catcher pressure were also consistent with the real values. It was further found that the slug catcher and condensate stabilization units had been adequately sized for gas-liquid separation and for handling the slug batch during transient conditions such as pigging and start-up.

Keywords: sea line, pigging, slug catcher, two-phase flow, dynamic simulation

Procedia PDF Downloads 509
716 An Experimental Study of the Parameters Affecting the Compression Index of Clay Soil

Authors: Rami Mahmoud Bakr

Abstract:

The constant rate of strain (CRS) test is a rapid technique that effectively measures specific properties of cohesive soil, including the rate of consolidation, hydraulic conductivity, compressibility, and stress history. Its simple operation and frequent readings allow the compression curve, in particular, to be defined efficiently. Its limitations, however, include an inability to handle strain-rate-dependent soil behavior, initial transient conditions, and pore pressure evaluation errors, and there are currently no fully effective techniques for interpreting CRS data. In this study, experiments were performed to evaluate the effects of different parameters on CRS results. Extensive tests were performed on two types of clay to analyze soil behavior during constant-rate-of-strain consolidation, and the results were used to evaluate the transient conditions and the pore pressure system.
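
For reference, the compression index studied is the slope of the virgin (normally consolidated) portion of the e-log σ′ compression curve; with illustrative numbers (not the study's measurements):

```python
import math

def compression_index(e1, sigma1, e2, sigma2):
    """Compression index Cc = delta_e / delta_log10(sigma') computed from
    two points on the virgin portion of the e - log(sigma') curve."""
    return (e1 - e2) / (math.log10(sigma2) - math.log10(sigma1))

# Illustrative values: void ratio drops from 1.10 to 0.95 as the effective
# stress increases from 100 kPa to 200 kPa.
Cc = compression_index(1.10, 100.0, 0.95, 200.0)
print(round(Cc, 3))  # 0.498
```

In a CRS test the (e, σ′) pairs come from the continuous readings, which is why the test's frequent sampling defines this slope so well.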

Keywords: constant rate of strain (CRS), resedimented Boston blue clay (RBBC), resedimented Vicksburg buckshot clay (RVBC), compression index

Procedia PDF Downloads 43
715 Model for Introducing Products to New Customers through Decision Tree Using Algorithm C4.5 (J-48)

Authors: Komol Phaisarn, Anuphan Suttimarn, Vitchanan Keawtong, Kittisak Thongyoun, Chaiyos Jamsawang

Abstract:

This article analyzes insurance data containing information on customer decisions when purchasing a life insurance pay package. The data were analyzed in order to present new customers with the Life Insurance Perfect Pay package that meets their needs as closely as possible. Basic data on the insurance pay packages were collected for data mining, thus reducing the scattering of information. The data were then classified to obtain a decision model, a decision tree, using algorithm C4.5 (J-48). For the classification, the WEKA toolkit was used to build the model, and testing datasets were used to validate the decision tree. Validation showed that the model classified 68.43% of instances accurately, while 31.25% were errors. The same data set was then tested with other models, i.e., Naive Bayes and ZeroR, and the results showed that the J-48 method predicted more accurately. The researcher therefore applied the decision tree in a program used to introduce the product to new customers, supporting their decision to purchase the insurance package that best meets their needs.
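
WEKA's J-48 is an implementation of C4.5, whose split criterion is information gain (normalized to gain ratio in the full algorithm); the core computation, shown on a toy data set invented for illustration, looks like:

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a list of class labels, in bits."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def info_gain(rows, labels, attr):
    """Information gain of splitting on attribute index `attr`: entropy of
    the whole set minus the weighted entropies of the value subsets."""
    gain = entropy(labels)
    n = len(rows)
    for v in set(r[attr] for r in rows):
        subset = [l for r, l in zip(rows, labels) if r[attr] == v]
        gain -= len(subset) / n * entropy(subset)
    return gain

# Toy insurance data: (age_group, has_children) -> buys the package?
rows = [("young", "no"), ("young", "yes"), ("old", "no"), ("old", "yes")]
labels = ["no", "yes", "no", "yes"]
print(info_gain(rows, labels, 1))  # 1.0 - 'has_children' splits perfectly
print(info_gain(rows, labels, 0))  # 0.0 - 'age_group' is uninformative
```

C4.5 recursively picks the attribute with the best (gain-ratio-normalized) score at each node, which is how J-48 arrives at the decision tree used to classify the insurance customers.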

Keywords: decision tree, data mining, customers, life insurance pay package

Procedia PDF Downloads 429