Search results for: mining technique
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 7501

6931 Evaluation of Hard Rocks Destruction Effectiveness at Drilling

Authors: Ekaterina Leusheva, Valentin Morenov

Abstract:

Well drilling in hard rocks involves high energy demand, which slows the process and thus reduces overall effectiveness. The aim of this project is to develop an experimental research technique that allows the optimal washing fluid composition to be selected when special hardness-reducing detergent reagents are added. Based on analysis of existing references and on conducted experiments, a technique for quantitative evaluation of the weakening influence of washing fluid on drilled rocks was developed; it involves laboratory determination of three mud properties (density, surface tension, specific electrical resistance) and three rock properties (ultimate stress, dynamic strength, micro-hardness). The developed technique can be used in well drilling technologies, particularly when creating new drilling mud compositions for increased destruction effectiveness of hard rocks. The technique introduces a coefficient of hard-rock destruction effectiveness that allows the influence of different drilling muds on the drilling process to be evaluated quantitatively. Correct choice of a drilling mud composition with hardness-reducing detergent reagents will increase the penetration rate and drill meterage per bit.
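The abstract does not give the exact formula for the effectiveness coefficient; the sketch below is a hypothetical illustration of how such a coefficient could be derived from the three measured rock properties. The function name, property keys, and equal weighting are assumptions, not the authors' method.

```python
def destruction_effectiveness(baseline: dict, treated: dict) -> float:
    """Hypothetical coefficient: mean relative reduction of the three rock
    strength properties after exposure to a detergent-bearing washing fluid.
    Property keys are illustrative, not the authors' notation."""
    props = ["ultimate_stress", "dynamic_strength", "micro_hardness"]
    reductions = [(baseline[p] - treated[p]) / baseline[p] for p in props]
    return sum(reductions) / len(reductions)

# Rock measured before/after exposure to a candidate washing fluid (made-up values)
baseline = {"ultimate_stress": 180.0, "dynamic_strength": 9.5, "micro_hardness": 4200.0}
treated = {"ultimate_stress": 153.0, "dynamic_strength": 8.0, "micro_hardness": 3570.0}
k = destruction_effectiveness(baseline, treated)  # higher k => stronger weakening effect
```

Comparing k across candidate mud compositions would then rank them by weakening effect, in the spirit of the coefficient described above.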

Keywords: detergent reagents, drilling mud, drilling process stimulation, hard rocks

Procedia PDF Downloads 541
6930 Operative Technique of Glenoid Anteversion Osteotomy and Soft Tissue Rebalancing for Brachial Plexus Birth Palsy

Authors: Michael Zaidman, Naum Simanovsky

Abstract:

Most brachial plexus birth palsies are transient. Children with incomplete recovery almost always develop an internal rotation and adduction contracture. The muscle imbalance around the shoulder results in glenohumeral joint deformity and functional limitations. The natural history of glenohumeral deformity is progression with worsening function. Anteversion glenoid osteotomy with latissimus dorsi and teres major tendon transfers can be an alternative to proximal humeral external rotation osteotomy for patients with severe glenohumeral dysplasia secondary to brachial plexus birth palsy. We discuss pre-operative planning and the step-by-step operative technique of the procedure using a clinical example.

Keywords: obstetric brachial plexus palsy, glenoid anteversion osteotomy, tendon transfer, operative technique

Procedia PDF Downloads 66
6929 Comparative Study of Active Release Technique and Myofascial Release Technique in Patients with Upper Trapezius Spasm

Authors: Harihara Prakash Ramanathan, Daksha Mishra, Ankita Dhaduk

Abstract:

Relevance: This study will educate the clinician in putting into practice advanced methods of movement science in restoring function. Purpose: The purpose of this study is to compare the effectiveness of the Active Release Technique and the Myofascial Release Technique on range of motion, neck function and pain in patients with upper trapezius spasm. Methods/Analysis: The study was approved by the institutional Human Research and Ethics Committee. It included sixty patients aged between 20 and 55 years with upper trapezius spasm. Patients were randomly divided into two groups receiving the Active Release Technique (Group A) or the Myofascial Release Technique (Group B). The patients were treated for one week, and three outcome measures, range of motion (ROM), pain and functional level, were measured using a goniometer, the Visual Analog Scale (VAS) and the Neck Disability Index (NDI) questionnaire, respectively. A paired-sample 't' test was used to compare the pre- and post-intervention values of cervical ROM, NDI and VAS within Group A and Group B. An independent 't' test was used to compare the two groups in terms of improvement in cervical ROM, decrease in VAS and decrease in NDI score. Results: Both groups showed statistically significant improvements in cervical ROM and reductions in pain and NDI scores. However, the mean changes in cervical flexion, extension, right and left side flexion, right and left rotation, pain and neck disability level showed statistically significantly greater improvement (p < 0.05) in the patients who received the Active Release Technique compared with the Myofascial Release Technique.
Discussion and conclusions: In the present study, the average improvement immediately post-intervention was significantly greater than before treatment, and there was even more improvement after seven sessions than after a single session. This indicates that several sessions of manual techniques are necessary to produce clinically relevant results. The Active Release Technique helps reduce pain by removing adhesions and promoting normal tissue extensibility. The act of tensioning and compressing the affected tissue, both with digital contact and through active movement performed by the patient, is a plausible mechanism for tissue healing in this study. The study concluded that both the Active Release Technique (ART) and the Myofascial Release Technique (MFR) are effective in managing upper trapezius muscle spasm, but greater improvement can be achieved with ART. Impact and Implications: The Active Release Technique can be adopted as a mainstay treatment approach for trapezius spasm for faster relief and improved functional status.
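The statistical procedure described above (paired t-tests within each group, an independent t-test on the between-group improvement) can be sketched with SciPy. The scores below are synthetic stand-ins, not the study's actual VAS measurements.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Synthetic pre/post VAS pain scores for two groups of 30 patients each
pre_a = rng.normal(7.0, 1.0, 30)
post_a = pre_a - rng.normal(3.0, 0.5, 30)   # Group A: larger improvement
pre_b = rng.normal(7.0, 1.0, 30)
post_b = pre_b - rng.normal(2.0, 0.5, 30)   # Group B: smaller improvement

# Paired t-test: pre vs post within each group
t_a, p_a = stats.ttest_rel(pre_a, post_a)
t_b, p_b = stats.ttest_rel(pre_b, post_b)

# Independent t-test: compare the improvement (pre minus post) between groups
change_a, change_b = pre_a - post_a, pre_b - post_b
t_ab, p_ab = stats.ttest_ind(change_a, change_b)
```

With real data, the same three outcome measures (ROM, VAS, NDI) would each be run through this pair of tests.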

Keywords: trapezius spasm, myofascial release, active release technique, pain

Procedia PDF Downloads 269
6928 Implementation of Tissue Engineering Technique to Nursing of Unhealed Diabetic Foot Lesion

Authors: Basuki Supartono

Abstract:

Introduction: Diabetic wounds risk limb amputation, and healing remains challenging. Chronic hyperglycemia causes an insufficient inflammatory response and impairs the cells' ability to regenerate, so a tissue engineering technique is required. Methods: Tissue engineering (TE)-based therapy utilizing mononuclear cells, platelet-rich plasma, and collagen was applied to the damaged tissue. Results: The TE technique produced acceptable outcomes. The wound healed completely in two months, with no adverse effects, no allergic reaction, and no morbidity or mortality. Discussion: TE-based therapy utilizing mononuclear cells, platelet-rich plasma, and collagen is a safe and convenient way to repair damaged tissues. These components stop the chronic inflammatory process and increase the cells' ability to regenerate and restore damaged tissue, allowing the wound to heal. Conclusion: TE-based therapy is safe and effectively treats unhealed diabetic lesions.

Keywords: diabetic foot lesion, tissue engineering technique, wound healing, stem cells

Procedia PDF Downloads 75
6927 Development of a Double Coating Technique for Recycled Concrete Aggregates Used in Hot-mix Asphalt

Authors: Abbaas I. Kareem, H. Nikraz

Abstract:

The use of recycled concrete aggregates (RCAs) in hot-mix asphalt (HMA) production could ease natural aggregate shortages and maintain sustainability in modern societies. However, the attached cement mortar and other impurities make RCAs behave differently from high-quality aggregates. Therefore, different upgrading treatments have been suggested to enhance RCA properties before use in HMA production; disappointingly, some of these treatments degrade other RCA properties. To avoid such degradation, a coating technique was developed. This technique combines two main treatments and is therefore named the double coating technique (DCT). Dosages of 0%, 20%, 40% and 60% of uncoated RCA, RCA coated with cement slag paste (CSP), and double-coated recycled concrete aggregates (DCRCAs) in place of granite aggregates were evaluated. The results indicated that the DCT improves strength and reduces water absorption of the DCRCAs compared with uncoated RCAs and RCAs coated with CSP. In addition, the DCRCA asphalt mixtures exhibit stability values higher than those obtained for mixes made with granite aggregates, uncoated RCAs and RCAs coated with CSP. The DCRCA asphalt mixtures also require less bitumen to achieve the optimum bitumen content (OBC) than those manufactured with uncoated RCA and RCA coated with CSP. Although the results are encouraging, more testing is required to examine the effect of the DCT on performance properties of DCRCA asphalt mixtures such as rutting and fatigue.

Keywords: aggregate crushing value, double coating technique, hot mix asphalt, Marshall parameters, recycled concrete aggregates

Procedia PDF Downloads 283
6926 Sampling and Chemical Characterization of Particulate Matter in a Platinum Mine

Authors: Juergen Orasche, Vesta Kohlmeier, George C. Dragan, Gert Jakobi, Patricia Forbes, Ralf Zimmermann

Abstract:

Underground mining poses a difficult environment for both people and machines. At more than 1000 meters beneath the surface, ores and other mineral resources are still gained by conventional and motorised mining. Adding to the hazards caused by blasting and stone-chipping, the working conditions are characterised by high temperatures of 35-40°C and high humidity at low air exchange rates. Separate ventilation shafts lead fresh air into a mine while others lead spent air back to the surface; this is essential for humans and machines working deep underground. Nevertheless, mines are widely ramified, so the air flow rate at the far end of a tunnel can be close to zero. In recent years, conventional mining has been supplemented by mining with heavy diesel machines. These very flat machines, called Load Haul Dump (LHD) vehicles, accelerate and ease work in areas favourable for heavy machines. On the other hand, they emit unfiltered diesel exhaust, which constitutes an occupational hazard for the miners. Combined with low air exchange, high humidity and inorganic dust from the mining, this leads to 'black smog' underneath the earth. This work focuses on air quality in mines employing LHDs. We therefore performed personal sampling (samplers worn by miners during their work), stationary sampling and aethalometer (MicroAeth MA200, AethLabs) measurements in a platinum mine around 1000 meters below the earth's surface. We compared areas of high diesel exhaust emission with areas of conventional mining where no diesel machines were operated. For a better assessment of the health risks caused by air pollution, we applied a separated gas/particle sampling system whose first denuder section collects intermediate volatility organic compounds (IVOCs). These multi-channel silicone rubber denuders are able to trap IVOCs while transmitting particles ranging from 10 nm to 1 µm in diameter with an efficiency of nearly 100%.
The second section is a quartz fibre filter collecting particles and adsorbed semi-volatile organic compounds (SVOCs). The third part is a graphitized carbon black adsorber collecting the SVOCs that evaporate from the filter. The compounds collected on these three sections were analyzed in our labs with different thermal desorption techniques coupled with gas chromatography and mass spectrometry (GC-MS). VOCs and IVOCs were measured with a thermal desorption unit (TD20, Shimadzu, Japan) coupled to a GCMS-QP2010 Ultra system with a quadrupole mass spectrometer (Shimadzu). The GC was equipped with a 30 m BP-20 wax column (0.25 mm ID, 0.25 µm film) from SGE (Australia). Filters were analyzed with in-situ derivatization thermal desorption gas chromatography time-of-flight mass spectrometry (IDTD-GC-TOF-MS). The IDTD unit is a modified GL Sciences Optic 3 system (GL Sciences, Netherlands). Black carbon concentrations measured with the portable aethalometers reached up to several mg per m³. The organic chemistry was dominated by very high concentrations of alkanes. Typical diesel engine exhaust markers such as alkylated polycyclic aromatic hydrocarbons were detected, as were typical lubrication oil markers such as hopanes.

Keywords: diesel emission, personal sampling, aethalometer, mining

Procedia PDF Downloads 152
6925 A General Framework for Measuring the Internal Fraud Risk of an Enterprise Resource Planning System

Authors: Imran Dayan, Ashiqul Khan

Abstract:

Internal corporate fraud, which is fraud carried out by internal stakeholders of a company, affects the well-being of the organisation just like its external counterpart. Even if such an act is carried out for the short-term benefit of a corporation, the act is ultimately harmful to the entity in the long run. Internal fraud often relies on aberrations from usual business processes. Business processes are the lifeblood of a company in the modern managerial context; they are developed and fine-tuned over time as a corporation grows through its life stages. Modern corporations have embraced technological innovations in their business processes, and Enterprise Resource Planning (ERP) systems, sitting at the heart of such processes, are a testimony to that. Since ERP systems record a huge amount of data in their event logs, the logs are a treasure trove for anyone trying to detect fraudulent activities hidden within day-to-day business operations and processes. This research utilises the ERP systems in place within corporations to assess the likelihood of prospective internal fraud, developing a framework for measuring fraud risk through process mining techniques and thereby finding risky designs and loose ends within these business processes. The framework not only helps identify existing cases of fraud in the event log, but also signals the overall riskiness of certain business processes, drawing attention to redesigning such processes to reduce the chance of future internal fraud while improving internal control within the organisation.
The research adds value by applying the concepts of process mining to the analysis of data from modern business process records, namely ERP event logs, and develops a framework that should be useful to internal stakeholders for strengthening internal control, as well as providing external auditors with a tool to use in cases of suspicion. The research demonstrates its usefulness through case studies conducted on large corporations with complex business processes and an ERP in place.
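A minimal sketch of the core idea of mining ERP event logs for risky deviations: group case traces into variants and flag rare variants for fraud review. This is a simplification for illustration, not the authors' framework; the log, activity names, and threshold are invented.

```python
from collections import Counter

# Toy ERP event log: one activity sequence (trace) per purchase order
log = {
    "PO1": ["create", "approve", "receive", "pay"],
    "PO2": ["create", "approve", "receive", "pay"],
    "PO3": ["create", "approve", "receive", "pay"],
    "PO4": ["create", "pay"],                      # payment without approval/receipt
    "PO5": ["create", "approve", "receive", "pay"],
}

def flag_rare_variants(log, threshold=0.25):
    """Flag cases whose trace variant occurs in fewer than `threshold`
    of all cases -- a crude internal-control red flag."""
    variants = Counter(tuple(t) for t in log.values())
    n = len(log)
    return [case for case, t in log.items()
            if variants[tuple(t)] / n < threshold]

suspicious = flag_rare_variants(log)  # -> ["PO4"]
```

In a real setting the log would come from the ERP database and the rare-variant flag would be one signal among several (conformance checking, segregation-of-duties rules, etc.).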

Keywords: enterprise resource planning, fraud risk framework, internal corporate fraud, process mining

Procedia PDF Downloads 327
6924 Kinoform Optimisation Using Gerchberg-Saxton Iterative Algorithm

Authors: M. Al-Shamery, R. Young, P. Birch, C. Chatwin

Abstract:

Computer Generated Holography (CGH) is employed to create digitally defined coherent wavefronts. A CGH can be created using different techniques, such as the detour-phase technique or direct phase modulation to create a kinoform. The detour-phase technique was one of the first techniques used to generate holograms digitally. Its disadvantage is that the reconstructed image often has poor quality due to the limited dynamic range it is possible to record using a medium with reasonable spatial resolution. The kinoform (phase-only hologram) is an alternative technique. In this method, the phase of the original wavefront is recorded but the amplitude is constrained to be constant. The original object does not need to exist physically, so the kinoform can be used to reconstruct an almost arbitrary wavefront. However, the image reconstructed by this technique contains high levels of noise and is not identical to the reference image. To improve the reconstruction quality of the kinoform, iterative techniques such as the Gerchberg-Saxton (GS) algorithm are employed. In this paper, the GS algorithm is described for the optimisation of a kinoform used for the reconstruction of a complex wavefront. Iterations of the GS algorithm are applied to determine the phase at a plane (with known amplitude distribution, often taken as uniform) that satisfies given phase and amplitude constraints in a corresponding Fourier plane. The GS algorithm can be used in this way to enhance the reconstruction quality of the kinoform. Different images are employed as the reference object and their kinoforms are synthesised using the GS algorithm. The quality of the reconstructed images is quantified to demonstrate the enhanced reconstruction quality achieved by this method.
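The GS iteration described above can be sketched compactly in NumPy: the amplitude is constrained to be uniform in the hologram plane and to the target image in the Fourier plane, with the phase as the only free variable. The target image and iteration count are illustrative.

```python
import numpy as np

def gerchberg_saxton(target_amplitude, iterations=50, seed=0):
    """Return a phase-only hologram (kinoform) whose Fourier transform
    approximates the target amplitude distribution."""
    rng = np.random.default_rng(seed)
    phase = rng.uniform(0, 2 * np.pi, target_amplitude.shape)
    for _ in range(iterations):
        # Hologram plane: uniform amplitude, current phase estimate
        field = np.exp(1j * phase)
        # Propagate to the image plane and impose the target amplitude
        image = np.fft.fft2(field)
        image = target_amplitude * np.exp(1j * np.angle(image))
        # Back-propagate and keep only the phase (kinoform constraint)
        phase = np.angle(np.fft.ifft2(image))
    return phase

# Example: a simple square as the reference object
target = np.zeros((64, 64))
target[24:40, 24:40] = 1.0
kinoform = gerchberg_saxton(target)
recon = np.abs(np.fft.fft2(np.exp(1j * kinoform)))
```

Reconstruction quality can then be quantified, e.g. by correlating `recon` with `target`, mirroring the evaluation described in the abstract.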

Keywords: computer generated holography, digital holography, Gerchberg-Saxton algorithm, kinoform

Procedia PDF Downloads 528
6923 Characterization of Tin Sulfide Thin Films Prepared by Ultrasonic Spray

Authors: A. Attaf A., I. Bouhaf Kharkhachi

Abstract:

Tin disulfide (SnS2) thin films deposited by the ultrasonic spray technique have seen wide application due to their adequate physicochemical properties for microelectronic applications and especially for solar cells. SnS2 films were deposited by the ultrasonic spray technique on pretreated glass substrates under well-determined conditions. The effect of precursor concentration on the structural, morphological and optical properties of the SnS2 thin films was investigated by SEM, XRD and UV-visible spectroscopy. SEM characterization shows that the morphology of the films is uniform, compact and granular. The X-ray diffraction study finds the best growth crystallinity in the hexagonal structure with the preferential (001) plane. The UV-visible spectroscopy results show that films deposited at 0.1 mol/l have a transmittance greater than 25% in the visible region. The band gap energy is 2.54 eV for a molarity of 0.1 mol/l.

Keywords: SEM, tin disulfide, thin films, ultrasonic spray, X-ray diffraction, UV-visible spectroscopy

Procedia PDF Downloads 602
6922 Artificial Neural Networks with Decision Trees for Diagnosis Issues

Authors: Y. Kourd, D. Lefebvre, N. Guersi

Abstract:

This paper presents a new fault detection and isolation (FDI) technique applied to industrial systems. The technique is based on neural network fault-free and faulty behaviour models (NNFMs). NNFMs are used for residual generation, while a decision tree architecture is used for residual evaluation. The decision tree is built with data collected from the NNFM outputs and is used to isolate detectable faults depending on computed thresholds. Each part of the tree corresponds to a specific residual. With the decision tree, it becomes possible to take the appropriate decision regarding the actual process behaviour by evaluating only a small number of residuals. In comparison to the usual systematic evaluation of all residuals, the proposed technique requires less computational effort and can be used for online diagnosis. An application example is presented to illustrate and confirm the effectiveness and accuracy of the proposed approach.
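A hedged sketch of the residual-evaluation stage: residuals (here synthetic, standing in for NNFM outputs) are fed to a decision tree that isolates the fault class. The fault names, residual dimensions, and separations are illustrative assumptions.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(1)
n = 300
# Synthetic residual vectors (stand-ins for NNFM outputs): small noise when healthy
r = rng.normal(0, 0.05, size=(n, 3))
labels = np.array(["healthy"] * n, dtype=object)
# Fault 1 drives residual 0 high; fault 2 drives residual 2 high
labels[:100] = "fault_1"
r[:100, 0] += 1.0
labels[100:200] = "fault_2"
r[100:200, 2] += 1.0

# The tree learns thresholds on residuals and isolates faults from few of them
tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(r, labels)
diagnosis = tree.predict([[1.1, 0.0, 0.02]])   # expected: fault_1
```

The learned tree evaluates only the residuals along one root-to-leaf path, which is the computational saving the abstract points to.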

Keywords: neural networks, decision trees, diagnosis, behaviors

Procedia PDF Downloads 497
6921 Framework for Integrating Big Data and Thick Data: Understanding Customers Better

Authors: Nikita Valluri, Vatcharaporn Esichaikul

Abstract:

With the popularity of data-driven decision making on the rise, this study provides an alternative outlook on the process of decision-making. Combining quantitative and qualitative methods rooted in the social sciences, an integrated framework is presented with a focus on delivering a more robust and efficient approach to data-driven decision-making with respect to not only big data but also 'thick data', a new form of qualitative data. In support of this, an example from the retail sector is illustrated in which the framework is put into action to yield insights and leverage business intelligence. An interpretive approach is used to analyze and glean insights from both the quantitative and the qualitative data. Using traditional point-of-sale data as well as an understanding of customer psychographics and preferences, data mining techniques are applied alongside qualitative methods (such as grounded theory and ethnomethodology). The study's final goal is to establish the framework as a basis for a holistic solution encompassing both the big and thick aspects of any business need. The proposed framework enhances the traditional data-driven decision-making approach, which depends mainly on quantitative data.

Keywords: big data, customer behavior, customer experience, data mining, qualitative methods, quantitative methods, thick data

Procedia PDF Downloads 158
6920 Exploring Gaming-Learning Interaction in MMOG Using Data Mining Methods

Authors: Meng-Tzu Cheng, Louisa Rosenheck, Chen-Yen Lin, Eric Klopfer

Abstract:

The purpose of this research is to explore some of the ways in which gameplay data can be analyzed to yield results that feed back into the learning ecosystem. Back-end data for all users of an MMOG, The Radix Endeavor, were collected, and this study reports analyses of a specific genetics quest using data mining techniques, including the decision tree method. The study revealed different reasons for quest failure between participants who eventually succeeded and those who never succeeded. Regarding in-game tool use, the trait examiner was a key tool in the quest completion process. The decision tree results then showed that a lack of trait examiner usage can be made up for with additional Punnett square uses, displaying multiple pathways to success in this quest. The methods of analysis used in this study and the resulting usage patterns indicate some useful ways that gameplay data can provide insights in two main areas. The first is for game designers, to know how players are interacting with and learning from their game. The second is for players themselves as well as their teachers, to get information on how they are progressing through the game and to provide help they may need based on strategies and misconceptions identified in the data.
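The multiple-pathways finding (low trait examiner use compensated by extra Punnett square use) can be illustrated with a small decision tree. The feature names follow the abstract; the gameplay counts below are invented for illustration, not Radix data.

```python
from sklearn.tree import DecisionTreeClassifier, export_text

# Features: [trait_examiner_uses, punnett_square_uses]; label: 1 = quest success
X = [[3, 0], [4, 1], [2, 1], [0, 5], [0, 6], [1, 4],   # successes
     [0, 0], [0, 1], [1, 0], [0, 2], [1, 1], [0, 0]]   # failures
y = [1, 1, 1, 1, 1, 1, 0, 0, 0, 0, 0, 0]

tree = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X, y)
rules = export_text(tree, feature_names=["trait_examiner", "punnett_square"])
print(rules)  # two branches to success: enough examiner uses OR enough Punnett uses
```

The printed rules make the two success pathways explicit, which is the kind of interpretable feedback to designers and teachers the abstract describes.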

Keywords: MMOG, decision tree, genetics, gaming-learning interaction

Procedia PDF Downloads 353
6919 Optimal Sensing Technique for Estimating Stress Distribution of 2-D Steel Frame Structure Using Genetic Algorithm

Authors: Jun Su Park, Byung Kwan Oh, Jin Woo Hwang, Yousok Kim, Hyo Seon Park

Abstract:

For structural safety, the maximum stress calculated from the stress distribution of a structure is widely used. The stress distribution can be estimated from the deformed shape of the structure obtained from measurement. Although stress estimation is strongly affected by the location and number of sensing points, most studies have conducted stress estimation without a reasoned sensing plan covering the location and number of sensors. In this paper, an optimal sensing technique for estimating the stress distribution is proposed. The technique selects the optimal location and number of sensing points for a 2-D frame structure by minimizing, with a genetic algorithm, the error in the stress distribution between the analytical model and the estimate obtained by cubic smoothing splines. To verify the proposed method, the optimal sensor measurement technique is applied in simulation tests on a 2-D steel frame structure under various loading scenarios. Through these tests, the optimal sensing plan for the structure is suggested and verified.
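A simplified sketch of the optimisation loop: a genetic algorithm selects k interior sensing points on a 1-D deformed shape so that a cubic-spline reconstruction from those points matches the analytical shape. This reduces the paper's 2-D frame and smoothing splines to a 1-D interpolating spline; the GA operators and parameters are assumptions.

```python
import numpy as np
from scipy.interpolate import CubicSpline

rng = np.random.default_rng(0)
x = np.linspace(0, 1, 101)
true_shape = np.sin(np.pi * x) * 0.01          # toy analytical deformed shape

def reconstruction_error(sensor_idx):
    """MSE between the spline through the sensed points and the true shape."""
    idx = np.unique(np.concatenate(([0], np.sort(sensor_idx), [100])))
    spline = CubicSpline(x[idx], true_shape[idx])
    return float(np.mean((spline(x) - true_shape) ** 2))

def ga_select(k=4, pop_size=30, generations=40):
    """Tiny GA: elitist selection plus single-point mutation."""
    pop = [rng.choice(np.arange(1, 100), size=k, replace=False)
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=reconstruction_error)
        parents = pop[: pop_size // 2]           # keep the fittest half
        children = []
        for p in parents:
            child = p.copy()
            child[rng.integers(k)] = rng.integers(1, 100)   # mutate one sensor
            children.append(child if len(np.unique(child)) == k else p)
        pop = parents + children
    return min(pop, key=reconstruction_error)

best = ga_select()   # indices of the k best sensing points found
```

In the paper's setting the fitness would be evaluated on the 2-D frame's stress field rather than a toy sine shape, but the selection loop is the same idea.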

Keywords: genetic algorithm, optimal sensing, optimizing sensor placements, steel frame structure

Procedia PDF Downloads 529
6918 Exploring the Correlation between Population Distribution and Urban Heat Island under Urban Data: Taking Shenzhen Urban Heat Island as an Example

Authors: Wang Yang

Abstract:

Shenzhen is a modern city shaped by China's reform and opening-up policy, and its urban morphology has developed under the administration of the Chinese government. The city's planning paradigm is primarily affected by its spatial structure and human behavior: the urban agglomeration is divided into several groups and centers, and under this paradigm general laws of city development are easily neglected. With the continuous development of the internet, big data technology has been introduced in China, and data mining and data analysis have become important tools in municipal research. Data mining has been used to improve data cleaning for business data, traffic data and population data. Before data mining became available, government data were collected by traditional means and then analyzed in city-relationship research, delaying the timeliness of urban studies; internet-based data, by contrast, update very quickly. The city's points of interest (POIs) obtained through data mining serve as one data source affecting city design, while satellite remote sensing is used as a reference object; analyzing the city in both directions breaks the administrative paradigm of government and restores urban research. Therefore, the use of data mining in urban analysis is very important. Satellite remote sensing data for Shenzhen in July 2018, acquired by the MODIS sensor, were used to perform land surface temperature inversion and analyze the distribution of the Shenzhen urban heat island. This article acquired and classified data from Shenzhen using web crawler technology. Shenzhen heat island data and points of interest were simulated and analyzed on a GIS platform to discover the main features of their functionally equivalent distribution.
The city extends east-west, and its main streets follow the direction of city development; accordingly, the functional areas of the city are also distributed in the east-west direction. The urban heat island can be expressed as a heat map over the functional urban areas, to which regional POIs correspond. The research results clearly show that the distribution of the urban heat island and the distribution of urban POIs are in one-to-one correspondence. The urban heat island is primarily influenced by the properties of the underlying surface, setting aside the impact of urban climate. Using urban POIs as the object of analysis, the distribution of municipal POIs and population aggregation are closely connected, so the distribution of the population corresponds with the distribution of the urban heat island.
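The claimed correspondence between POI density and surface temperature can be checked with a simple grid correlation. The grids below are synthetic stand-ins for the MODIS land surface temperature inversion and the crawled POI counts; the coefficients are invented.

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic 20x20 grids over the city: POI counts and land surface temperature (°C)
poi_density = rng.poisson(lam=5, size=(20, 20)).astype(float)
# Assume LST rises with POI density (built-up, active areas) plus noise
lst = 24.0 + 0.4 * poi_density + rng.normal(0, 0.5, size=(20, 20))

# Pearson correlation between the two flattened grids
r = np.corrcoef(poi_density.ravel(), lst.ravel())[0, 1]
```

With real data, `poi_density` would come from gridded crawler output and `lst` from the temperature inversion, and a high r would support the one-to-one correspondence claim.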

Keywords: POI, satellite remote sensing, the population distribution, urban heat island thermal map

Procedia PDF Downloads 101
6917 Using Hidden Markov Chain for Improving the Dependability of Safety-Critical Wireless Sensor Networks

Authors: Issam Alnader, Aboubaker Lasebae, Rand Raheem

Abstract:

Wireless sensor networks (WSNs) are distributed network systems used in a wide range of applications, including safety-critical systems. The latter provide critical services, often concerned with human life or assets; therefore, ensuring the dependability requirements of safety-critical systems is of paramount importance. The purpose of this paper is to utilize the Hidden Markov Model (HMM) to extend the service availability of WSNs by increasing the time it takes a node to become obsolete, via optimal load balancing. We propose an HMM algorithm that, given a WSN, analyses and predicts undesirable situations, notably nodes dying unexpectedly or prematurely. We apply this technique to improve on C. Liu's algorithm, a scheduling-based algorithm which has served to improve the lifetime of WSNs. Our experiments show that our HMM technique improves the lifetime of the network by detecting nodes that die early and rebalancing their load. Our technique can also be used for diagnosis and to provide maintenance warnings to WSN system administrators. Finally, our technique can be used to improve algorithms other than C. Liu's.
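A minimal sketch of how an HMM can flag a node heading toward failure: a two-state model (healthy vs failing) decoded with the Viterbi algorithm from battery-drain observations. The transition and emission probabilities are invented, not the paper's parameters.

```python
import numpy as np

# Hidden states: 0 = healthy, 1 = failing; observations: 0 = normal drain, 1 = high drain
A = np.array([[0.95, 0.05],    # transition probabilities (assumed)
              [0.10, 0.90]])
B = np.array([[0.9, 0.1],      # emission probabilities (assumed)
              [0.2, 0.8]])
pi = np.array([0.8, 0.2])      # initial state distribution (assumed)

def viterbi(obs):
    """Most likely hidden state sequence for a node's observation stream."""
    delta = np.log(pi) + np.log(B[:, obs[0]])
    back = []
    for o in obs[1:]:
        trans = delta[:, None] + np.log(A)      # trans[i, j]: come from i into j
        back.append(trans.argmax(axis=0))
        delta = trans.max(axis=0) + np.log(B[:, o])
    path = [int(delta.argmax())]
    for ptr in reversed(back):
        path.append(int(ptr[path[-1]]))
    return path[::-1]

states = viterbi([0, 0, 1, 1, 1])  # sustained high drain -> node decoded as failing
```

A node decoded into the failing state would trigger load rebalancing or a maintenance warning, as described above.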

Keywords: wireless sensor networks, IoT, dependability of safety WSNs, energy conservation, sleep awake schedule

Procedia PDF Downloads 98
6916 Educational Data Mining: The Case of the Department of Mathematics and Computing in the Period 2009-2018

Authors: Mário Ernesto Sitoe, Orlando Zacarias

Abstract:

University education is influenced by several factors, ranging from the adoption of strategies to strengthen the whole process to the improvement of students' own academic performance. This work uses data mining techniques to develop a predictive model to identify students with a tendency toward evasion (dropout) and retention. To this end, a database of real student data from the Department of University Admission (DAU) and the Department of Mathematics and Informatics (DMI) was used. The data comprised 388 undergraduate students admitted in the years 2009 to 2014. The Weka tool was used for model building, using three different techniques: k-nearest neighbors, random forest, and logistic regression. To allow training on multiple train-test splits, a cross-validation approach was employed with a varying number of folds. To reduce bias and variance and improve the performance of the models, the ensemble methods of bagging and stacking were used. After comparing the results obtained by the three classifiers, logistic regression using bagging with seven folds obtained the best performance, showing results above 90% in all evaluated metrics: accuracy, true positive rate, and precision. Retention is the most common tendency.
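The winning pipeline (logistic regression wrapped in bagging, evaluated with 7-fold cross-validation) has a direct scikit-learn analogue to the Weka setup used above. The data here are a synthetic stand-in for the 388 student records; feature semantics are assumed.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Synthetic stand-in for the 388 student records with 9 features
X, y = make_classification(n_samples=388, n_features=9, n_informative=5,
                           random_state=0)

# Logistic regression as the base learner, bagged over bootstrap resamples
model = BaggingClassifier(LogisticRegression(max_iter=1000),
                          n_estimators=10, random_state=0)
scores = cross_val_score(model, X, y, cv=7)   # 7-fold CV, as in the study
mean_acc = scores.mean()
```

On the real admission data, class labels would be the evasion/retention tendency and the reported metrics (accuracy, true positive rate, precision) could be obtained via `scoring=` arguments.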

Keywords: evasion and retention, cross-validation, bagging, stacking

Procedia PDF Downloads 78
6915 A Study on the Reinforced Earth Walls Using Sandwich Backfills under Seismic Loads

Authors: Kavitha A. S., L. Govindaraju

Abstract:

Reinforced earth walls offer an excellent solution to many problems associated with earth retaining structures, especially under seismic conditions. The use of cohesive soils as backfill material reduces the cost of reinforced soil walls if proper drainage measures are taken. This paper presents a numerical study on the application of a technique called the sandwich technique in reinforced earth walls. In this technique, a thin layer of granular soil is placed above and below the reinforcement layer to mobilise interface friction, and the remaining portion of the backfill is filled with the existing in-situ cohesive soil. A 6 m high reinforced earth wall has been analysed as a two-dimensional plane-strain finite element model. Three types of reinforcing elements, geotextile, geogrid and metallic strips, were used. The horizontal wall displacements and the tensile loads in the reinforcement were used as the criteria to evaluate the results at the end of the construction and dynamic excitation phases. To verify the effectiveness of the sandwich layer on the performance of the wall, the thickness of the sand fill surrounding the reinforcement was also varied. At the end of the construction stage, the wall with the sandwich-type backfill yielded lower displacements than the wall with cohesive soil as backfill. Also, with the sandwich backfill, the reinforcement loads reduced substantially compared with the wall with cohesive soil as backfill. Further, it is found that the sandwich technique as backfill and geogrid as reinforcement are a good combination for reducing the deformations of geosynthetic reinforced walls during seismic loading.

Keywords: geogrid, geotextile, reinforced earth, sandwich technique

Procedia PDF Downloads 282
6914 Spatial Interpolation Technique for the Optimisation of Geometric Programming Problems

Authors: Debjani Chakraborty, Abhijit Chatterjee, Aishwaryaprajna

Abstract:

Posynomials, a special class of polynomial-like functions with singularities, pose difficulties when solving geometric programming problems. In this paper, a methodology is proposed for obtaining extreme values of geometric programming problems by an nth-degree polynomial interpolation technique. The main idea is to optimise the posynomial by fitting a best-approximating polynomial that has continuous gradient values throughout the range of the function. The approximating polynomial is smoothed to remove the discontinuities present in the feasible region and the objective function. This spatial interpolation method is capable of optimising univariate and multivariate geometric programming problems. An example considering a bivariate nonlinear geometric programming problem is solved to demonstrate the robustness of the methodology. The method is also applicable to signomial programming problems.
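To make the idea concrete, the toy sketch below (not from the paper) interpolates the univariate posynomial f(x) = x + 1/x, which is singular at x = 0, by a smooth polynomial over the feasible range and then locates the extremum of the interpolant; the choice of Chebyshev nodes is an assumption made here purely to keep the high-degree fit well behaved:

```python
import math

def lagrange(xs, ys, x):
    """Evaluate the Lagrange interpolating polynomial through (xs, ys) at x."""
    total = 0.0
    for i, (xi, yi) in enumerate(zip(xs, ys)):
        term = yi
        for j, xj in enumerate(xs):
            if j != i:
                term *= (x - xj) / (xi - xj)
        total += term
    return total

def f(x):
    return x + 1.0 / x   # a simple posynomial; singular at x = 0, minimum f(1) = 2

# Interpolate on the feasible range [0.5, 2], away from the singularity
a, b, n = 0.5, 2.0, 12
xs = [(a + b) / 2 + (b - a) / 2 * math.cos((2 * k + 1) * math.pi / (2 * n))
      for k in range(n)]
ys = [f(x) for x in xs]

# The interpolant has continuous gradients, so a plain grid search finds its extremum
grid = [a + (b - a) * k / 1000 for k in range(1001)]
xmin = min(grid, key=lambda x: lagrange(xs, ys, x))
```

The recovered extremum sits near the true minimiser x = 1 with value close to 2, illustrating how optimising the smooth fit stands in for optimising the singular posynomial.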

Keywords: geometric programming problem, multivariate optimisation technique, posynomial, spatial interpolation

Procedia PDF Downloads 366
6913 Design and Implementation a Platform for Adaptive Online Learning Based on Fuzzy Logic

Authors: Budoor Al Abid

Abstract:

Educational systems are increasingly provided as open online services that give guidance and support to individual learners. To adapt a learning system, a proper evaluation must be made. This paper builds an evaluation model, the Fuzzy C-Means Adaptive System (FCMAS), based on data mining techniques to assess the difficulty of questions. The following steps were implemented. First, a dataset from an online international learning system (slepemapy.cz) was used; it contains over 1,300,000 records with 9 features describing students, questions, and answers, together with feedback evaluation. Next, normalization was applied as a preprocessing step. The FCM clustering algorithm was then used to grade the difficulty of the questions, producing data labelled into three clusters according to the highest membership weight (easy, intermediate, difficult); the FCM algorithm assigns a label to every question. Finally, a Random Forest (RF) classifier was constructed on the clustered dataset, using 70% of the data for training and 30% for testing; the model achieves a 99.9% accuracy rate. This approach improves adaptive e-learning systems because it depends on student behavior and gives more accurate results in the evaluation process than systems that rely on feedback alone.
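As an illustration of the clustering step, the minimal fuzzy c-means sketch below (pure Python, with made-up 1-D difficulty scores rather than the slepemapy.cz features) partitions questions into three fuzzy clusters; the highest membership weight then gives each question an easy/intermediate/difficult label:

```python
def fuzzy_c_means(data, centers, m=2.0, iters=100):
    """Minimal 1-D fuzzy c-means: returns refined centers and one membership row per point."""
    for _ in range(iters):
        u = []
        for x in data:
            d = [abs(x - c) for c in centers]
            if any(di == 0.0 for di in d):            # point coincides with a center
                u.append([1.0 if di == 0.0 else 0.0 for di in d])
                continue
            # Membership update: u_ik = 1 / sum_j (d_ik / d_jk)^(2/(m-1))
            row = [1.0 / sum((d[i] / d[j]) ** (2.0 / (m - 1.0))
                             for j in range(len(centers)))
                   for i in range(len(centers))]
            u.append(row)
        # Center update: v_i = sum_k u_ik^m x_k / sum_k u_ik^m
        centers = [sum(u[k][i] ** m * data[k] for k in range(len(data))) /
                   sum(u[k][i] ** m for k in range(len(data)))
                   for i in range(len(centers))]
    return centers, u

# Hypothetical per-question difficulty scores (0 = trivial, 1 = hardest)
scores = [0.10, 0.15, 0.20, 0.50, 0.55, 0.60, 0.90, 0.95, 1.00]
centers, u = fuzzy_c_means(scores, centers=[0.0, 0.5, 1.0])
names = ["easy", "intermediate", "difficult"]
labels = [names[max(range(3), key=lambda i: row[i])] for row in u]
```

Unlike hard k-means, every question keeps a graded membership in all three clusters, which is what lets the adaptive system treat borderline questions differently from clear-cut ones.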

Keywords: machine learning, adaptive, fuzzy logic, data mining

Procedia PDF Downloads 192
6912 Impact of Relaxing Incisions on Maxillofacial Growth Following Sommerlad–Furlow Modified Technique in Patients with Isolated Cleft Palate: A Preliminary Comparative Study

Authors: Sadam Elayah, Yang Li, Bing Shi

Abstract:

Background: The impact of relaxing incisions on maxillofacial growth during palatoplasty remains a topic of debate, and further research is needed to fully understand their effects. The current study is the first long-term study aiming to assess the maxillofacial growth of patients with isolated cleft palate following the Sommerlad-Furlow modified (S.F) technique and to estimate the impact of relaxing incisions on maxillofacial growth following the S.F technique in these patients. Methods: A total of 85 participants were included. Fifty-five patients with non-syndromic isolated soft and hard cleft palate underwent primary palatoplasty with our technique: 30 received the Sommerlad-Furlow modified technique with relaxing incisions (S.F+R.I group) and 25 received it without relaxing incisions (S.F-R.I group), with no significant difference between them regarding cleft type, cleft width, and age at repair. The remaining 30 were normal participants with a skeletal class I pattern (C group). The control group was matched with the study groups in number, age, and sex. All study variables were measured using stable landmarks, comprising 12 linear and 10 angular variables. Results: The mean ages at collection of cephalograms were 6.03±0.80 years in the S.F+R.I group, 5.96±0.76 in the S.F-R.I group, and 5.91±0.87 in the C group. Regarding the cranial base, there were no statistically significant differences between the three groups in S-N and S-N-Ba. The S.F+R.I group had a significantly shorter S-Ba than the S.F-R.I and C groups (P = 0.01), while there was no statistically significant difference between the S.F-R.I and C groups (P = 0.80). Regarding the skeletal maxilla, there was no significant difference between the S.F+R.I and S.F-R.I groups in the linear measurements (N-ANS, S-PM & SN-PP) except Co-A: the S.F+R.I group had a significantly shorter Co-A than the S.F-R.I and C groups (P < 0.01). In the angular measurements, the S.F+R.I group had a significantly smaller SNA angle than the S.F-R.I and C groups (P < 0.01). Regarding the mandible, there were no statistically significant differences in any linear or angular mandibular measurements between the S.F+R.I and S.F-R.I groups. Regarding the intermaxillary relation, the S.F+R.I group showed significant differences in Co-Gn - Co-A and ANB compared to the S.F-R.I and C groups (P < 0.01). There was no statistically significant difference in PP-MP among the three groups. Conclusion: As a preliminary report, the Sommerlad-Furlow modified technique without relaxing incisions was found to give good maxillary positioning in the face and a satisfactory intermaxillary relationship compared to the technique with relaxing incisions.

Keywords: relaxing incisions, cleft palate, palatoplasty, maxillofacial growth

Procedia PDF Downloads 108
6911 Shear Strengthening of Reinforced Concrete Flat Slabs Using Prestressing Bars

Authors: Haifa Saleh, Kamiran Abduka, Robin Kalfat, Riadh Al-Mahaidi

Abstract:

The effectiveness of using pre-stressing steel bars for shear strengthening of high-strength reinforced concrete (RC) slabs was assessed. Two large-scale RC slabs were tested: one without shear reinforcement and a second strengthened against punching shear failure using pre-stressing steel bars. The two slabs had the same dimensions, flexural reinforcement ratio, loading, and support arrangements. The experimental program, including the strengthening method, setup, and instrumentation, is described in this paper. The experimental results are analyzed and discussed in terms of the structural behavior of the RC slabs, the performance of the pre-stressing steel bolts, and the failure modes. The results confirmed that the shear strengthening technique increased the shear capacity, ductility, and yield capacity of the slab by up to 15%, 44%, and 22%, respectively, compared to the unstrengthened slab. The strengthening technique also successfully changed the failure mode from brittle punching shear to ductile flexural failure. The Vic3D digital image correlation system (photogrammetry) was also used in this research. This technique holds several advantages over traditional contact instrumentation: it is inexpensive, produces results that are simple to analyze, and is a remote visualization technique. The displacement profile along the span of the slab and the rotation were obtained and compared with the results from traditional sensors. The performance of the photogrammetry technique was very good, and the two sets of measurements were in very close agreement.

Keywords: flat slab, photogrammetry, punching shear, strengthening

Procedia PDF Downloads 159
6910 Eradication of Apple mosaic virus from Corylus avellana L. via Cryotherapy and Confirmation of Virus-Free Plants via Reverse Transcriptase Polymerase Chain Reaction

Authors: Ergun Kaya

Abstract:

Apple mosaic virus (ApMV) is an ilarvirus causing harmful damage and product losses in many plant species. Because meristem tissue lacks xylem and phloem vessels, the meristems used for meristem culture are generally virus-free; however, meristem culture alone is sometimes not sufficient for virus elimination. Cryotherapy, a new method based on cryogenic techniques, is used for virus elimination. In this technique, 0.1-0.3 mm meristems are excised from the organized shoot apex of a selected in vitro donor plant and frozen in liquid nitrogen (-196 °C) using a suitable cryogenic technique. The aim of this work was to develop an efficient procedure for producing ApMV-free hazelnut via cryotherapy and to confirm virus-free plants using the Reverse Transcriptase-PCR technique. 100% virus-free plantlets were obtained using a droplet-vitrification method involving cold hardening of in vitro hazelnut cultures, a 24-hour sucrose preculture of meristems on MS medium supplemented with 0.4 M sucrose, and a 90 min PVS2 treatment in droplets.

Keywords: droplet vitrification, hazelnut, liquid nitrogen, PVS2

Procedia PDF Downloads 156
6909 Use of Locally Effective Microorganisms in Conjunction with Biochar to Remediate Mine-Impacted Soils

Authors: Thomas F. Ducey, Kristin M. Trippe, James A. Ippolito, Jeffrey M. Novak, Mark G. Johnson, Gilbert C. Sigua

Abstract:

The Oronogo-Duenweg mining belt, approximately 20 square miles around the Joplin, Missouri area, is a designated United States Environmental Protection Agency Superfund site due to lead contamination of soil and groundwater by former mining and smelting operations. Over almost a century of mining (from 1848 to the late 1960s), an estimated ten million tons of cadmium-, lead-, and zinc-containing material were deposited on approximately 9,000 acres. At sites that have undergone remediation, in which the O, A, and B horizons have been removed along with the lead contamination, the exposed C horizon remains recalcitrant to revegetation efforts. These sites also suffer from poor soil microbial activity, as measured by soil extracellular enzymatic assays, though 16S ribosomal ribonucleic acid (rRNA) analysis indicates that microbial diversity is equal to that of sites that have avoided mine-related contamination. Soil analysis reveals low soil organic carbon along with high levels of bioavailable zinc, reflecting the poor soil fertility and low microbial activity. Our study examined the use of several materials to restore and remediate these sites, with the goal of improving soil health. The materials, and their purposes in the study, were as follows: manure-based biochar to bind zinc and other heavy metals responsible for phytotoxicity; locally sourced biosolids and compost to incorporate organic carbon into the depleted soils; and effective microorganisms harvested from nearby pristine sites to provide a stable community for nutrient cycling in the newly composited 'soil material'. Our results indicate that the four materials used in conjunction provide the greatest benefit to these mine-impacted soils, based on above-ground biomass, microbial biomass, and soil enzymatic activities.

Keywords: locally effective microorganisms, biochar, remediation, reclamation

Procedia PDF Downloads 214
6908 Novel Formal Verification Based Coverage Augmentation Technique

Authors: Surinder Sood, Debajyoti Mukherjee

Abstract:

Formal verification techniques have become widely popular in pre-silicon verification as an alternative to constrained-random simulation-based techniques. This paper proposes a novel formal-verification-based coverage augmentation technique for verifying complex RTL designs faster. The proposed approach relies on merging the coverage obtained from simulation with that obtained from formal verification. Besides this, the functional qualification framework not only helps improve coverage at a faster pace but also aids in maturing and qualifying the formal verification infrastructure. The proposed technique has helped achieve faster verification sign-off, resulting in faster time-to-market. The design chosen had complex control and data paths and many configurable options to meet multiple specification needs. The flow is generic and tool-independent, making it much easier to leverage across projects and designs.
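The core of the augmentation idea, merging coverage from the two engines, can be sketched with plain set arithmetic; the bin names below are invented for illustration and do not come from the paper:

```python
# Coverage bins hit by constrained-random simulation (hypothetical bin names)
sim_cov = {"fsm_idle", "fsm_run", "fifo_full", "pkt_len_max"}
# Coverage bins closed by the formal verification engine
formal_cov = {"fsm_run", "fifo_empty", "err_inject", "pkt_len_max"}
# Full coverage model for the block
total_bins = {"fsm_idle", "fsm_run", "fifo_full", "fifo_empty",
              "err_inject", "pkt_len_max", "pkt_len_min"}

merged = sim_cov | formal_cov        # augmented coverage
holes = total_bins - merged          # residue to target with new tests or properties

sim_pct = 100.0 * len(sim_cov) / len(total_bins)
merged_pct = 100.0 * len(merged) / len(total_bins)
```

The union closes bins that simulation alone never reaches, and the remaining holes become the work list for targeted stimulus or additional properties, which is what speeds up sign-off.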

Keywords: COI (cone of influence), coverage, formal verification, fault injection

Procedia PDF Downloads 117
6907 Solar Power Generation in a Mining Town: A Case Study for Australia

Authors: Ryan Chalk, G. M. Shafiullah

Abstract:

Climate change is a pertinent issue facing governments and societies around the world. The industrial revolution has resulted in a steady increase in the average global temperature. The mining and energy production industries have been significant contributors to this change, prompting governments to intervene by promoting low-emission technology within these sectors. This paper first reviews the energy problem in Australia and the mining sector, with a focus on the energy requirements and production methods utilised in Western Australia (WA). Renewable energy in the form of utility-scale solar photovoltaics (PV) provides a solution to these problems by providing emission-free energy that can supplement the existing natural gas turbines in operation at the proposed site. This research presents a custom renewable solution for the mining site, considering the specific township network, local weather conditions, and seasonal load profiles. A summary of the required PV output is presented, supplying slightly over 50% of the town's power requirements during the peak (summer) period and close to full coverage in the trough (winter) period. DIgSILENT PowerFactory software has been used to simulate the characteristics of the existing infrastructure and to produce results for PV integration. Large-scale PV penetration in the network introduces technical challenges, including voltage deviation, increased harmonic distortion, increased available fault current, and low power factor. Results also show that cloud cover has a dramatic and unpredictable effect on the output of a PV system. The preliminary analyses conclude that mitigation strategies are needed to overcome voltage deviations, unacceptable levels of harmonics, excessive fault current, and low power factor. Mitigation strategies are proposed to control these issues, predominantly through the use of high-quality, purpose-built inverters. Results show that inverters with harmonic filtering reduce harmonic injection to levels acceptable under Australian standards. Furthermore, configuring inverters to supply both active and reactive power helps mitigate low power factor. The use of FACTS devices (SVC and STATCOM) also reduces harmonics and improves the power factor of the network, and finally, energy storage helps to smooth the power supply.

Keywords: climate change, mitigation strategies, photovoltaic (PV), power quality

Procedia PDF Downloads 163
6906 Treatment of Rice Industry Waste Water by Flotation-Flocculation Method

Authors: J. K. Kapoor, Shagufta Jabin, H. S. Bhatia

Abstract:

Polyamine flocculants were synthesized by polycondensation of diphenylamine and epichlorohydrin using 1,2-diaminoethane as a modifying agent. The polyelectrolytes were prepared with epichlorohydrin-diphenylamine molar ratios of 1:1, 1.5:1, 2:1, and 2.5:1. The flocculation performance of these polyelectrolytes was evaluated on rice industry waste water, with the polyelectrolytes used in conjunction with alum in the coagulation-flocculation process. Prior to coagulation-flocculation, an air flotation technique was applied to remove the oil and grease content from the waste water. Significant improvement was observed after air flotation, which removed 91.7% of the oil and grease from the rice industry waste water. After the coagulation-flocculation step, the polyelectrolyte with an epichlorohydrin-diphenylamine molar ratio of 1.5:1 showed the best results for pollutant removal. The highest turbidity and TSS removal efficiencies with this polyelectrolyte were found to be 97.5% and 98.2%, respectively. The evaluations also reveal 86.8% removal of COD and 87.5% removal of BOD from the rice industry waste water. Thus, we demonstrate an optimized coagulation-flocculation technique appropriate for waste water treatment.

Keywords: coagulation, flocculation, air flotation technique, polyelectrolyte, turbidity

Procedia PDF Downloads 473
6905 Discerning Divergent Nodes in Social Networks

Authors: Mehran Asadi, Afrand Agah

Abstract:

In data mining, partitioning is used as a fundamental tool for classification. With the help of partitioning, we study the structure of data, which allows us to envision decision rules that can be applied in classification trees. In this research, we used an online social network dataset and all of its attributes (e.g., node features, labels, etc.) to determine what constitutes an above-average chance of being a divergent node. We used the R statistical computing language to conduct the analyses in this report. The data were obtained from the UC Irvine Machine Learning Repository. This research introduces the basic concepts of classification in online social networks. In this work, we address overfitting and describe different approaches for evaluation and performance comparison of classification methods. In classification, the main objective is to categorize items and assign them to groups based on their properties and similarities. In data mining, recursive partitioning is utilized to probe the structure of a data set, allowing us to envision decision rules and apply them to classify data into several groups. Estimating densities is hard, especially in high dimensions with limited data; we do not know the densities, but we can estimate them using classical techniques. First, we calculated the correlation matrix of the dataset to see whether any predictors are highly correlated with one another. From the correlation coefficients of the predictor variables, we see that density is strongly correlated with transitivity. We initialized a data frame to easily compare the quality of the resulting classification methods and utilized decision trees (with k-fold cross-validation to prune the tree). A decision tree is a non-parametric classification method that uses a set of rules to predict that each observation belongs to the most commonly occurring class label of the training data. Our method aggregates many decision trees to create an optimized model that is not susceptible to overfitting. When using a decision tree, however, it is important to use cross-validation to prune the tree in order to narrow it down to the most important variables.
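The correlation check described above boils down to Pearson's r; a minimal pure-Python sketch (the density/transitivity values here are hypothetical, not the UCI dataset's) is:

```python
import math

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical per-node graph metrics, near-linearly related
density      = [0.10, 0.20, 0.30, 0.40, 0.50]
transitivity = [0.15, 0.22, 0.33, 0.41, 0.52]
r = pearson(density, transitivity)
```

A value of r close to 1, as here, is what flags a predictor pair as redundant before the tree-building step (in the study itself this is done with R's `cor`).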

Keywords: online social networks, data mining, social cloud computing, interaction and collaboration

Procedia PDF Downloads 153
6904 A Methodology for Developing New Technology Ideas to Avoid Patent Infringement: F-Term Based Patent Analysis

Authors: Kisik Song, Sungjoo Lee

Abstract:

With the growing importance of intangible assets, the impact of patent infringement on a company's business has become more evident. Accordingly, it is essential for firms to estimate the risk of patent infringement before developing a technology and to create new technology ideas that avoid this risk. Recognizing these needs, several attempts have been made to help develop new technology opportunities, most of them focused on identifying emerging vacant technologies through patent analysis. In these studies, the IPC (International Patent Classification) system or keywords obtained by applying text mining to patent documents were generally used to define vacant technologies. Unlike those studies, this study adopts F-term, which classifies patent documents according to the technical features of the inventions described in them. Since the technical features are analyzed from various perspectives, F-term provides more detailed information about technologies than IPC and more systematic information than keywords. Therefore, if well utilized, it can be a useful guideline for creating new technology ideas. Recognizing this potential, this paper suggests a novel approach to developing new technology ideas that avoid patent infringement based on F-term. For this purpose, we first collected data about F-terms and then applied text mining to the descriptions of their classification criteria and attributes. From the text mining results, we could identify other technologies with technical features similar to those of the existing, patented technology. Finally, we compare the technologies and extract the technical features that are commonly used in other technologies but have not been used in the existing one. These features are presented in terms of 'purpose', 'function', 'structure', 'material', 'method', 'processing and operation procedure' and 'control means', and so are useful for creating new technology ideas that avoid infringing the patent rights of other companies. Theoretically, this is one of the earliest attempts to adopt F-term in patent analysis; the proposed methodology shows how to best take advantage of F-term and its wealth of technical information. In practice, the proposed methodology can be valuable in the ideation process for successful product and service innovation without infringing the patents of other companies.
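The final extraction step, finding features shared by the similar technologies but absent from the patented one, reduces to set operations. The sketch below uses invented F-term-style features purely for illustration (the viewpoint names follow the abstract; the feature values are hypothetical):

```python
# Hypothetical F-term-style feature profiles, keyed by technical viewpoint
existing = {
    "purpose": {"waterproofing"},
    "material": {"polyurethane"},
    "method": {"spray_coating"},
}
similar_techs = [
    {"purpose": {"waterproofing"},
     "material": {"epoxy"},
     "method": {"spray_coating", "heat_curing"}},
    {"purpose": {"waterproofing", "insulation"},
     "material": {"epoxy"},
     "method": {"heat_curing"}},
]

# Features common to all similar technologies but absent from the existing one
candidates = {}
for view, own in existing.items():
    shared = set.intersection(*(t[view] for t in similar_techs))
    ideas = shared - own
    if ideas:
        candidates[view] = ideas
```

Each surviving feature (here an alternative material and processing method) is a seed for a new technology idea that stays clear of the focal patent's claimed features.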

Keywords: patent infringement, new technology ideas, patent analysis, F-term

Procedia PDF Downloads 263
6903 Numerical Simulation of Structural Behavior of NSM CFRP Strengthened RC Beams Using Finite Element Analysis

Authors: Faruk Ortes, Baris Sayin, Tarik Serhat Bozkurt, Cemil Akcay

Abstract:

The technique using near-surface mounted (NSM) carbon fiber-reinforced polymer (CFRP) composites has proved to be a reliable strengthening technique. However, the effects of different parameters on the use of NSM CFRP are not yet fully understood. This study focuses on the development of a numerical model that can predict the behavior of reinforced concrete (RC) beams strengthened with NSM CFRP rods under bending loads, and evaluates the influence of parameters such as CFRP rod size and filling material type. For this purpose, three different models were developed and implemented in the ANSYS® software using finite element analysis (FEA). The numerical results indicate that CFRP rod size and filling material type are significant factors in the behavior of the analyzed RC beams.

Keywords: numerical model, FEA, RC beam, NSM technique, CFRP rod, filling material

Procedia PDF Downloads 595
6902 Expanding Trading Strategies By Studying Sentiment Correlation With Data Mining Techniques

Authors: Ved Kulkarni, Karthik Kini

Abstract:

This experiment aims to understand how the media affect power markets in the mainland United States and to measure the reaction time between news updates and actual price movements. We take into account electric utility companies trading on the NYSE and exclude companies that are more politically involved and move with higher sensitivity to politics. A scraper checks for any news related to predefined keywords stored for each specific company. Based on this, a classifier allocates the effect into five categories: positive, negative, highly optimistic, highly negative, or neutral. The effect on the corresponding price movement is then studied to understand the response time. Based on the observed response time, neural networks are trained to understand and react to changing market conditions, achieving the best strategy in each market. The stock trader would be day trading in the first phase and making option strategy predictions based on the Black-Scholes model. The expected result is an AI-based system that adjusts trading strategies within the market response time of each price movement.
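For reference, the Black-Scholes price of a European call, on which the planned option-strategy phase would rely, can be computed from the standard closed form with nothing but the math module; the parameter values below are illustrative only:

```python
import math

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def bs_call(S, K, r, sigma, T):
    """Black-Scholes price of a European call option."""
    d1 = (math.log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    return S * norm_cdf(d1) - K * math.exp(-r * T) * norm_cdf(d2)

# Illustrative parameters: spot 100, strike 100, 5% rate, 20% volatility, 1 year
price = bs_call(S=100.0, K=100.0, r=0.05, sigma=0.20, T=1.0)
```

In the proposed system the sentiment-driven forecasts would feed the volatility and horizon inputs of this pricing step; the formula itself is the textbook result, not a contribution of the abstract.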

Keywords: data mining, language processing, artificial neural networks, sentiment analysis

Procedia PDF Downloads 11