Search results for: microscopic techniques
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 6928

6418 Combining Nitrocarburisation and Dry Lubrication for Improving Component Lifetime

Authors: Kaushik Vaideeswaran, Jean Gobet, Patrick Margraf, Olha Sereda

Abstract:

Nitrocarburisation is a surface hardening technique often applied to improve the wear resistance of steel surfaces. It is considered a promising solution compared with other processes such as flame spraying, owing to the formation of a diffusion layer that provides mechanical integrity, as well as its cost-effectiveness. To improve other tribological properties of the surface, such as the coefficient of friction (COF), dry lubricants are utilized. Currently, steel components that rely on either of these techniques individually face the limitations of each: a high COF for nitrocarburized surfaces and low wear resistance for dry lubricant coatings. To this end, the current study involves the creation of a hybrid surface by impregnating a dry lubricant onto a nitrocarburized surface. The mechanical strength and hardness of Gerster SA’s nitrocarburized surfaces, combined with the impregnation of the porous outermost layer with a solid lubricant, create a hybrid surface possessing outstanding wear resistance, a low friction coefficient, and high adherence to the substrate. Gerster SA has state-of-the-art technology for the surface hardening of various steels. Through their expertise in the field, the nitrocarburizing process parameters (atmosphere, temperature, dwelling time) were optimized to obtain samples with a distinct porous structure (in terms of size, shape, and density), as observed by metallographic and microscopic analyses. The porosity thus obtained is suitable for the impregnation of a dry lubricant. A commercially available dry lubricant with a thermoplastic matrix was employed for the impregnation process, which was optimized to obtain a void-free interface with the surface of the nitrocarburized layer (henceforth called the hybrid surface).
In parallel, metallic samples without nitrocarburisation were also impregnated with the same dry lubricant as a reference (henceforth called the reference surface). The reference and the nitrocarburized surfaces, with and without the dry lubricant, were tested for their tribological behavior by sliding against a quenched steel ball using a nanotribometer. Without any lubricant, the nitrocarburized surface showed a wear rate 5x lower than the reference metal. In the presence of a thin film of dry lubricant (< 2 micrometers) and under the application of high loads (500 mN or ~800 MPa), while the COF for the reference surface increased from ~0.1 to > 0.3 within 120 m, the hybrid surface retained a COF < 0.2 for over 400 m of sliding. In addition, while the steel ball sliding against the reference surface showed heavy wear, the corresponding ball sliding against the hybrid surface showed very limited wear. Observations of the sliding tracks in the hybrid surface using electron microscopy showed the presence of the nitrocarburized nodules as well as the lubricant, whereas no traces of the lubricant were found in the sliding track on the reference surface. In this manner, the clear advantage of combining nitrocarburisation with the impregnation of a dry lubricant to form a hybrid surface has been demonstrated.

Keywords: dry lubrication, hybrid surfaces, improved wear resistance, nitrocarburisation, steels

Procedia PDF Downloads 113
6417 Kinetic and Mechanistic Study on the Degradation of Typical Pharmaceutical and Personal Care Products in Water by Using Carbon Nanodots/C₃N₄ Composite and Ultrasonic Irradiation

Authors: Miao Yang

Abstract:

PPCPs (pharmaceutical and personal care products) in water have become an environmental pollutant of increasing concern. Techniques for the degradation of PPCPs have therefore become a hotspot in the water pollution control field. Since common degradation techniques suffer several disadvantages, such as low degradation efficiency for certain PPCPs (e.g., ibuprofen and carbamazepine), this proposal adopts a combined technique using a CDs (carbon nanodots)/C₃N₄ composite and ultrasonic irradiation to mitigate or overcome these shortcomings. A significant open scientific problem in the study of PPCP degradation is that the mechanism involving PPCPs, the major reactants, and the interfacial active sites is not yet clear. This work aims to solve this problem using both theoretical and experimental methodologies. First, optimized parameters will be obtained by evaluating the kinetics and oxidation efficiency under different conditions. The competition between H₂O₂ and PPCPs for HO• will be elucidated, after which the degradation mechanism of PPCPs by the synergy of the CDs/C₃N₄ composite and ultrasonic irradiation will be proposed. Finally, a sonolysis-adsorption-catalysis coupling mechanism will be established, providing the theoretical basis and technical support for developing new, efficient degradation techniques for PPCPs in the future.
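Kinetic evaluations of this kind are commonly modeled with pseudo-first-order kinetics, ln(C₀/Cₜ) = k·t. A minimal sketch of how the apparent rate constant k can be extracted by linear regression (the decay data below are illustrative, not results from this study):

```python
import math

def pseudo_first_order_k(times, concentrations):
    """Least-squares slope of ln(C0/Ct) versus t gives the apparent rate constant k."""
    c0 = concentrations[0]
    ys = [math.log(c0 / c) for c in concentrations]
    n = len(times)
    t_mean = sum(times) / n
    y_mean = sum(ys) / n
    num = sum((t - t_mean) * (y - y_mean) for t, y in zip(times, ys))
    den = sum((t - t_mean) ** 2 for t in times)
    return num / den

# Illustrative sonocatalytic decay: C(t) = C0 * exp(-0.05 t)
times = [0, 10, 20, 30, 40]                       # minutes
conc = [1.0 * math.exp(-0.05 * t) for t in times]
k = pseudo_first_order_k(times, conc)
print(round(k, 3))  # → 0.05
```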

Keywords: carbon nanodots/C₃N₄, pharmaceutical and personal care products, ultrasonic irradiation, hydroxyl radical, heterogeneous catalysis

Procedia PDF Downloads 166
6416 An Application of Modified M-out-of-N Bootstrap Method to Heavy-Tailed Distributions

Authors: Hannah F. Opayinka, Adedayo A. Adepoju

Abstract:

This study extends prior work on modifying the existing m-out-of-n (moon) bootstrap method for heavy-tailed distributions, in which the modified m-out-of-n (mmoon) bootstrap was proposed as an alternative to the existing moon technique. In this study, both the moon and mmoon techniques were applied to two real income datasets, which followed Lognormal and Pareto distributions respectively, with finite variances. The performances of the two techniques were compared using the Standard Error (SE) and Root Mean Square Error (RMSE). The findings showed that mmoon outperformed the moon bootstrap, with smaller SEs and RMSEs for all the sample sizes considered in the two datasets.
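The m-out-of-n idea, resampling m < n observations to cope with heavy tails, can be sketched as follows (the subsample size m = floor(n^0.7), the Pareto sample, and the sqrt(m/n) rescaling are illustrative choices; the paper's mmoon modification is not reproduced here):

```python
import random
import statistics

def moon_bootstrap_se(data, m, reps=2000, seed=42):
    """Standard error of the sample mean via the m-out-of-n bootstrap:
    draw `reps` resamples of size m (with replacement), take the standard
    deviation of their means, and rescale to refer to the size-n estimator."""
    rng = random.Random(seed)
    n = len(data)
    means = [statistics.fmean(rng.choices(data, k=m)) for _ in range(reps)]
    return statistics.stdev(means) * (m / n) ** 0.5

# Illustrative heavy-tailed sample (Pareto-like income data)
rng = random.Random(0)
data = [rng.paretovariate(2.5) for _ in range(500)]
m = int(len(data) ** 0.7)   # one common subsample-size choice
print(round(moon_bootstrap_se(data, m), 4))
```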

Keywords: Bootstrap, income data, lognormal distribution, Pareto distribution

Procedia PDF Downloads 178
6415 Innovative Acoustic Emission Techniques for Concrete Health Monitoring

Authors: Rahmat Ali, Beenish Khan, Aftabullah, Abid A. Shah

Abstract:

This research investigates the wide range of events captured by acoustic emission (AE) sensors in concrete cubes subjected to different stress conditions during loading and unloading. A total of 27 specimens were prepared and tested: 18 cubic (6”×6”×6”) and nine cylindrical (4”×8”) specimens molded from three batches of concrete with w/c ratios of 0.40, 0.50, and 0.60. The compressive strength of the concrete was determined from the cylinder specimens. The deterioration of the concrete was evaluated using the occurrence of the felicity and Kaiser effects at each stress condition. It was found that the number of acoustic emission hits usually increased as damage increased. Additionally, the correlation between the AE measurements and the applied load was determined by plotting the normalized values. The influence of w/c on the sensitivity of the AE technique in detecting concrete damage was also investigated.
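The felicity and Kaiser effects used above reduce to a simple ratio: the load at which AE activity resumes on reloading divided by the previous maximum load. A ratio of at least 1 indicates the Kaiser effect (sound concrete); below 1, the felicity effect (damage). The load values below are hypothetical:

```python
def felicity_ratio(load_at_ae_onset, previous_max_load):
    """Felicity ratio for one reload cycle; >= 1.0 implies the Kaiser effect holds."""
    return load_at_ae_onset / previous_max_load

# Hypothetical reload cycles (kN): (AE onset load, prior peak load)
cycles = [(52.0, 50.0), (48.0, 60.0), (42.0, 70.0)]
ratios = [round(felicity_ratio(a, p), 2) for a, p in cycles]
print(ratios)  # → [1.04, 0.8, 0.6]
```

The declining ratios in the later cycles mimic the progressive damage the study tracks with increasing stress.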

Keywords: acoustic emission, concrete, felicity ratio, sensors

Procedia PDF Downloads 351
6414 Recommender Systems Using Ensemble Techniques

Authors: Yeonjeong Lee, Kyoung-jae Kim, Youngtae Kim

Abstract:

This study proposes a novel recommender system that uses data mining and multi-model ensemble techniques to enhance recommendation performance by reflecting the user's precise preferences. The proposed model consists of two steps. In the first step, the study uses logistic regression, decision trees, and artificial neural networks to predict customers who are highly likely to purchase products in each product group, then combines the results of each predictor using multi-model ensemble techniques such as bagging and bumping. In the second step, the study uses market basket analysis to extract association rules for co-purchased products. Finally, the system selects customers who are highly likely to purchase products in each product group and, through the two steps above, recommends appropriate products from the same or different product groups to them. We tested the usability of the proposed system using a prototype and real-world transaction and profile data. In addition, we surveyed user satisfaction with the recommended product list from the proposed system versus randomly selected product lists. The results show that the proposed system may be useful in real-world online shopping stores.
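The bagging step in the first stage can be sketched as majority voting over models trained on bootstrap resamples. In this minimal sketch a toy threshold rule stands in for the logistic regression, decision tree, and neural network learners; the feature values and labels are illustrative:

```python
import random

def train_threshold_classifier(samples):
    """Toy learner: predicts 'likely buyer' (1) when the feature exceeds
    the mean of its training sample."""
    threshold = sum(x for x, _ in samples) / len(samples)
    return lambda x: 1 if x > threshold else 0

def bagging_predict(data, x, n_models=25, seed=1):
    """Bagging: train one toy model per bootstrap resample, then majority-vote."""
    rng = random.Random(seed)
    votes = 0
    for _ in range(n_models):
        boot = [rng.choice(data) for _ in data]
        votes += train_threshold_classifier(boot)(x)
    return 1 if votes > n_models / 2 else 0

# Feature: e.g. normalized purchase frequency; label: bought in the product group
data = [(0.1, 0), (0.2, 0), (0.3, 0), (0.7, 1), (0.8, 1), (0.9, 1)]
print(bagging_predict(data, 0.85), bagging_predict(data, 0.15))  # → 1 0
```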

Keywords: product recommender system, ensemble technique, association rules, decision tree, artificial neural networks

Procedia PDF Downloads 281
6413 Conducting Glove Leathers Prepared through in-situ Polymerization of Pyrrole

Authors: Wegene Demisie Jima

Abstract:

Leather is a durable and flexible material used for various purposes including clothing, footwear, upholstery, and gloves. However, using leather for smart product applications is a challenge since it is an electrically insulating material. Here, we report a simple method to produce conducting glove leathers using in-situ polymerization of pyrrole. The concentrations of pyrrole, ferric chloride, and anthraquinone-2-sulfonic acid sodium salt monohydrate were optimized to produce maximum conductivity in the treated leathers. The polypyrrole coating on the treated leathers was probed using FT-IR, X-ray diffraction, and electron microscopic analysis. FT-IR confirms the formation of polypyrrole on the leather surface as well as the presence of a prominent N-C stretching band. X-ray diffraction analysis suggests para-crystallinity in the PPy-treated leathers. We further demonstrate that the treated leathers, with a maximum conductivity of 7.4 S/cm, can be used for making conductive gloves for operating touch-screen devices, apart from other smart product applications.

Keywords: electrical conductivity, in-situ polymerization, pyrrole, smart product

Procedia PDF Downloads 172
6412 Failure Analysis of Electrode, Nozzle Plate, and Powder Injector during Air Plasma Spray Coating

Authors: Nemes Alexandra

Abstract:

The aim of the research is to develop an optimum microstructure for steel coatings on aluminum surfaces, for application on crankcase cylinder bores. For the proper design of the coating microstructure, it is important to control the plasma gun unit properly, so the maximum time the plasma gun can operate optimally before its components fail was determined. Objectives: to determine the optimal operating time of the plasma gun between renovations (each renovation involves replacing the tested components of the plasma gun: the electrode, nozzle plate, and powder injector). Methodology: plasma jet and particle flux analysis with PFI (a diagnostic tool for all kinds of thermal spraying processes), CT reconstruction and analysis of new and used plasma guns, failure analysis of electrodes, nozzle plates, and powder injectors, and microscopic examination of the coating microstructure. Contributions: as a result of the failure analysis detailed above, the use of the plasma gun was limited to 100 operating hours in order to obtain an optimal coating microstructure.

Keywords: APS, air plasma spray, failure analysis, electrode, nozzle plate, powder injector

Procedia PDF Downloads 105
6411 An Energy Transfer Fluorescent Probe System for Glucose Sensor at Biomimetic Membrane Surface

Authors: Hoa Thi Hoang, Stephan Sass, Michael U. Kumke

Abstract:

Concanavalin A (conA) is a protein that has been widely used in sensor systems based on its specific binding to α-D-glucose or α-D-mannose. For glucose sensors using conA, either intensity-based or lifetime-based fluorescence techniques are used. In this research, liposomes made from phospholipids were used as a biomimetic membrane system. In a first step, novel building blocks containing perylene-labeled glucose units were added to the system and used to decorate the surface of the liposomes. Upon binding of rhodamine-labeled conA to the glucose units at the biomimetic membrane surface, a Förster resonance energy transfer (FRET) system is formed, which combines the unique fluorescence properties of perylene (e.g., high fluorescence quantum yield, no triplet formation) with its high hydrophobicity, enabling efficient anchoring in membranes, to form a novel probe for the investigation of sugar-driven binding reactions at biomimetic surfaces. Two glucose-labeled perylene derivatives were synthesized with different spacer lengths between the perylene and glucose units in order to probe the binding of conA. The binding interaction was fully characterized using high-end fluorescence techniques. Steady-state and time-resolved fluorescence techniques (e.g., fluorescence depolarization) in combination with single-molecule fluorescence spectroscopy techniques (fluorescence correlation spectroscopy, FCS) were used to monitor the interaction with conA. Based on the fluorescence depolarization, the rotational correlation times, and the alteration in the diffusion coefficient (determined by FCS), the binding of conA to the liposomes carrying the probe was studied. Moreover, single-pair FRET experiments using pulsed interleaved excitation were used to characterize the binding of conA to the liposome in detail on a single-molecule level, avoiding averaging-out effects.
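The distance sensitivity that makes such a donor-acceptor pair useful follows the Förster relation E = R₀⁶/(R₀⁶ + r⁶). A quick numerical sketch (the 5 nm Förster radius is a typical order of magnitude, not the measured value for this perylene-rhodamine pair):

```python
def fret_efficiency(r_nm, r0_nm=5.0):
    """Förster energy-transfer efficiency for a donor-acceptor distance r."""
    return r0_nm ** 6 / (r0_nm ** 6 + r_nm ** 6)

# Efficiency falls off steeply around the Förster radius R0
for r in (2.5, 5.0, 10.0):
    print(r, round(fret_efficiency(r), 3))
```

At r = R₀ the efficiency is exactly 0.5; halving or doubling the distance pushes it to nearly 1 or nearly 0, which is why FRET reports binding-induced proximity changes so sensitively.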

Keywords: concanavalin A, FRET, sensor, biomimetic membrane

Procedia PDF Downloads 296
6410 Role of Natural Language Processing in Information Retrieval; Challenges and Opportunities

Authors: Khaled M. Alhawiti

Abstract:

This paper analyzes the role of natural language processing (NLP) in the context of automated data retrieval, automated question answering, and text structuring. NLP techniques are gaining wider acceptance in real-life applications and industrial concerns. There are various complexities involved in processing natural language text so as to satisfy the needs of decision makers. The paper begins by describing the qualities of NLP practices, then focuses on the challenges in natural language processing and discusses major NLP techniques. The last section describes opportunities and challenges for future research.

Keywords: data retrieval, information retrieval, natural language processing, text structuring

Procedia PDF Downloads 326
6409 Urban Analysis of the Old City of Oran and Its Building after an Earthquake

Authors: A. Zatir, A. Mokhtari, A. Foufa, S. Zatir

Abstract:

The city of Oran, like every other region of northern Algeria, is subject to frequent seismic activity. The study presented in this work is based on an analysis of the urban and architectural context of the city of Oran before the earthquake of 1790, and then attempts to deduce the differences between the old city before and after the earthquake. The analysis has, as a specific objective, to tap into the seismic history of the city of Oran in parallel with its urban history. The example of the citadel of Oran indicates that the constructions occupying the site of the old citadel may present elements of resistance to seismic effects. In-city observations of these structures showed the ingenuity of the techniques used by the ancient builders, including the good performance of domes and arches in resisting seismic forces.

Keywords: earthquake, citadel, performance, traditional techniques, constructions

Procedia PDF Downloads 291
6408 Nanomaterial Based Electrochemical Sensors for Endocrine Disrupting Compounds

Authors: Gaurav Bhanjana, Ganga Ram Chaudhary, Sandeep Kumar, Neeraj Dilbaghi

Abstract:

The main sources of endocrine disrupting compounds in the ecosystem are hormones, pesticides, phthalates, flame retardants, dioxins, personal-care products, coplanar polychlorinated biphenyls (PCBs), bisphenol A, and parabens. These endocrine disrupting compounds are responsible for learning disabilities, brain development problems, deformations of the body, cancer, reproductive abnormalities in females, and decreased sperm count in human males. Although the discharge of these chemical compounds into the environment cannot be stopped, their amounts can be reduced through proper evaluation and detection techniques. The available techniques for determining these endocrine disrupting compounds mainly include high performance liquid chromatography (HPLC), mass spectrometry (MS), and gas chromatography-mass spectrometry (GC-MS). These techniques are accurate and reliable but have certain limitations, such as the need for skilled personnel, long analysis times, interference, and the requirement for pretreatment steps. Moreover, these techniques are laboratory-bound, and samples are required in large amounts for analysis. In view of the above, new methods for the detection of endocrine disrupting compounds should be devised that promise high specificity, ultra-sensitivity, cost effectiveness, efficiency, and easy-to-operate procedures. Nowadays, electrochemical sensors/biosensors modified with nanomaterials are gaining high attention among researchers. The bioelement present in such a system makes the developed sensor selective towards the analyte of interest. Nanomaterials provide a large surface area, high electron communication features, enhanced catalytic activity, and possibilities for chemical modification. In many cases, nanomaterials also serve as an electron mediator or electrocatalyst for some analytes.

Keywords: electrochemical, endocrine disruptors, microscopy, nanoparticles, sensors

Procedia PDF Downloads 267
6407 Enhanced Calibration Map for a Four-Hole Probe for Measuring High Flow Angles

Authors: Jafar Mortadha, Imran Qureshi

Abstract:

This research explains and compares modern techniques for measuring the flow angles of a flowing fluid with the traditional technique of using multi-hole pressure probes. In particular, the focus of the study is on four-hole probes, which offer great reliability and benefits in several applications where the use of modern measurement techniques is either inconvenient or impractical. Due to modern advancements in manufacturing, small multi-hole pressure probes can be made with high precision, which eliminates the need to calibrate every manufactured probe. This study aims to improve the range of the calibration maps for a four-hole probe to allow high flow angles to be measured accurately. The research methodology comprises a literature review of calibration definitions that have been successfully implemented on five-hole probes. These definitions are then adapted and applied to a four-hole probe using a set of raw pressure data. A comparison of the different definitions is carried out in MATLAB, and the results are analyzed to determine the best calibration definition. Taking simplicity of implementation into account, as well as the reliability of flow angle estimation, an adapted technique from a research paper written in 2002 offered the most promising outcome. Consequently, the method is seen as a good enhancement for four-hole probes, and it can substitute for existing calibration definitions that offer less accuracy.

Keywords: calibration definitions, calibration maps, flow measurement techniques, four-hole probes, multi-hole pressure probes

Procedia PDF Downloads 282
6406 Optimal Classifying and Extracting Fuzzy Relationship from Query Using Text Mining Techniques

Authors: Faisal Alshuwaier, Ali Areshey

Abstract:

Text mining techniques are generally applied to classify text and to find fuzzy relations and structures in data sets. This research covers a range of text mining capabilities. One common application is text classification and event extraction, which encompasses deducing specific knowledge concerning incidents referred to in texts. The main contribution of this paper is the clarification of a concept graph generation mechanism based on text classification and optimal fuzzy relationship extraction. Furthermore, the work presented in this paper explains the application of fuzzy relationship extraction and the branch and bound method to simplify texts.
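The max-prod composition named in the keywords combines fuzzy relations R on X×Y and S on Y×Z into (R∘S)(x,z) = max over y of R(x,y)·S(y,z). A minimal sketch with illustrative membership values:

```python
def max_prod(R, S):
    """Max-prod composition of fuzzy relation matrices R (m x k) and S (k x n)."""
    return [[max(R[i][t] * S[t][j] for t in range(len(S)))
             for j in range(len(S[0]))]
            for i in range(len(R))]

# Illustrative memberships: term-to-concept (R) and concept-to-class (S) relations
R = [[0.8, 0.3],
     [0.2, 0.9]]
S = [[0.5, 1.0],
     [0.6, 0.4]]
print(max_prod(R, S))
```

For instance, the (1, 0) entry is max(0.2·0.5, 0.9·0.6) = 0.54: the strongest chain of memberships linking the second term to the first class.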

Keywords: extraction, max-prod, fuzzy relations, text mining, memberships, classification

Procedia PDF Downloads 564
6405 Analysis of Supply Chain Risk Management Strategies: Case Study of Supply Chain Disruptions

Authors: Marcelo Dias Carvalho, Leticia Ishikawa

Abstract:

Supply chain risk management refers to a set of strategies used by companies to avoid supply chain disruption caused by damage at production facilities, natural disasters, capacity issues, inventory problems, incorrect forecasts, and delays. Many companies use techniques from the Toyota Production System, which in some respects runs counter to better management of supply chain risks. This paper studies key events at some multinationals to analyze the trade-off between the best supply chain risk management techniques and management policies designed to create lean enterprises. The result of a good balance between these actions is a reduction in losses, increased customer trust in the company, and better preparedness to face the general risks of a supply chain.

Keywords: just in time, lean manufacturing, supply chain disruptions, supply chain management

Procedia PDF Downloads 328
6404 Automatic Detection of Proliferative Cells in Immunohistochemically Images of Meningioma Using Fuzzy C-Means Clustering and HSV Color Space

Authors: Vahid Anari, Mina Bakhshi

Abstract:

Visual search and identification of immunohistochemically stained meningioma tissue is performed manually in pathology laboratories to detect and diagnose this type of cancer. This task is tedious and time-consuming. Moreover, because of the cells' complex nature, it remains a challenging task to segment cells from the background and analyze them automatically. In this paper, we develop and test a computerized scheme that can automatically identify cells in microscopic images of meningioma and classify them into positive (proliferative) and negative (normal) cells. A dataset of 150 images is used to test the scheme. The scheme uses the fuzzy c-means algorithm as a color clustering method based on the perceptually uniform hue, saturation, value (HSV) color space. Since the cells are distinguishable by the human eye, the accuracy and stability of the algorithm are quantitatively compared through application to a wide variety of real images.
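The fuzzy c-means step can be sketched on one channel (e.g. hue) as alternating membership and centroid updates. This is a simplified one-dimensional sketch with illustrative values, not the scheme's actual implementation:

```python
def fcm_1d(data, m=2.0, iters=50):
    """Fuzzy c-means on scalar data (e.g. hue values) with two clusters:
    alternate membership and centroid updates; returns the sorted centers."""
    centers = [min(data), max(data)]  # deterministic initialization
    c = len(centers)
    n = len(data)
    for _ in range(iters):
        # membership u[i][j]: degree of point i in cluster j (fuzzifier m)
        u = []
        for x in data:
            d = [max(abs(x - v), 1e-12) for v in centers]
            u.append([1.0 / sum((d[j] / d[k]) ** (2 / (m - 1)) for k in range(c))
                      for j in range(c)])
        # centers: membership-weighted means
        centers = [sum((u[i][j] ** m) * data[i] for i in range(n)) /
                   sum(u[i][j] ** m for i in range(n))
                   for j in range(c)]
    return sorted(centers)

# Illustrative hue values: background near 0.1, stained (positive) nuclei near 0.8
hues = [0.08, 0.1, 0.12, 0.11, 0.78, 0.8, 0.82, 0.79]
print([round(v, 2) for v in fcm_1d(hues)])  # → [0.1, 0.8]
```

Pixels are then assigned to the positive or negative class according to their highest membership, mirroring the two-class segmentation described above.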

Keywords: positive cell, color segmentation, HSV color space, immunohistochemistry, meningioma, thresholding, fuzzy c-means

Procedia PDF Downloads 194
6403 Development of Evolutionary Algorithm by Combining Optimization and Imitation Approach for Machine Learning in Gaming

Authors: Rohit Mittal, Bright Keswani, Amit Mithal

Abstract:

This paper gives a sense of how computational intelligence techniques are used to develop computer games, especially car racing games. To convey a deeper understanding of the artificial intelligence involved, the paper is divided into sections on optimization, imitation, innovation, and the combined optimization-imitation approach. The paper is mainly concerned with the combined approach, which covers different aspects of using fitness measures and supervised learning techniques to imitate aspects of behavior. The main achievement of this paper is modelling player behaviour and evolving new game content, such as racing tracks, for a single car racing on a single track.

Keywords: evolution algorithm, genetic, optimization, imitation, racing, innovation, gaming

Procedia PDF Downloads 631
6402 An Automated Stock Investment System Using Machine Learning Techniques: An Application in Australia

Authors: Carol Anne Hargreaves

Abstract:

A key issue in stock investment is how to select representative features for stock selection. The first objective of this paper is to determine whether an automated stock investment system, using machine learning techniques, may be used to identify a portfolio of growth stocks that are highly likely to provide returns better than the stock market index. The second objective is to identify the technical features that best characterize whether a stock's price is likely to go up, and to identify the most important factors and their contribution to predicting the likelihood of the stock price going up. Unsupervised machine learning techniques, such as cluster analysis, were applied to the stock data to identify a cluster of stocks that was likely to go up in price (portfolio 1). Next, the principal component analysis technique was used to select stocks that were rated high on component one and component two (portfolio 2). Thirdly, a supervised machine learning technique, the logistic regression method, was used to select stocks with a high probability of their price going up (portfolio 3). The predictive models were validated with metrics such as sensitivity (recall), specificity, and overall accuracy; all accuracy measures were above 70%. All portfolios outperformed the market by more than eight times. The top three stocks were selected for each of the three stock portfolios and traded in the market for one month. After one month, the return for each stock portfolio was computed and compared with the stock market index returns. The returns were 23.87% for the principal component analysis portfolio, 11.65% for the logistic regression portfolio, and 8.88% for the K-means cluster portfolio, while the stock market return was 0.38%. This study confirms that an automated stock investment system using machine learning techniques can identify top-performing stock portfolios that outperform the stock market.
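The supervised step can be sketched as a plain logistic regression trained by gradient descent on technical features, then thresholded on the predicted probability of the price going up. The features, labels, and the 0.7 cutoff below are illustrative, not the study's data:

```python
import math

def train_logistic(X, y, lr=0.5, epochs=2000):
    """Plain gradient-descent logistic regression; returns weights (bias last)."""
    w = [0.0] * (len(X[0]) + 1)
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            z = sum(wj * xj for wj, xj in zip(w, xi)) + w[-1]
            p = 1 / (1 + math.exp(-z))
            g = p - yi                      # gradient of log-loss w.r.t. z
            for j in range(len(xi)):
                w[j] -= lr * g * xi[j]
            w[-1] -= lr * g
    return w

def prob_up(w, x):
    z = sum(wj * xj for wj, xj in zip(w, x)) + w[-1]
    return 1 / (1 + math.exp(-z))

# Toy technical features per stock: (momentum, volume change); label: price went up
X = [(0.9, 0.8), (0.8, 0.6), (0.7, 0.9), (0.2, 0.1), (0.1, 0.3), (0.3, 0.2)]
y = [1, 1, 1, 0, 0, 0]
w = train_logistic(X, y)
portfolio = [i for i, x in enumerate(X) if prob_up(w, x) > 0.7]
print(portfolio)  # → [0, 1, 2]
```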

Keywords: machine learning, stock market trading, logistic regression, cluster analysis, factor analysis, decision trees, neural networks, automated stock investment system

Procedia PDF Downloads 143
6401 Synthesis and Characterization of Hydroxyapatite from Biowaste for Potential Medical Application

Authors: M. D. H. Beg, John O. Akindoyo, Suriati Ghazali, Nitthiyah Jeyaratnam

Abstract:

Over time, several approaches have been undertaken to mitigate the challenges associated with bone regeneration. These include, but are not limited to, xenografts, allografts, and autografts, as well as artificial substitutes like bioceramics, synthetic cements, and metals. The former three techniques often come with peculiar limitations and problems, such as morbidity, limited availability, disease transmission, collateral site damage, or outright rejection by the body, as the case may be. Synthetic routes remain the only feasible alternative for the treatment of bone defects. Hydroxyapatite (HA) is very compatible and suitable for this application. However, most of the common methods for HA synthesis are expensive, complicated, or environmentally unfriendly. Interestingly, extraction of HA from bio-waste is perceived not only to be cost-effective but also environmentally friendly. In this research, HA was synthesized from a bio-waste, namely bovine bones, through three different methods: hydrothermal chemical processes, ultrasound-assisted synthesis, and ordinary calcination techniques. Structure and property analysis of the HA was carried out using different characterization techniques such as TGA, FTIR, and XRD. All the methods applied were able to produce HA with compositional properties similar to the biomaterials found in human calcified tissues. The calcination process was, however, observed to be more efficient, as it eliminated all the organic components from the produced HA. The HA synthesized is notable for its minimal cost and environmental friendliness, and is considered suitable for tissue and bone engineering applications.

Keywords: hydroxyapatite, bone, calcination, biowaste

Procedia PDF Downloads 232
6400 Rail-To-Rail Output Op-Amp Design with Negative Miller Capacitance Compensation

Authors: Muhaned Zaidi, Ian Grout, Abu Khari bin A’ain

Abstract:

In this paper, a two-stage op-amp design is considered using both Miller and negative Miller compensation techniques. The first op-amp design uses Miller compensation around the second amplification stage, whilst the second design uses negative Miller compensation around the first stage and Miller compensation around the second amplification stage. The aims of this work were to compare the gain and phase margins obtained using the different compensation techniques and to assess the ability to choose either compensation technique based on a particular set of design requirements. The two op-amp designs are based on the same two-stage rail-to-rail output CMOS op-amp architecture, where the first stage consists of differential input and cascode circuits, and the second stage is a class AB amplifier. The op-amps have been designed using a 0.35 µm CMOS fabrication process.
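The effect both designs exploit follows from the Miller transformation: a capacitor C bridging a stage with voltage gain A reflects a capacitance C(1 - A) to the stage input. Across an inverting stage (A < 0) the input capacitance is multiplied up; across a non-inverting gain A > 1 it becomes negative, which is what lets negative Miller compensation cancel parasitic capacitance. Illustrative values:

```python
def miller_input_cap(c, gain):
    """Capacitance reflected to the stage input by a bridging capacitor c: c * (1 - A)."""
    return c * (1 - gain)

# capacitances in pF
print(miller_input_cap(1.0, -10))  # → 11.0  (inverting stage: Miller multiplication)
print(miller_input_cap(1.0, 2))    # → -1.0  (non-inverting A = 2: negative Miller)
```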

Keywords: op-amp, rail-to-rail output, Miller compensation, negative Miller capacitance

Procedia PDF Downloads 326
6399 Data Centers’ Temperature Profile Simulation Optimized by Finite Elements and Discretization Methods

Authors: José Alberto García Fernández, Zhimin Du, Xinqiao Jin

Abstract:

Nowadays, the data center industry faces strong challenges in increasing speed and data processing capacity while keeping devices at a suitable working temperature without penalizing that capacity. Consequently, the cooling systems of these facilities use a large amount of energy to dissipate the heat generated inside the servers, and developing new cooling techniques, or perfecting existing ones, would be a great advance for this industry. Installing a matrix of temperature sensors distributed through the structure of each server would provide the data required to obtain an instantaneous temperature profile inside it. However, the number of temperature probes required to obtain temperature profiles with sufficient accuracy is very high, and they are expensive. Therefore, other less intrusive techniques are employed, in which each point characterizing the server temperature profile is obtained by solving differential equations through simulation, simplifying data collection but increasing the time needed to obtain results. In order to reduce these calculation times, complicated and slow computational fluid dynamics simulations are replaced by simpler and faster finite element method simulations, which solve the Burgers' equations by backward, forward, and central discretization techniques after simplifying the energy and enthalpy conservation differential equations. The discretization methods employed for solving the first- and second-order derivatives of the Burgers' equation obtained after these simplifications are the key to obtaining results with greater or lesser accuracy, depending on the characteristic truncation error.
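A minimal sketch of the discretization idea: one explicit time step of the 1-D viscous Burgers' equation u_t + u·u_x = ν·u_xx, with a backward (upwind) difference for the convective term and a central difference for the diffusive term. The grid, time step, and viscosity are illustrative, not the paper's configuration:

```python
def burgers_step(u, dx, dt, nu):
    """One explicit step of the 1-D viscous Burgers' equation, using a
    backward (upwind) difference for u_x and a central difference for u_xx."""
    n = len(u)
    new = u[:]  # boundary values are held fixed
    for i in range(1, n - 1):
        ux = (u[i] - u[i - 1]) / dx                        # backward difference
        uxx = (u[i + 1] - 2 * u[i] + u[i - 1]) / dx ** 2   # central difference
        new[i] = u[i] + dt * (-u[i] * ux + nu * uxx)
    return new

# Illustrative: a smoothed step steepening and diffusing into a front
dx, dt, nu = 0.05, 0.01, 0.05          # chosen so dt*nu/dx^2 = 0.2 (stable)
u = [1.0 if i * dx < 0.5 else 0.0 for i in range(21)]
for _ in range(20):
    u = burgers_step(u, dx, dt, nu)
print(round(u[10], 3))
```

Swapping the backward difference for a forward or central one changes the truncation error and stability of the scheme, which is exactly the trade-off the abstract discusses.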

Keywords: Burgers' equations, CFD simulation, data center, discretization methods, FEM simulation, temperature profile

Procedia PDF Downloads 153
6398 Determination of Complexity Level in Merged Irregular Transposition Cipher

Authors: Okike Benjamin, Garba Ejd

Abstract:

Today, it has been observed that the security of information along the information superhighway is often compromised by those who are not authorized to access it. To ensure its security, such information should be encrypted to conceal its real meaning. There are many encryption techniques on the market; however, some of them are easily decrypted by adversaries. The researcher has developed an encryption technique that may be more difficult to decrypt. This is achieved by splitting the message to be encrypted into parts, encrypting each part separately, and swapping the positions of the parts before transmitting the message along the superhighway. The method is termed the Merged Irregular Transposition Cipher. The research also determines the complexity level with respect to the number of splits of the message.
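Since the cipher itself is only outlined above, the following is a hedged illustration of the general idea (split the message, transpose each part with its own key, then swap the part order); the split points, keys, and swap order are hypothetical:

```python
def transpose(part, key):
    """Columnar-style transposition: reorder characters by the key permutation."""
    return "".join(part[k] for k in key)

def encrypt(msg, splits, keys, swap):
    """Split msg at the given indices, transpose each part with its own key,
    then emit the parts in the swapped order."""
    parts = [msg[a:b] for a, b in zip([0] + splits, splits + [len(msg)])]
    enc = [transpose(p, k) for p, k in zip(parts, keys)]
    return "".join(enc[i] for i in swap)

msg = "ATTACKATDAWN"
splits = [4, 8]                                        # three parts of four characters
keys = [[2, 0, 3, 1], [1, 3, 0, 2], [3, 2, 1, 0]]      # one permutation per part
swap = [2, 0, 1]                                       # transmit order: 3rd, 1st, 2nd
print(encrypt(msg, splits, keys, swap))  # → NWADTAATKTCA
```

More splits mean more per-part keys and more possible swap orders, which is the complexity-versus-splits relationship the abstract sets out to quantify.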

Keywords: transposition cipher, merged irregular cipher, encryption, complexity level

Procedia PDF Downloads 332
6397 Molecular Basis for Amyloid Inhibition by L-Dopa: Implication towards Systemic Amyloidosis

Authors: Rizwan H. Khan, Saima Nusrat

Abstract:

Although amyloid-associated neurodegenerative diseases and non-neuropathic systemic amyloidoses have allured research endeavors, no curative drugs have been proclaimed up till now, except for symptomatic cures. Therapeutic compounds which can diminish or disaggregate such toxic oligomers and fibrillar species have been examined, and more are on their way. In the present study, we report an extensive biophysical, microscopic and computational study revealing that L-3,4-dihydroxyphenylalanine (L-Dopa) possesses undeniable potency to inhibit heat-induced human lysozyme (HL) amyloid fibrillation and also retains fibril-disaggregating potential. L-Dopa interferes in the amyloid fibrillogenesis process by interacting hydrophobically and by forming hydrogen bonds with the amino acid residues found in the amyloid-prone region of HL, as elucidated by molecular docking results. L-Dopa also disaggregates mature amyloid fibrils into unorganised species. Thus, L-Dopa and related compounds can serve as promising inhibitors for therapeutic development against systemic amyloidosis.

Keywords: amyloids, disaggregation, human lysozyme, molecular docking

Procedia PDF Downloads 314
6396 Hybrid Model: An Integration of Machine Learning with Traditional Scorecards

Authors: Golnush Masghati-Amoli, Paul Chin

Abstract:

Over recent years, with the rapid increases in data availability and computing power, Machine Learning (ML) techniques have been called on in a range of different industries for their strong predictive capability. However, the use of Machine Learning in commercial banking has been limited by a special challenge imposed by numerous regulations that require lenders to be able to explain their analytic models, not only to regulators but often to consumers. In other words, although Machine Learning techniques enable better prediction with a higher level of accuracy, in comparison with other industries they are adopted less frequently in commercial banking, especially for scoring purposes. This is because Machine Learning techniques are often considered a black box and fail to provide information on why a certain risk score is given to a customer. In order to bridge this gap between the explainability and performance of Machine Learning techniques, a Hybrid Model has been developed at Dun & Bradstreet that blends Machine Learning algorithms with traditional approaches such as scorecards. The Hybrid Model maximizes the efficiency of traditional scorecards by merging their practical benefits, such as explainability and the ability to input domain knowledge, with the deep insights of Machine Learning techniques, which can uncover patterns scorecard approaches cannot. First, through the development of Machine Learning models, engineered features, latent variables, and feature interactions that demonstrate high information value in the prediction of customer risk are identified. Then, these features are employed to introduce observed non-linear relationships between the explanatory and dependent variables into traditional scorecards.
Moreover, instead of directly computing the Weight of Evidence (WoE) from good and bad data points, the Hybrid Model tries to match the score distribution generated by a Machine Learning algorithm, which ends up providing an estimate of the WoE for each bin. This capability helps to build powerful scorecards for sparse cases that cannot be handled with traditional approaches. The proposed Hybrid Model is tested on different portfolios where a significant gap is observed between the performance of traditional scorecards and Machine Learning models. The result of the analysis shows that the Hybrid Model can improve the performance of traditional scorecards by introducing non-linear relationships between explanatory and target variables from Machine Learning models into traditional scorecards. It is also observed that in some scenarios the Hybrid Model can be almost as predictive as the Machine Learning techniques while being as transparent as traditional scorecards. Therefore, it is concluded that, with the use of the Hybrid Model, Machine Learning algorithms can be used in the commercial banking industry without concern about the difficulty of explaining the models for regulatory purposes.
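The WoE contrast above can be sketched as follows. This is a minimal illustration, not Dun & Bradstreet's implementation: the first helper computes the classic count-based WoE (with smoothing so sparse bins do not blow up), and the second shows the hybrid-style alternative of deriving a WoE-like log-odds per bin from an ML model's mean predicted bad probability.

```python
import numpy as np

def woe_from_counts(goods, bads, eps=0.5):
    """Classic per-bin Weight of Evidence: ln(%good / %bad),
    with small additive smoothing for sparse bins."""
    goods = np.asarray(goods, float) + eps
    bads = np.asarray(bads, float) + eps
    return np.log((goods / goods.sum()) / (bads / bads.sum()))

def woe_from_model_scores(bin_mean_prob):
    """Hybrid-style sketch: a WoE-like log-odds per bin taken from
    the mean predicted bad probability of an ML model, instead of
    raw good/bad counts -- usable even when a bin has few cases."""
    p = np.asarray(bin_mean_prob, float)
    return np.log((1 - p) / p)
```

The model-derived variant is what allows a bin with very few observed defaults to still receive a stable WoE estimate, which is the sparse-case advantage the abstract describes.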

Keywords: machine learning algorithms, scorecard, commercial banking, consumer risk, feature engineering

Procedia PDF Downloads 122
6395 A Nanofibrous PHBV Tube with Schwann Cells as Artificial Nerve Graft Contributing to Rat Sciatic Nerve Regeneration across a 30-mm Defect Bridge

Authors: Esmaeil Biazar

Abstract:

A nanofibrous PHBV nerve conduit has been used to evaluate its efficiency in promoting nerve regeneration in rats. The designed conduits were investigated by physical, mechanical and microscopic analyses and implanted into a 30-mm gap in the sciatic nerves of the rats. Four months after surgery, the regenerated nerves were evaluated by macroscopic assessments and histology. This polymeric conduit had sufficiently high mechanical properties to serve as a nerve guide. The results demonstrated that in the nanofibrous graft with cells, the sciatic nerve trunk had been reconstructed, with restoration of nerve continuity and formation of myelinated nerve fibers. For the grafts, especially the nanofibrous conduits with cells, the muscle cells of the gastrocnemius on the operated side were uniform in size and structure. This study proves the feasibility of an artificial conduit with Schwann cells for nerve regeneration by bridging a longer defect in a rat model.

Keywords: sciatic regeneration, Schwann cell, artificial conduit, nanofibrous PHBV, histological assessments

Procedia PDF Downloads 313
6394 The Study of Strength and Weakness Points of Various Techniques for Calculating the Volume of Done Work in Civil Projects

Authors: Ali Fazeli Moslehabadi

Abstract:

One of the topics discussed in civil projects during execution, where continuous change of work volumes is usually characteristic, is how to calculate the volume of done work. The difference between the volumes announced by the execution unit and the volumes estimated by the technical office unit has a direct effect on the announced progress of the project. This issue can make the progress of the project appear greater or smaller than its actual value, thereby misleading stakeholders and project managers. This article introduces some practical methods for calculating the volume of done work in civil projects and then reviews the strengths and weaknesses of each of them, in order to resolve these contradictions and conflicts.

Keywords: technical skills, systemic skills, communication skills, done work volume calculation techniques

Procedia PDF Downloads 148
6393 The Influence of Temperature on the Corrosion and Corrosion Inhibition of Steel in Hydrochloric Acid Solution: Thermodynamic Study

Authors: Fatimah Al-Hayazi, Ehteram A. Noor, Aisha H. Moubaraki

Abstract:

The inhibitive effect of Securigera securidaca seed extract (SSE) on mild steel corrosion in 1 M HCl solution has been studied by weight loss and electrochemical techniques at four different temperatures. All techniques provided data showing that the studied extract performs well at all temperatures and that its inhibitory action increases with increasing concentration. SEM images indicate thin-film formation on mild steel corroded in solutions containing 1 g L⁻¹ of inhibitor at both low and high temperatures. The polarization studies showed that SSE acts as an anodic inhibitor. Both polarization and impedance techniques show accelerating behaviour for SSE at concentrations ≤ 0.1 g L⁻¹ at all temperatures. At concentrations ≥ 0.1 g L⁻¹, the efficiency of SSE increases dramatically with increasing concentration, and its value does not change appreciably with increasing temperature. It was found that all adsorption data obeyed the Temkin adsorption isotherm. Kinetic activation and thermodynamic adsorption parameters were evaluated and discussed. The results revealed an endothermic corrosion process with an associative activation mechanism, and a comprehensive adsorption mechanism for SSE on the mild steel surface is suggested, in which both physical and chemical adsorption are involved. A good correlation between the inhibitor constituents and their inhibitory action was obtained.
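The thermodynamic treatment described above can be sketched numerically. The snippet below is a minimal illustration with made-up coverage values (not the paper's data) of how a Temkin isotherm fit yields the adsorption equilibrium constant and the standard free energy of adsorption; the Temkin form makes surface coverage theta linear in ln(C).

```python
import numpy as np

# Illustrative values only (not the paper's measurements): surface
# coverage theta (from inhibition efficiency) at SSE concentrations
# C in g/L. Temkin isotherm: theta = (RT/b)*ln(K) + (RT/b)*ln(C),
# i.e. theta is linear in ln(C).
C = np.array([0.1, 0.2, 0.5, 1.0, 2.0])
theta = np.array([0.55, 0.63, 0.74, 0.82, 0.90])

slope, intercept = np.polyfit(np.log(C), theta, 1)
K_ads = np.exp(intercept / slope)          # adsorption equilibrium constant
# Standard free energy of adsorption at 25 C; 55.5 is the molar
# concentration of water in solution (mol/L).
dG = -8.314 * 298 * np.log(55.5 * K_ads)   # J/mol
```

A negative dG indicates spontaneous adsorption; its magnitude is conventionally used to distinguish physisorption from chemisorption, which is how a mixed physical/chemical mechanism like the one suggested in the abstract is argued.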

Keywords: corrosion, inhibition of steel, hydrochloric acid, thermodynamic study

Procedia PDF Downloads 87
6392 Aristotelian Techniques of Communication Used by Current Affairs Talk Shows in Pakistan for Creating Dramatic Effect to Trigger Emotional Relevance

Authors: Shazia Anwer

Abstract:

Current TV talk shows, especially those on domestic politics in Pakistan, follow the Aristotelian techniques, including deductive reasoning, the three modes of persuasion, and guidelines for communication. The application of “approximate truth” is also seen when talk show presenters create doubts about political personalities or national issues. The mainstream media of Pakistan, a key carrier of narrative construction serving the primary function of national consensus on regional and extended public diplomacy, is failing this purpose. This paper highlights the Aristotelian communication methodology, its purposes, its limitations for serious discussion, and its connection to the mistrust among the Pakistani population regarding fake or embedded, funded information. Data have been collected from three Pakistani TV talk shows, and their analysis has been made by applying the Aristotelian communication method to highlight the core issues. The paper also elaborates that current media education fails to provide transparent techniques to train future journalists for meaningful, thought-provoking discussion. For this reason, the paper gives an overview of the HEC (Higher Education Commission) graduate-level mass communication syllabus for Pakistani universities. The ideas of ethos, logos, and pathos are the main components of TV talk shows, and as a result the educated audience is losing trust in the mainstream media, which eventually generates feelings of distrust and betrayal in society, because productions resemble the genre of drama instead of facts and analysis; thus the line between current affairs shows and infotainment has become blurred. In the last section, practical implications for improving meaningfulness and transparency in TV talk shows are suggested by replacing the Aristotelian communication method with a cognitive semiotic communication approach.

Keywords: Aristotelian techniques of communication, current affairs talk shows, drama, Pakistan

Procedia PDF Downloads 192
6391 CAD Tool for Parametric Design Modification of Yacht Hull Surface Models

Authors: Shahroz Khan, Erkan Gunpinar, Kemal Mart

Abstract:

Recently, parametric design techniques have become a vital concept in the field of Computer Aided Design (CAD), providing a sophisticated platform for the designer to automate the design process efficiently. In these techniques, the design process starts by parameterizing the important features of the design model (typically its key dimensions) and implementing design constraints. The design constraints help to retain the overall shape of the model while its parameters are modified. However, initializing an appropriate number of design parameters and constraints is the crucial part of parametric design techniques, especially for complex surface models such as yacht hulls. This paper introduces a method to create complex surface models amenable to parametric design techniques, a method to define the right number of parameters and their respective design constraints, and a system to implement design parameters in accordance with the design constraint schema. In our proposed approach, the design process starts by dividing the yacht hull into three sections. Each section consists of different shape lines, which together form the overall shape of the yacht hull. The shape lines are created using cubic Bézier curves, which allow large design flexibility. Design parameters and constraints are defined on the shape lines in 3D design space to give designers better, individual handling of the parameters. Afterwards, shape modifiers are developed, which allow the modification of each parameter while satisfying the respective set of criteria and design constraints: geometric continuity should be maintained between the shape lines of the three sections, fairness of the hull surfaces should be preserved after modification, and during design modification the effect of a single parameter on the other parameters should be negligible.
The constraints are defined individually on the shape lines of each section and mutually between the shape lines of two connecting sections. In order to validate and visualize the design results of our shape modifiers, a real-time graphic interface was created.
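The shape-line machinery can be sketched as follows. This is a minimal illustration, not the authors' implementation: a cubic Bézier evaluator for a shape line, plus one hypothetical constraint helper that preserves G1 (tangent-direction) continuity at the joint between the shape lines of two connecting sections by repositioning the next curve's first inner control point.

```python
import numpy as np

def cubic_bezier(p0, p1, p2, p3, t):
    """Evaluate a cubic Bezier shape line at parameters t in [0, 1]."""
    t = np.asarray(t, float)[:, None]
    return ((1 - t) ** 3 * p0 + 3 * (1 - t) ** 2 * t * p1
            + 3 * (1 - t) * t ** 2 * p2 + t ** 3 * p3)

def enforce_g1(prev_p2, joint, next_p1_dist):
    """Constraint sketch: keep G1 continuity at the joint between two
    sections by placing the next curve's first inner control point on
    the ray (prev_p2 -> joint), at a chosen distance past the joint."""
    d = joint - prev_p2
    d = d / np.linalg.norm(d)
    return joint + next_p1_dist * d
```

A shape modifier in this spirit would move a free control point to hit a target dimension and then call constraint helpers like `enforce_g1` so the neighbouring section's curve is updated consistently, keeping a single parameter's effect on the other parameters negligible.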

Keywords: design parameters, design constraints, shape modifiers, yacht hull

Procedia PDF Downloads 289
6390 Acoustic Echo Cancellation Using Different Adaptive Algorithms

Authors: Hamid Sharif, Nazish Saleem Abbas, Muhammad Haris Jamil

Abstract:

An adaptive filter is a filter that self-adjusts its transfer function according to an optimization algorithm driven by an error signal. Because of the complexity of the optimization algorithms, most adaptive filters are digital filters. Adaptive filtering constitutes one of the core technologies in digital signal processing and finds numerous application areas in science as well as in industry, including adaptive noise cancellation and echo cancellation. Acoustic echo is a common occurrence in today's telecommunication systems; the signal interference it causes is distracting to both users and reduces the quality of the communication. In this paper, we review different adaptive filtering techniques for reducing this unwanted echo and examine the behavior of the Least Mean Square (LMS), Normalized Least Mean Square (NLMS), Variable Step-Size Least Mean Square (VSLMS), Variable Step-Size Normalized Least Mean Square (VSNLMS), New Varying Step Size LMS (NVSSLMS) and Recursive Least Square (RLS) algorithms in reducing the echo and increasing communication quality.
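The cancellation loop shared by these algorithms can be sketched with the NLMS variant. This is a minimal illustration, not the paper's implementation, and the signal names are placeholders: an adaptive FIR filter models the echo path from the far-end (loudspeaker) signal, and subtracting its output from the microphone signal yields the echo-reduced error signal that also drives the weight update.

```python
import numpy as np

def nlms_echo_cancel(far_end, mic, n_taps=16, mu=0.5, eps=1e-6):
    """NLMS acoustic echo cancellation sketch.

    far_end : loudspeaker (reference) signal
    mic     : microphone signal containing the acoustic echo
    Returns the echo-cancelled error signal and the final weights."""
    w = np.zeros(n_taps)       # adaptive FIR estimate of the echo path
    x_buf = np.zeros(n_taps)   # most recent far-end samples
    err = np.zeros(len(mic))
    for n in range(len(mic)):
        x_buf = np.roll(x_buf, 1)
        x_buf[0] = far_end[n]
        y = w @ x_buf                  # estimated echo
        err[n] = mic[n] - y            # echo-cancelled output
        # normalized step size makes convergence robust to input power
        w += mu * err[n] * x_buf / (x_buf @ x_buf + eps)
    return err, w
```

The LMS variant omits the power normalization in the update; the variable-step-size variants replace the fixed `mu` with a schedule, and RLS replaces the gradient update entirely with a recursive least-squares solution.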

Keywords: adaptive acoustic, echo cancellation, LMS algorithm, adaptive filter, normalized least mean square (NLMS), variable step-size least mean square (VSLMS)

Procedia PDF Downloads 67
6389 Flow Visualization in Biological Complex Geometries for Personalized Medicine

Authors: Carlos Escobar-del Pozo, César Ahumada-Monroy, Azael García-Rebolledo, Alberto Brambila-Solórzano, Gregorio Martínez-Sánchez, Luis Ortiz-Rincón

Abstract:

Numerical simulations of flow in complex biological structures have gained considerable attention in recent years; however, the major issue is the validation of the results. The present work shows a Particle Image Velocimetry (PIV) flow visualization technique in complex biological structures, particularly in intracranial aneurysms. A methodology to reconstruct and generate a transparent model has been developed, as well as visualization and particle tracking techniques. The generated transparent models allow the flow patterns to be visualized with a regular camera. The final goal is to use visualization as a tool to provide more information for treatment and surgery decisions in aneurysms.

Keywords: aneurysms, PIV, flow visualization, particle tracking

Procedia PDF Downloads 76