Search results for: geographic techniques
6786 Digital Art Fabric Prints: Procedure, Process and Progress
Authors: Tripti Singh
Abstract:
Digital tools are merging the boundaries between mediums as artists explore new areas. Digital fabric printing has motivated artists to create prints by combining images acquired by photography, scanning, computer graphics and microscopy, to name a few, with traditional media such as hand drawing, weaving, hand-printed patterns and printmaking techniques. It has opened a whole new world of possibilities for artists to search, research and combine old and contemporary mediums for unique art prints. As an artistic medium, digital art fabrics have aesthetic value that influences not only a personality but also the interior of a living or work space; they can be worn as a fashion statement and used as interior decoration. Digital art fabric printing makes it possible to print almost anything on any fabric with long-lasting print quality. Single and limited editions are possible, maintaining the scarcity and uniqueness of an art form. These fabric prints meet today's needs, as they are eco-friendly and produce less waste than traditional fabric printing techniques. They can be used to make unique and customized curtains, quilts, clothes, bags, furniture, dolls, pillows, framed artwork, costumes, banners and much more. This paper explores the procedure, process, and progress of digital art fabric printing in depth with suitable pictorial examples.
Keywords: digital art, fabric prints, digital fabric prints, new media
Procedia PDF Downloads 515
6785 Measuring Housing Quality Using Geographic Information System (GIS)
Authors: Silvija Šiljeg, Ante Šiljeg, Ivan Marić
Abstract:
Housing quality is measured at both objective and subjective levels using different indicators. In this research, 5 urban and housing indicators formed from 58 variables across different housing domains were used. The aims of the research were to measure housing quality using a GIS approach and to detect critical points of housing, using the Croatian coastal town of Zadar as an example. GIS was used in the research to generate housing quality index models by standardisation and aggregation of variables and to examine the accuracy of the housing quality index model. The accuracy analysis was carried out on the variable describing the availability of educational facilities. By defining weighting coefficients and using different GIS methods, zones of high, middle and low housing quality were determined. The results obtained can be of use to town planners, spatial planners and town authorities in the process of generating decisions, guidelines, and spatial interventions.
Keywords: housing quality, GIS, housing quality index, indicators, models of housing quality
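A minimal sketch of the index-building step described in the abstract (standardising variables, applying weighting coefficients and aggregating them into a composite index, then classifying zones); the variable names, weights and values are illustrative assumptions, not data from the study:

```python
import numpy as np

# Illustrative spatial units (rows) and indicators (columns).
values = np.array([
    [420.0, 0.8, 12.0],   # e.g. distance to school (m), green area share, noise index
    [150.0, 0.3, 30.0],
    [800.0, 0.6,  8.0],
])
direction = np.array([-1, 1, -1])      # +1 if larger is better, -1 if smaller is better (assumed)
weights = np.array([0.4, 0.35, 0.25])  # assumed weighting coefficients, summing to 1

# Standardise each indicator to z-scores, flip sign where smaller is better, aggregate.
z = (values - values.mean(axis=0)) / values.std(axis=0)
index = (z * direction * weights).sum(axis=1)

# Classify into low / middle / high housing quality zones by terciles of the index.
low, high = np.quantile(index, [1 / 3, 2 / 3])
zones = np.where(index < low, "low", np.where(index < high, "middle", "high"))
print(index.round(3), zones)
```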
Procedia PDF Downloads 300
6784 KAP Study on Breast Cancer Among Women in Nirmala Educational Institutions-A Prospective Observational Study
Authors: Shaik Asha Begum, S. Joshna Rani, Shaik Abdul Rahaman
Abstract:
INTRODUCTION: Breast cancer is a disease that develops in breast cells. A "KAP" study estimates the Knowledge, Attitude, and Practices of a community. More than 1.5 million women (25% of all women with cancer) are diagnosed with breast cancer every year throughout the world. Understanding the levels of Knowledge, Attitude and Practice enables a more effective awareness-raising process, as it allows the program to be tailored more appropriately to the needs of the community. OBJECTIVES: The objectives of this study are to assess knowledge of the signs and symptoms and risk factors of breast cancer, to raise awareness of the practice of early detection techniques, and to provide knowledge of breast cancer overall, including preventive techniques. METHODOLOGY: This is a descriptive cross-sectional study. The KAP survey was conducted in the Nirmala Educational Institutions from January to April 2021. A total of 300 participants were included, comprising women pharmacy students and lecturers as well as graduates from outside pharmacy. The questionnaires were taken from the BCAM (Breast Cancer Awareness Measure) toolkit (Version 2). RESULT: According to the findings of the study, the majority of the participants were not well informed about breast cancer. A lump in the breast was the most commonly mentioned sign of breast cancer, followed by pain in the breast or nipple. Knowledge of breast cancer risk factors was also very low. The correct answers for breast cancer risk factors were radiation exposure (58.20 percent), a positive family history (47.6 percent), obesity (46.9 percent), a lack of physical activity (43.6 percent), and smoking (43.2 percent). Breast cancer screening, on the other hand, was uncommon (only 30 percent and 11.3 percent practiced clinical breast examination and mammography, respectively). CONCLUSION: In this study, pharmacy graduates had more knowledge of the signs, symptoms and risk factors of breast cancer than non-pharmacy graduates, but both groups had poor knowledge of preventive techniques and early detection tools. After the awareness program, pharmacy and non-pharmacy graduates gained supportive knowledge of preventive techniques and also practiced early detection techniques for breast cancer.
Keywords: breast cancer, mammography, KAP study, early detection
Procedia PDF Downloads 138
6783 Kinetic and Mechanistic Study on the Degradation of Typical Pharmaceutical and Personal Care Products in Water by Using Carbon Nanodots/C₃N₄ Composite and Ultrasonic Irradiation
Authors: Miao Yang
Abstract:
PPCPs (pharmaceutical and personal care products) in water are environmental pollutants of increasing concern. Techniques for the degradation of PPCPs have therefore become a hotspot in the water pollution control field. Since common degradation techniques have several disadvantages, such as low degradation efficiency for certain PPCPs (ibuprofen and carbamazepine), this proposal adopts a combined technique using a CDs (carbon nanodots)/C₃N₄ composite and ultrasonic irradiation to mitigate or overcome these shortcomings. A significant open scientific problem in the study of PPCP degradation is that the mechanism involving PPCPs, the major reactants, and the interfacial active sites is not yet clear. This work aims to solve this problem using both theoretical and experimental methodologies. Firstly, optimized parameters will be obtained by evaluating the kinetics and oxidation efficiency under different conditions. The competition between H₂O₂ and PPCPs for HO• will be elucidated, after which the degradation mechanism of PPCPs by the synergy of the CDs/C₃N₄ composite and ultrasonic irradiation will be proposed. Finally, a sonolysis-adsorption-catalysis coupling mechanism will be established, providing the theoretical basis and technical support for developing new, efficient degradation techniques for PPCPs in the future.
Keywords: carbon nanodots/C₃N₄, pharmaceutical and personal care products, ultrasonic irradiation, hydroxyl radical, heterogeneous catalysis
Procedia PDF Downloads 180
6782 An Application of Modified M-out-of-N Bootstrap Method to Heavy-Tailed Distributions
Authors: Hannah F. Opayinka, Adedayo A. Adepoju
Abstract:
This study is an extension of a prior study on the modification of the existing m-out-of-n (moon) bootstrap method for heavy-tailed distributions in which modified m-out-of-n (mmoon) was proposed as an alternative method to the existing moon technique. In this study, both moon and mmoon techniques were applied to two real income datasets which followed Lognormal and Pareto distributions respectively with finite variances. The performances of these two techniques were compared using Standard Error (SE) and Root Mean Square Error (RMSE). The findings showed that mmoon outperformed moon bootstrap in terms of smaller SEs and RMSEs for all the sample sizes considered in the two datasets.
Keywords: bootstrap, income data, lognormal distribution, Pareto distribution
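A minimal sketch of how an m-out-of-n bootstrap standard error can be computed and compared with the full-n bootstrap on a heavy-tailed sample; the synthetic data, the mean as the statistic and the choice m = n^0.75 are illustrative assumptions (the paper's mmoon modification is not reproduced here):

```python
import numpy as np

rng = np.random.default_rng(0)
income = rng.pareto(a=2.5, size=500) * 1_000      # illustrative heavy-tailed income sample

def bootstrap_se(x, m, B=2000, stat=np.mean):
    """Standard error of `stat` over B resamples of size m drawn with replacement from x."""
    n = len(x)
    estimates = np.empty(B)
    for b in range(B):
        idx = rng.integers(0, n, size=m)          # resample m observations out of n
        estimates[b] = stat(x[idx])
    return estimates.std(ddof=1)

n = len(income)
se_moon = bootstrap_se(income, m=int(n ** 0.75))  # m-out-of-n ("moon") resample size, assumed
se_full = bootstrap_se(income, m=n)               # classical n-out-of-n bootstrap
print(f"moon SE: {se_moon:.2f}, full bootstrap SE: {se_full:.2f}")
```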
Procedia PDF Downloads 186
6781 Integration of a Self-Cooling Photobioreactor to Building Envelop
Authors: Amin Mirabbasi
Abstract:
This review focuses on the integration of self-cooling photobioreactors into building envelopes as an approach to sustainable architecture. We emphasize the urgency for eco-friendly design advancements and explore the incorporation of plants, particularly microalgae photobioreactors, into building facades. This entails a discussion of the building envelope's components and definition, challenges posed by algal technology in architecture, and adaptations for varied structures such as skyscrapers, residences, and townhouses. We further evaluate the influence of geographic factors, with a spotlight on warm and temperate regions like Western Australia. In conclusion, we analyse the cost-effectiveness and practicality of this integration, focusing on its potential application in the upcoming Harry Butler Science Centre building. Through comprehensive literature scrutiny, we aim to shed light on the prospects and obstacles of embedding self-cooling photobioreactors in pursuit of an eco-aware architectural future.
Keywords: microalgae photobioreactors, building envelope, sustainable architecture, eco-friendly design advancements
Procedia PDF Downloads 64
6780 Innovative Acoustic Emission Techniques for Concrete Health Monitoring
Authors: Rahmat Ali, Beenish Khan, Aftabullah, Abid A. Shah
Abstract:
This research investigates a wide range of events recorded by acoustic emission (AE) sensors on concrete cubes subjected to different loading and unloading stress conditions. A total of 27 specimens were prepared and tested: 18 cubic (6”x6”x6”) and nine cylindrical (4”x8”) specimens were molded from three batches of concrete with w/c ratios of 0.40, 0.50, and 0.60. The compressive strength of the concrete was determined from the cylinder specimens. The deterioration of the concrete was evaluated using the occurrence of the Felicity and Kaiser effects at each stress condition. It was found that the number of acoustic emission hits usually increases as damage increases. Additionally, the correlation between the AE measurements and the applied load was determined by plotting the normalized values. The influence of w/c on the sensitivity of the AE technique in detecting concrete damage was also investigated.
Keywords: acoustic emission, concrete, felicity ratio, sensors
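A small sketch of the Felicity-ratio check used to quantify the Kaiser and Felicity effects mentioned above: the ratio of the load at which AE resumes on reloading to the previous maximum load, where values below 1 indicate accumulated damage. The load values are illustrative assumptions:

```python
def felicity_ratio(load_at_ae_onset: float, previous_max_load: float) -> float:
    """Felicity ratio = load where AE restarts on reloading / previous maximum load.
    A ratio >= 1 corresponds to the Kaiser effect (no AE until the old peak is exceeded);
    a ratio < 1 (Felicity effect) indicates accumulated damage."""
    return load_at_ae_onset / previous_max_load

# Illustrative loading cycles (kN): previous peak load vs. load at which AE resumed.
cycles = [(50.0, 52.0), (80.0, 74.0), (110.0, 88.0)]
for peak, onset in cycles:
    fr = felicity_ratio(onset, peak)
    print(f"peak {peak:.0f} kN -> Felicity ratio {fr:.2f} ({'damage' if fr < 1 else 'intact'})")
```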
Procedia PDF Downloads 362
6779 Recommender Systems Using Ensemble Techniques
Authors: Yeonjeong Lee, Kyoung-jae Kim, Youngtae Kim
Abstract:
This study proposes a novel recommender system that uses data mining and multi-model ensemble techniques to enhance recommendation performance by reflecting the user’s precise preferences. The proposed model consists of two steps. In the first step, this study uses logistic regression, decision trees, and artificial neural networks to predict customers who have a high likelihood of purchasing products in each product group. Then, this study combines the results of each predictor using multi-model ensemble techniques such as bagging and bumping. In the second step, this study uses market basket analysis to extract association rules for co-purchased products. Finally, the system selects customers who have a high likelihood of purchasing products in each product group and, through the above two steps, recommends appropriate products from the same or different product groups to them. We test the usability of the proposed system using a prototype and real-world transaction and profile data. In addition, we survey user satisfaction with the recommended product list from the proposed system and with randomly selected product lists. The results also show that the proposed system may be useful in a real-world online shopping store.
Keywords: product recommender system, ensemble technique, association rules, decision tree, artificial neural networks
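A minimal sketch of the two-step idea described above: step one averages ("bags") the purchase-likelihood predictions of a logistic regression, a decision tree and a small neural network; step two derives simple co-purchase association rules from transactions. The synthetic features, transactions and thresholds are illustrative assumptions, not the paper's dataset:

```python
from itertools import combinations
from collections import Counter
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(1)
X = rng.normal(size=(400, 5))                      # illustrative customer profile features
y = (X[:, 0] + 0.5 * X[:, 2] + rng.normal(scale=0.5, size=400) > 0).astype(int)

# Step 1: multi-model ensemble - average the predicted purchase probabilities.
models = [LogisticRegression(max_iter=1000), DecisionTreeClassifier(max_depth=4),
          MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000)]
probs = np.mean([m.fit(X, y).predict_proba(X)[:, 1] for m in models], axis=0)
likely_buyers = np.where(probs > 0.7)[0]           # customers likely to purchase in this group

# Step 2: market basket analysis - simple co-purchase rules from transaction lists.
transactions = [{"milk", "bread"}, {"milk", "bread", "eggs"}, {"bread", "eggs"}, {"milk", "eggs"}]
pair_counts = Counter(frozenset(p) for t in transactions for p in combinations(sorted(t), 2))
rules = {pair: cnt / len(transactions) for pair, cnt in pair_counts.items()
         if cnt / len(transactions) >= 0.5}         # keep pairs with support >= 0.5
print(len(likely_buyers), "likely buyers;", "co-purchase rules:", rules)
```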
Procedia PDF Downloads 294
6778 An Energy Transfer Fluorescent Probe System for Glucose Sensor at Biomimetic Membrane Surface
Authors: Hoa Thi Hoang, Stephan Sass, Michael U. Kumke
Abstract:
Concanavalin A (conA) is a protein that has been widely used in sensor systems based on its specific binding to α-D-glucose or α-D-mannose. Glucose sensors using conA typically rely on fluorescence-based techniques, either intensity-based or lifetime-based. In this research, liposomes made from phospholipids were used as a biomimetic membrane system. In a first step, novel building blocks containing perylene-labeled glucose units were added to the system and used to decorate the surface of the liposomes. Upon binding of rhodamine-labeled conA to the glucose units at the biomimetic membrane surface, a Förster resonance energy transfer (FRET) system is formed which combines the unique fluorescence properties of perylene (e.g., high fluorescence quantum yield, no triplet formation) and its high hydrophobicity for efficient anchoring in membranes, forming a novel probe for the investigation of sugar-driven binding reactions at biomimetic surfaces. Two glucose-labeled perylene derivatives were synthesized with different spacer lengths between the perylene and glucose units in order to probe the binding of conA. The binding interaction was fully characterized by using high-end fluorescence techniques. Steady-state and time-resolved fluorescence techniques (e.g., fluorescence depolarization) in combination with single-molecule fluorescence spectroscopy techniques (fluorescence correlation spectroscopy, FCS) were used to monitor the interaction with conA. Based on the fluorescence depolarization, the rotational correlation times and the alteration in the diffusion coefficient (determined by FCS), the binding of conA to the liposomes carrying the probe was studied. Moreover, single-pair FRET experiments using pulsed interleaved excitation are used to characterize the binding of conA to the liposomes in detail on a single-molecule level, avoiding averaging-out effects.
Keywords: concanavalin A, FRET, sensor, biomimetic membrane
Procedia PDF Downloads 307
6777 Role of Natural Language Processing in Information Retrieval; Challenges and Opportunities
Authors: Khaled M. Alhawiti
Abstract:
This paper aims to analyze the role of natural language processing (NLP). It discusses this role in the context of automated data retrieval, automated question answering, and text structuring. NLP techniques are gaining wider acceptance in real-life applications and industrial concerns. There are various complexities involved in processing natural language text to satisfy the needs of decision makers. This paper begins with a description of the qualities of NLP practices. The paper then focuses on the challenges in natural language processing. The paper also discusses major techniques of NLP. The last section describes opportunities and challenges for future research.
Keywords: data retrieval, information retrieval, natural language processing, text structuring
Procedia PDF Downloads 341
6776 Urban Analysis of the Old City of Oran and Its Building after an Earthquake
Authors: A. Zatir, A. Mokhtari, A. Foufa, S. Zatir
Abstract:
The city of Oran, like other regions of northern Algeria, is subject to frequent seismic activity. The study presented in this work is based on an analysis of the urban and architectural context of the city of Oran before the earthquake of 1790, and then attempts to deduce the differences between the old city before and after the earthquake. A specific objective of the analysis is to trace the seismic history of the city of Oran in parallel with its urban history. The example of the citadel of Oran indicates that the constructions on the site of the old citadel may present elements of resistance to seismic effects. In-situ observations of these structures showed the ingenuity of the techniques used by the ancient builders, including the good performance of domes and arches in resisting seismic forces.
Keywords: earthquake, citadel, performance, traditional techniques, constructions
Procedia PDF Downloads 305
6775 Molecular Epidemiologic Distribution of HDV Genotypes among Different Ethnic Groups in Iran: A Systematic Review
Authors: Khabat Barkhordari
Abstract:
Hepatitis delta virus (HDV) is an RNA virus that needs the functions of hepatitis B virus (HBV) for its propagation and assembly. HDV infection can occur simultaneously with HBV infection and cause acute hepatitis, or develop as a secondary infection in patients already suffering from HBV. Based on genome sequence analysis, HDV has several genotypes which show broad geographic distribution and diverse clinical features. The aim of the current study is to determine the molecular epidemiology of hepatitis delta virus genotypes in HBsAg-positive patients among different ethnic groups of Iran. This systematic review summarizes the results of studies which examined 2000 Iranian patients with HBV infection from 2010 to 2015. Among the 2000 patients, 16.75% carried anti-HDV antibody, and HDV RNA was found in only 1.75% of cases. All positive cases had genotype I.
Keywords: HDV, genotype, epidemiology, distribution
Procedia PDF Downloads 275
6774 Nanomaterial Based Electrochemical Sensors for Endocrine Disrupting Compounds
Authors: Gaurav Bhanjana, Ganga Ram Chaudhary, Sandeep Kumar, Neeraj Dilbaghi
Abstract:
The main sources of endocrine disrupting compounds in the ecosystem are hormones, pesticides, phthalates, flame retardants, dioxins, personal-care products, coplanar polychlorinated biphenyls (PCBs), bisphenol A, and parabens. These endocrine disrupting compounds are responsible for learning disabilities, brain development problems, deformations of the body, cancer, reproductive abnormalities in females and decreased sperm count in human males. Although the discharge of these chemical compounds into the environment cannot be stopped, their amount can be reduced through proper evaluation and detection techniques. The available techniques for determination of these endocrine disrupting compounds mainly include high performance liquid chromatography (HPLC), mass spectrometry (MS) and gas chromatography-mass spectrometry (GC–MS). These techniques are accurate and reliable but have certain limitations, such as the need for skilled personnel, long analysis times, interference and the requirement of pretreatment steps. Moreover, these techniques are laboratory-bound and require large sample amounts for analysis. In view of the above facts, new methods for detection of endocrine disrupting compounds should be devised that promise high specificity, ultra-sensitivity, cost-effectiveness, efficiency and easy-to-operate procedures. Nowadays, electrochemical sensors/biosensors modified with nanomaterials are gaining high attention among researchers. The biological element present in such systems makes the developed sensors selective towards the analyte of interest. Nanomaterials provide a large surface area, high electron communication capability, enhanced catalytic activity and possibilities for chemical modification. In most cases, nanomaterials also serve as an electron mediator or electrocatalyst for some analytes.
Keywords: electrochemical, endocrine disruptors, microscopy, nanoparticles, sensors
Procedia PDF Downloads 273
6773 Enhanced Calibration Map for a Four-Hole Probe for Measuring High Flow Angles
Authors: Jafar Mortadha, Imran Qureshi
Abstract:
This research explains and compares the modern techniques used for measuring the flow angles of a flowing fluid with the traditional technique of using multi-hole pressure probes. In particular, the focus of the study is on four-hole probes, which offer great reliability and benefits in several applications where the use of modern measurement techniques is either inconvenient or impractical. Due to modern advancements in manufacturing, small multi-hole pressure probes can be made with high precision, which eliminates the need for calibrating every manufactured probe. This study aims to improve the range of calibration maps for a four-hole probe to allow high flow angles to be measured accurately. The research methodology comprises a literature review of the successful calibration definitions that have been implemented on five-hole probes. These definitions are then adapted and applied on a four-hole probe using a set of raw pressures data. A comparison of the different definitions will be carried out in Matlab and the results will be analyzed to determine the best calibration definition. Taking simplicity of implementation into account as well as the reliability of flow angles estimation, an adapted technique from a research paper written in 2002 offered the most promising outcome. Consequently, the method is seen as a good enhancement for four-hole probes and it can substitute for the existing calibration definitions that offer less accuracy.
Keywords: calibration definitions, calibration maps, flow measurement techniques, four-hole probes, multi-hole pressure probes
Procedia PDF Downloads 295
6772 Optimal Classifying and Extracting Fuzzy Relationship from Query Using Text Mining Techniques
Authors: Faisal Alshuwaier, Ali Areshey
Abstract:
Text mining techniques are generally applied to classify text and to find fuzzy relations and structures in data sets. This research provides a range of text mining capabilities. One common application is text classification and event extraction, which encompasses deducing specific knowledge concerning incidents referred to in texts. The main contribution of this paper is the clarification of a concept graph generation mechanism, which is based on text classification and optimal fuzzy relationship extraction. Furthermore, the work presented in this paper explains the application of fuzzy relationship extraction and the branch and bound method to simplify the texts.
Keywords: extraction, max-prod, fuzzy relations, text mining, memberships, classification
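The keyword list mentions max-prod composition of fuzzy relations; a minimal sketch of that operation on two small membership matrices follows. The matrices themselves are illustrative assumptions, not data from the paper:

```python
import numpy as np

def max_prod(R: np.ndarray, S: np.ndarray) -> np.ndarray:
    """Max-product composition (R o S)[i][k] = max_j R[i][j] * S[j][k]
    for fuzzy relations given as membership matrices."""
    return np.max(R[:, :, None] * S[None, :, :], axis=1)

# Illustrative memberships: terms-to-concepts (R) and concepts-to-classes (S).
R = np.array([[0.8, 0.3],
              [0.2, 0.9]])
S = np.array([[0.7, 0.1],
              [0.4, 0.6]])
print(max_prod(R, S))   # fuzzy relation from terms directly to classes
```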
Procedia PDF Downloads 582
6771 Analysis of Supply Chain Risk Management Strategies: Case Study of Supply Chain Disruptions
Authors: Marcelo Dias Carvalho, Leticia Ishikawa
Abstract:
Supply Chain Risk Management refers to a set of strategies used by companies to avoid supply chain disruption caused by damage at production facilities, natural disasters, capacity issues, inventory problems, incorrect forecasts, and delays. Many companies use the techniques of the Toyota Production System, which in a way goes against a better management of supply chain risks. This paper studies key events in some multinationals to analyze the trade-off between the best supply chain risk management techniques and management policies designed to create lean enterprises. The result of a good balance of these actions is the reduction of losses, increased customer trust in the company and better preparedness to face the general risks of a supply chain.
Keywords: just in time, lean manufacturing, supply chain disruptions, supply chain management
Procedia PDF Downloads 338
6770 Development of Evolutionary Algorithm by Combining Optimization and Imitation Approach for Machine Learning in Gaming
Authors: Rohit Mittal, Bright Keswani, Amit Mithal
Abstract:
This paper gives a sense of how computational intelligence techniques are used to develop computer games, especially car racing. To build a deeper understanding of the artificial intelligence involved, the paper is divided into sections on optimization, imitation, innovation, and a combined approach of optimization and imitation. The paper is mainly concerned with the combined approach, which covers different aspects of using fitness measures and supervised learning techniques to imitate aspects of behavior. The main achievement of this paper is based on modelling player behaviour and evolving new game content, such as racing tracks, for single-car racing on a single track.
Keywords: evolution algorithm, genetic, optimization, imitation, racing, innovation, gaming
Procedia PDF Downloads 646
6769 Metabolomics Fingerprinting Analysis of Melastoma malabathricum L. Leaf of Geographical Variation Using HPLC-DAD Combined with Chemometric Tools
Authors: Dian Mayasari, Yosi Bayu Murti, Sylvia Utami Tunjung Pratiwi, Sudarsono
Abstract:
Melastoma malabathricum L. is an Indo-Pacific herb that has been traditionally used to treat several ailments such as wounds, dysentery, diarrhea, toothache, and diabetes. This plant is common across tropical Indo-Pacific archipelagos and is tolerant of a range of soils, from low-lying areas subject to saltwater inundation to the salt-free conditions of mountain slopes. How soil and environmental variation influence secondary metabolite production in the herb, and an understanding of the plant’s utility as traditional medicine, remain largely unknown and unexplored. The objective of this study is to evaluate the variability of the metabolic profiles of M. malabathricum L. across its geographic distribution. High-performance liquid chromatography with diode array detection (HPLC-DAD), a well-established, simple, sensitive, and reliable method, was employed to establish the chemical fingerprints of 72 samples of M. malabathricum L. leaves from various geographical locations in Indonesia. Specimens collected from six terrestrial and archipelago regions of Indonesia were analyzed by HPLC to generate chromatogram peak profiles that could be compared across each region. Data corresponding to the common peak areas of the HPLC chromatographic fingerprints were analyzed by hierarchical cluster analysis (HCA) and principal component analysis (PCA) to extract information on the most significant variables contributing to the characterization and classification of the analyzed samples. The first two principal components, PC1 and PC2, accounted for 41.14% and 19.32% of the variance, respectively. Based on variety and origin, the HPLC method validated the chemical fingerprint results used to screen the in vitro antioxidant activity of M. malabathricum L. The results show that the developed method has potential value for assessing the quality of similar M. malabathricum L. samples. These findings provide a pathway for the development and utilization of references for the identification of M. malabathricum L. Our results indicate the importance of considering geographic distribution during field-collection efforts, as they demonstrate regional variation in secondary metabolites of M. malabathricum L., as illustrated by HPLC chromatogram peaks and their antioxidant activities. The results also confirm the utility of this simple approach to a rapid evaluation of metabolic variation between plants and their potential ethnobotanical properties, potentially due to the environments from which they were collected. This information will facilitate the optimization of growth conditions to suit particular medicinal qualities.
Keywords: fingerprint, high performance liquid chromatography, Melastoma malabathricum L., metabolic profiles, principal component analysis
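A minimal sketch of the chemometric step described above: standardising common HPLC peak areas and projecting the samples onto the first two principal components. The peak-area matrix and region labels are illustrative assumptions, not the study's data:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# Rows = leaf samples, columns = common chromatogram peak areas (illustrative values).
peak_areas = np.array([
    [120.0, 45.0, 230.0, 15.0],
    [110.0, 50.0, 210.0, 18.0],
    [300.0, 20.0,  90.0, 60.0],
    [280.0, 25.0, 100.0, 55.0],
])
regions = ["island_A", "island_A", "terrestrial_B", "terrestrial_B"]

X = StandardScaler().fit_transform(peak_areas)   # autoscale each peak before PCA
pca = PCA(n_components=2)
scores = pca.fit_transform(X)

for region, (pc1, pc2) in zip(regions, scores):
    print(f"{region}: PC1={pc1:+.2f}, PC2={pc2:+.2f}")
print("explained variance ratio:", pca.explained_variance_ratio_.round(3))
```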
Procedia PDF Downloads 162
6768 An Automated Stock Investment System Using Machine Learning Techniques: An Application in Australia
Authors: Carol Anne Hargreaves
Abstract:
A key issue in stock investment is how to select representative features for stock selection. The objective of this paper is to firstly determine whether an automated stock investment system, using machine learning techniques, may be used to identify a portfolio of growth stocks that are highly likely to provide returns better than the stock market index. The second objective is to identify the technical features that best characterize whether a stock’s price is likely to go up and to identify the most important factors and their contribution to predicting the likelihood of the stock price going up. Unsupervised machine learning techniques, such as cluster analysis, were applied to the stock data to identify a cluster of stocks that was likely to go up in price – portfolio 1. Next, the principal component analysis technique was used to select stocks that were rated high on component one and component two – portfolio 2. Thirdly, a supervised machine learning technique, the logistic regression method, was used to select stocks with a high probability of their price going up – portfolio 3. The predictive models were validated with metrics such as sensitivity (recall), specificity and overall accuracy for all models. All accuracy measures were above 70%. All portfolios outperformed the market by more than eight times. The top three stocks were selected for each of the three stock portfolios and traded in the market for one month. After one month, the return for each stock portfolio was computed and compared with the stock market index returns. The returns for the three stock portfolios were 23.87% for the principal component analysis portfolio, 11.65% for the logistic regression portfolio and 8.88% for the K-means cluster portfolio, while the stock market returned 0.38%. This study confirms that an automated stock investment system using machine learning techniques can identify top performing stock portfolios that outperform the stock market.
Keywords: machine learning, stock market trading, logistic regression, cluster analysis, factor analysis, decision trees, neural networks, automated stock investment system
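A minimal sketch of the three portfolio-construction routes described above (k-means clustering, ranking on the first two principal components, and logistic-regression probability of the price going up); the feature matrix and labels are randomly generated placeholders, not market data, and the thresholds are illustrative assumptions:

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(7)
features = rng.normal(size=(200, 6))                  # technical features per stock (placeholder)
went_up = (features[:, 0] + 0.4 * features[:, 3] + rng.normal(scale=0.5, size=200) > 0).astype(int)
X = StandardScaler().fit_transform(features)

# Portfolio 1: cluster the stocks and keep the cluster with the highest historical up-rate.
clusters = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)
best_cluster = max(range(3), key=lambda c: went_up[clusters == c].mean())
portfolio1 = np.where(clusters == best_cluster)[0]

# Portfolio 2: rank stocks by their combined score on the first two principal components.
scores = PCA(n_components=2).fit_transform(X)
portfolio2 = np.argsort(-(scores[:, 0] + scores[:, 1]))[:10]

# Portfolio 3: keep stocks with a high predicted probability of the price going up.
probs = LogisticRegression(max_iter=1000).fit(X, went_up).predict_proba(X)[:, 1]
portfolio3 = np.where(probs > 0.7)[0]

print(len(portfolio1), portfolio2[:3], len(portfolio3))
```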
Procedia PDF Downloads 157
6767 Synthesis and Characterization of Hydroxyapatite from Biowaste for Potential Medical Application
Authors: M. D. H. Beg, John O. Akindoyo, Suriati Ghazali, Nitthiyah Jeyaratnam
Abstract:
Over time, several approaches have been undertaken to mitigate the challenges associated with bone regeneration. These include, but are not limited to, xenografts, allografts and autografts, as well as artificial substitutes like bioceramics, synthetic cements and metals. The former three techniques often come with particular limitations and problems such as morbidity, availability, disease transmission, collateral site damage or outright rejection by the body, as the case may be. Synthetic routes remain the only feasible alternative option for the treatment of bone defects. Hydroxyapatite (HA) is very compatible and suitable for this application. However, most of the common methods for HA synthesis are either expensive, complicated or environmentally unfriendly. Interestingly, extraction of HA from bio-waste has been perceived not only to be cost-effective, but also environmentally friendly. In this research, HA was synthesized from bio-waste, namely bovine bones, through three different methods: hydrothermal chemical processing, ultrasound-assisted synthesis and ordinary calcination. Structure and property analysis of the HA was carried out using different characterization techniques such as TGA, FTIR, and XRD. All the methods applied were able to produce HA with compositional properties similar to the biomaterials found in human calcified tissues. The calcination process was, however, observed to be more efficient, as it eliminated all the organic components from the produced HA. The HA synthesized is notable for its minimal cost and environmental friendliness. It is also perceived to be suitable for tissue and bone engineering applications.
Keywords: hydroxyapatite, bone, calcination, biowaste
Procedia PDF Downloads 249
6766 Rail-To-Rail Output Op-Amp Design with Negative Miller Capacitance Compensation
Authors: Muhaned Zaidi, Ian Grout, Abu Khari bin A’ain
Abstract:
In this paper, a two-stage op-amp design is considered using both Miller and negative Miller compensation techniques. The first op-amp design uses Miller compensation around the second amplification stage, whilst the second op-amp design uses negative Miller compensation around the first stage and Miller compensation around the second amplification stage. The aims of this work were to compare the gain and phase margins obtained using the different compensation techniques and to identify the ability to choose either compensation technique based on a particular set of design requirements. The two op-amp designs created are based on the same two-stage rail-to-rail output CMOS op-amp architecture, where the first stage of the op-amp consists of differential input and cascode circuits, and the second stage is a class AB amplifier. The op-amps have been designed using a 0.35 µm CMOS fabrication process.
Keywords: op-amp, rail-to-rail output, Miller compensation, negative Miller capacitance
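A short reminder of the standard textbook Miller-multiplication relation that underlies both compensation schemes (not taken from the paper): a capacitance C_c connected across a stage of voltage gain A_v is reflected to the input as C_c(1 - A_v), so an inverting stage multiplies the capacitance while a non-inverting ("negative Miller") connection reduces or cancels it.

```latex
C_{in} = C_c\,(1 - A_v), \qquad
\begin{cases}
A_v = -|A| \ \text{(inverting stage, Miller)}: & C_{in} = C_c\,(1 + |A|) \gg C_c,\\
A_v > 1 \ \text{(negative Miller connection)}: & C_{in} = C_c\,(1 - A_v) < 0 .
\end{cases}
```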
Procedia PDF Downloads 338
6765 Image Processing techniques for Surveillance in Outdoor Environment
Authors: Jayanth C., Anirudh Sai Yetikuri, Kavitha S. N.
Abstract:
This paper explores the development and application of computer vision and machine learning techniques for real-time pose detection, facial recognition, and number plate extraction. Utilizing MediaPipe for pose estimation, the research presents methods for detecting hand raises and ducking postures through real-time video analysis. Complementarily, facial recognition is employed to compare and verify individual identities using the face recognition library. Additionally, the paper demonstrates a robust approach for extracting and storing vehicle number plates from images, integrating Optical Character Recognition (OCR) with a database management system. The study highlights the effectiveness and versatility of these technologies in practical scenarios, including security and surveillance applications. The findings underscore the potential of combining computer vision techniques to address diverse challenges and enhance automated systems for both individual and vehicular identification. This research contributes to the fields of computer vision and machine learning by providing scalable solutions and demonstrating their applicability in real-world contexts.
Keywords: computer vision, pose detection, facial recognition, number plate extraction, machine learning, real-time analysis, OCR, database management
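A condensed sketch of the three components described above, using the libraries the abstract names (MediaPipe for pose, the face_recognition package, and OCR via pytesseract); the wrist-above-nose rule for a hand raise, the file paths, and the assumption that the plate region is already cropped are illustrative simplifications, not the paper's exact pipeline:

```python
import cv2
import mediapipe as mp
import face_recognition
import pytesseract

# Pose: flag a raised hand when a wrist landmark sits above the nose (image y grows downward).
def hand_raised(image_bgr) -> bool:
    with mp.solutions.pose.Pose(static_image_mode=True) as pose:
        results = pose.process(cv2.cvtColor(image_bgr, cv2.COLOR_BGR2RGB))
        if not results.pose_landmarks:
            return False
        lm = results.pose_landmarks.landmark
        P = mp.solutions.pose.PoseLandmark
        return min(lm[P.LEFT_WRIST].y, lm[P.RIGHT_WRIST].y) < lm[P.NOSE].y

# Face verification: compare a probe face against a known face encoding.
def same_person(known_path: str, probe_path: str) -> bool:
    known = face_recognition.face_encodings(face_recognition.load_image_file(known_path))[0]
    probe = face_recognition.face_encodings(face_recognition.load_image_file(probe_path))[0]
    return face_recognition.compare_faces([known], probe)[0]

# Number plate text: grayscale the (already localized) plate crop and run OCR on it.
def plate_text(plate_crop_bgr) -> str:
    gray = cv2.cvtColor(plate_crop_bgr, cv2.COLOR_BGR2GRAY)
    return pytesseract.image_to_string(gray).strip()
```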
Procedia PDF Downloads 26
6764 Data Centers’ Temperature Profile Simulation Optimized by Finite Elements and Discretization Methods
Authors: José Alberto García Fernández, Zhimin Du, Xinqiao Jin
Abstract:
Nowadays, the data center industry faces strong challenges in increasing speed and data processing capacity while trying to keep its devices at a suitable working temperature without penalizing that capacity. Consequently, the cooling systems of these facilities use a large amount of energy to dissipate the heat generated inside the servers, and developing new cooling techniques or perfecting existing ones would be a great advance for this type of industry. The installation of a temperature sensor matrix distributed in the structure of each server would provide the data required to obtain an instantaneous temperature profile inside them. However, the number of temperature probes required to obtain the temperature profiles with sufficient accuracy is very high, and such an installation is expensive. Therefore, other less intrusive techniques are employed, in which each point that characterizes the server temperature profile is obtained by solving differential equations through simulation methods, simplifying data collection but increasing the time needed to obtain results. In order to reduce these calculation times, complicated and slow computational fluid dynamics simulations are replaced by simpler and faster finite element method simulations, which solve the Burgers’ equations by backward, forward and central discretization techniques after simplifying the energy and enthalpy conservation differential equations. The discretization methods employed for solving the first and second order derivatives of the resulting Burgers’ equation are the key to obtaining results with greater or lesser accuracy, according to the characteristic truncation error of each scheme.
Keywords: Burgers' equations, CFD simulation, data center, discretization methods, FEM simulation, temperature profile
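A minimal sketch of the discretization idea mentioned above: one explicit time step of the 1D viscous Burgers' equation u_t + u u_x = ν u_xx, with a backward (upwind), forward or central difference for the first derivative and a central difference for the second. The grid size, time step and initial profile are illustrative assumptions, not the paper's server model:

```python
import numpy as np

def burgers_step(u, dx, dt, nu, scheme="backward"):
    """One explicit time step of u_t + u*u_x = nu*u_xx on a periodic 1D grid."""
    if scheme == "backward":            # upwind-style first derivative
        ux = (u - np.roll(u, 1)) / dx
    elif scheme == "forward":
        ux = (np.roll(u, -1) - u) / dx
    else:                               # central difference
        ux = (np.roll(u, -1) - np.roll(u, 1)) / (2 * dx)
    uxx = (np.roll(u, -1) - 2 * u + np.roll(u, 1)) / dx**2   # central second derivative
    return u + dt * (-u * ux + nu * uxx)

# Illustrative setup: sine profile on [0, 2*pi); dt chosen for stability of the explicit scheme.
n, nu = 200, 0.05
x = np.linspace(0.0, 2 * np.pi, n, endpoint=False)
dx = x[1] - x[0]
dt = 0.2 * dx**2 / nu
u = np.sin(x)
for _ in range(500):
    u = burgers_step(u, dx, dt, nu, scheme="backward")
print("max |u| after 500 steps:", float(np.abs(u).max()))
```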
Procedia PDF Downloads 169
6763 Determination of Complexity Level in Merged Irregular Transposition Cipher
Authors: Okike Benjamin, Garba Ejd
Abstract:
Today, it has been observed that the security of information along the superhighway is often compromised by those who are not authorized to have access to such information. In order to ensure the security of information along the superhighway, such information should be encrypted by some means to conceal its real meaning. There are many encryption techniques available. However, some of these encryption techniques are easily decrypted by adversaries. The researcher has decided to develop an encryption technique that may be more difficult to decrypt. This may be achieved by splitting the message to be encrypted into parts, encrypting each part separately and swapping their positions before transmitting the message along the superhighway. The method is termed the Merged Irregular Transposition Cipher. The research also determines the complexity level with respect to the number of splits of the message.
Keywords: transposition cipher, merged irregular cipher, encryption, complexity level
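A minimal sketch of the idea as described in the abstract: the plaintext is split into parts, each part is transposed with its own column key, and the encrypted parts are emitted in a swapped order before transmission. The key format, padding character and part sizes are illustrative assumptions, not the author's exact scheme:

```python
def transpose(block: str, key: list[int]) -> str:
    """Columnar transposition of one block: pad to a multiple of the key length,
    then read the columns in the order given by the key."""
    k = len(key)
    block = block.ljust(-(-len(block) // k) * k, "X")       # pad with 'X' (assumed padding)
    rows = [block[i:i + k] for i in range(0, len(block), k)]
    return "".join("".join(row[c] for row in rows) for c in key)

def merged_irregular_encrypt(message: str, part_keys: list[list[int]], order: list[int]) -> str:
    """Split the message into len(part_keys) parts, transpose each part with its own key,
    then output the encrypted parts in the swapped order."""
    n_parts = len(part_keys)
    size = -(-len(message) // n_parts)
    parts = [message[i * size:(i + 1) * size] for i in range(n_parts)]
    encrypted = [transpose(p, k) for p, k in zip(parts, part_keys)]
    return "".join(encrypted[i] for i in order)

ciphertext = merged_irregular_encrypt(
    "INFORMATION SECURITY ON THE SUPERHIGHWAY",
    part_keys=[[2, 0, 1], [1, 3, 0, 2], [0, 2, 1]],          # one column key per part (assumed)
    order=[1, 2, 0],                                          # swapped positions of the parts
)
print(ciphertext)
```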
Procedia PDF Downloads 344
6762 Hybrid Model: An Integration of Machine Learning with Traditional Scorecards
Authors: Golnush Masghati-Amoli, Paul Chin
Abstract:
Over recent years, with the rapid increase in data availability and computing power, Machine Learning (ML) techniques have been called on in a range of different industries for their strong predictive capability. However, the use of Machine Learning in commercial banking has been limited due to a special challenge imposed by numerous regulations that require lenders to be able to explain their analytic models, not only to regulators but often to consumers. In other words, although Machine Learning techniques enable better prediction with a higher level of accuracy, in comparison with other industries they are adopted less frequently in commercial banking, especially for scoring purposes. This is because Machine Learning techniques are often considered a black box and fail to provide information on why a certain risk score is given to a customer. In order to bridge this gap between the explainability and performance of Machine Learning techniques, a Hybrid Model is developed at Dun and Bradstreet that is focused on blending Machine Learning algorithms with traditional approaches such as scorecards. The Hybrid Model maximizes the efficiency of traditional scorecards by merging their practical benefits, such as explainability and the ability to input domain knowledge, with the deep insights of Machine Learning techniques, which can uncover patterns scorecard approaches cannot. First, through the development of Machine Learning models, engineered features, latent variables and feature interactions that demonstrate high information value in the prediction of customer risk are identified. Then, these features are employed to introduce observed non-linear relationships between the explanatory and dependent variables into traditional scorecards. Moreover, instead of directly computing the Weight of Evidence (WoE) from good and bad data points, the Hybrid Model tries to match the score distribution generated by a Machine Learning algorithm, which ends up providing an estimate of the WoE for each bin. This capability helps to build powerful scorecards from sparse cases, which cannot be achieved with traditional approaches. The proposed Hybrid Model is tested on different portfolios where a significant gap is observed between the performance of traditional scorecards and Machine Learning models. The results of the analysis show that the Hybrid Model can improve the performance of traditional scorecards by introducing non-linear relationships between explanatory and target variables from Machine Learning models into traditional scorecards. It is also observed that in some scenarios the Hybrid Model can be almost as predictive as the Machine Learning techniques while being as transparent as traditional scorecards. Therefore, it is concluded that, with the use of the Hybrid Model, Machine Learning algorithms can be used in the commercial banking industry without concern about the difficulties of explaining the models for regulatory purposes.
Keywords: machine learning algorithms, scorecard, commercial banking, consumer risk, feature engineering
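A small sketch of the Weight of Evidence quantity the abstract refers to, computed the traditional way from good/bad counts per bin; the bin counts are illustrative assumptions, and the paper's score-distribution-matching variant is not reproduced here:

```python
import math

# Illustrative bins of a scorecard characteristic: (goods, bads) per bin.
bins = {"low": (120, 40), "medium": (300, 45), "high": (480, 15)}

total_good = sum(g for g, _ in bins.values())
total_bad = sum(b for _, b in bins.values())

for name, (good, bad) in bins.items():
    # WoE = ln( share of goods in the bin / share of bads in the bin );
    # positive values mean the bin is "good-heavy" relative to the portfolio.
    woe = math.log((good / total_good) / (bad / total_bad))
    print(f"{name:6s} WoE = {woe:+.3f}")
```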
Procedia PDF Downloads 134
6761 The Study of Strength and Weakness Points of Various Techniques for Calculating the Volume of Done Work in Civil Projects
Authors: Ali Fazeli Moslehabadi
Abstract:
One of the topics discussed in civil projects during execution, where the continuous change of work volumes is a usual characteristic, is how to calculate the volume of completed work. The difference between the volumes announced by the execution unit and the volumes estimated by the technical office unit has a direct effect on the announced progress of the project. This issue can make the progress of the project appear greater or smaller than its actual value, thereby misleading stakeholders and project managers. This article introduces some practical techniques for calculating the volume of completed work in civil projects. It then reviews the strengths and weaknesses of each of them, in order to resolve these contradictions and conflicts.
Keywords: technical skills, systemic skills, communication skills, done work volume calculation techniques
Procedia PDF Downloads 157
6760 The Influence of Temperature on the Corrosion and Corrosion Inhibition of Steel in Hydrochloric Acid Solution: Thermodynamic Study
Authors: Fatimah Al-Hayazi, Ehteram. A. Noor, Aisha H. Moubaraki
Abstract:
The inhibitive effect of Securigera securidaca seed extract (SSE) on mild steel corrosion in 1 M HCl solution has been studied by weight loss and electrochemical techniques at four different temperatures. All the techniques studied indicated that the extract performs well at all temperatures and that its inhibitory action increases with increasing concentration. SEM images indicate thin-film formation on mild steel corroded in solutions containing 1 g L-1 of inhibitor, at both low and high temperatures. The polarization studies showed that SSE acts as an anodic inhibitor. Both polarization and impedance techniques show accelerating behaviour for SSE at concentrations ≤ 0.1 g L-1 at all temperatures. At concentrations ≥ 0.1 g L-1, the efficiency of SSE increases dramatically with increasing concentration, and its value does not change appreciably with increasing temperature. It was found that all adsorption data obeyed the Temkin adsorption isotherm. Kinetic activation and thermodynamic adsorption parameters are evaluated and discussed. The results revealed an endothermic corrosion process with an associative activation mechanism, while a comprehensive adsorption mechanism for SSE on mild steel surfaces is suggested, in which both physical and chemical adsorption are involved in the adsorption process. A good correlation between the inhibitor constituents and their inhibitory action was obtained.
Keywords: corrosion, inhibition of steel, hydrochloric acid, thermodynamic study
Procedia PDF Downloads 100
6759 Aristotelian Techniques of Communication Used by Current Affairs Talk Shows in Pakistan for Creating Dramatic Effect to Trigger Emotional Relevance
Authors: Shazia Anwer
Abstract:
Current TV talk shows, especially those on domestic politics in Pakistan, follow Aristotelian techniques, including deductive reasoning, the three modes of persuasion, and guidelines for communication. The application of "approximate truth" is also seen when talk show presenters create doubts about political personalities or national issues. The mainstream media of Pakistan, as a key carrier of narrative construction for the sake of national consensus on regional issues and extended public diplomacy, is failing in this purpose. This paper highlights the Aristotelian communication methodology, its purposes and its limitations for serious discussion, and its connection to the mistrust among the Pakistani population regarding fake or embedded, funded information. Data have been collected from 3 Pakistani TV talk shows and analysed by applying the Aristotelian communication method to highlight the core issues. The paper also elaborates that current media education is impaired in providing transparent techniques to train future journalists for meaningful, thought-provoking discussion. For this reason, the paper gives an overview of the HEC (Higher Education Commission) graduate-level Mass Communication syllabus for Pakistani universities. The ideas of ethos, logos, and pathos are the main components of TV talk shows; as a result, the educated audience is losing trust in the mainstream media, which eventually generates feelings of distrust and betrayal in society, because productions resemble the genre of drama instead of facts and analysis, and thus the line between current affairs shows and infotainment has become blurred. In the last section, practical implications for improving the meaningfulness and transparency of TV talk shows are suggested by replacing the Aristotelian communication method with the cognitive semiotic communication approach.
Keywords: Aristotelian techniques of communication, current affairs talk shows, drama, Pakistan
Procedia PDF Downloads 204
6758 CAD Tool for Parametric Design modification of Yacht Hull Surface Models
Authors: Shahroz Khan, Erkan Gunpinar, Kemal Mart
Abstract:
Parametric design techniques have recently become a vital concept in the field of Computer Aided Design (CAD), providing designers with a sophisticated platform for automating the design process efficiently. In these techniques, the design process starts by parameterizing the important features of the design model (typically the key dimensions) and implementing design constraints. The design constraints help to retain the overall shape of the model while its parameters are modified. However, the process of initializing an appropriate number of design parameters and constraints is the crucial part of parametric design techniques, especially for complex surface models such as a yacht hull. This paper introduces a method to create complex surface models suited to parametric design techniques, a method to define the right number of parameters and the respective design constraints, and a system to implement design parameters in accordance with the design constraint schema. In the proposed approach, the design process starts by dividing the yacht hull into three sections. Each section consists of different shape lines, which form the overall shape of the yacht hull. The shape lines are created using cubic Bezier curves, which allow greater design flexibility. Design parameters and constraints are defined on the shape lines in 3D design space to give designers better, individual handling of the parameters. Afterwards, shape modifiers are developed, which allow the modification of each parameter while satisfying the respective set of criteria and design constraints: geometric continuity must be maintained between the shape lines of the three sections, fairness of the hull surfaces must be preserved after modification, and during a design modification the effect of a single parameter on the other parameters must be negligible. The constraints are defined individually on the shape lines of each section and mutually between the shape lines of two connecting sections. In order to validate and visualize the results of the shape modifiers, a real-time graphic interface is created.
Keywords: design parameter, design constraints, shape modifiers, yacht hull
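A minimal sketch of evaluating one cubic Bezier shape line from its four control points, the primitive the paper builds its hull shape lines from; the 3D control points used here are illustrative assumptions, not values from the tool:

```python
import numpy as np

def cubic_bezier(p0, p1, p2, p3, t):
    """Point on a cubic Bezier curve at parameter t in [0, 1] (Bernstein form)."""
    p0, p1, p2, p3 = map(np.asarray, (p0, p1, p2, p3))
    return ((1 - t) ** 3 * p0 + 3 * (1 - t) ** 2 * t * p1
            + 3 * (1 - t) * t ** 2 * p2 + t ** 3 * p3)

# Illustrative control points of one hull shape line in 3D (x: length, y: beam, z: height).
P0, P1, P2, P3 = [0.0, 0.0, 0.0], [2.0, 1.2, 0.1], [6.0, 1.5, 0.3], [10.0, 0.0, 1.0]
curve = np.array([cubic_bezier(P0, P1, P2, P3, t) for t in np.linspace(0.0, 1.0, 25)])
print(curve[:3])   # first few sampled points of the shape line
```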
Procedia PDF Downloads 301
6757 Acoustic Echo Cancellation Using Different Adaptive Algorithms
Authors: Hamid Sharif, Nazish Saleem Abbas, Muhammad Haris Jamil
Abstract:
An adaptive filter is a filter that self-adjusts its transfer function according to an optimization algorithm driven by an error signal. Because of the complexity of the optimization algorithms, most adaptive filters are digital filters. Adaptive filtering constitutes one of the core technologies in digital signal processing and finds numerous application areas in science as well as in industry, including adaptive noise cancellation and echo cancellation. Acoustic echo is a common occurrence in today’s telecommunication systems. The signal interference caused by acoustic echo is distracting to both users and reduces the quality of the communication. In this paper, we review different adaptive filtering techniques for reducing this unwanted echo and examine the behavior of algorithms such as Least Mean Square (LMS), Normalized Least Mean Square (NLMS), Variable Step-Size Least Mean Square (VSLMS), Variable Step-Size Normalized Least Mean Square (VSNLMS), the New Varying Step Size LMS Algorithm (NVSSLMS) and Recursive Least Square (RLS), in order to increase communication quality.
Keywords: adaptive acoustic, echo cancellation, LMS algorithm, adaptive filter, normalized least mean square (NLMS), variable step-size least mean square (VSLMS)
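A minimal sketch of an NLMS adaptive filter cancelling a simulated acoustic echo: the far-end signal passes through an assumed echo path, and the filter adapts its weights so that subtracting its echo estimate from the microphone signal leaves only residual error. The echo path, filter length and step size are illustrative assumptions, not the paper's experimental setup:

```python
import numpy as np

rng = np.random.default_rng(3)
N = 8000
far_end = rng.normal(size=N)                      # far-end (loudspeaker) signal
echo_path = np.array([0.6, 0.0, -0.3, 0.15])      # assumed room echo impulse response
mic = np.convolve(far_end, echo_path)[:N] + 0.01 * rng.normal(size=N)   # echo + noise

L, mu, eps = 8, 0.5, 1e-6                         # filter length, NLMS step size, regularizer
w = np.zeros(L)
x_buf = np.zeros(L)
err = np.empty(N)
for n in range(N):
    x_buf = np.roll(x_buf, 1)
    x_buf[0] = far_end[n]                         # most recent far-end samples
    y_hat = w @ x_buf                             # echo estimate
    e = mic[n] - y_hat                            # error = echo-cancelled signal
    w += mu * e * x_buf / (x_buf @ x_buf + eps)   # NLMS weight update
    err[n] = e
print("residual echo power (last 1000 samples):", float(np.mean(err[-1000:] ** 2)))
```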
Procedia PDF Downloads 80