Search results for: craft technique
5470 Implementation of a Monostatic Microwave Imaging System using a UWB Vivaldi Antenna
Authors: Babatunde Olatujoye, Binbin Yang
Abstract:
Microwave imaging is a portable, noninvasive, and non-ionizing imaging technique that employs low-power microwave signals to reveal objects in the microwave frequency range. This technique has immense potential for adoption in commercial and scientific applications such as security scanning, material characterization, and nondestructive testing. This work presents a monostatic microwave imaging setup using an Ultra-Wideband (UWB), low-cost, miniaturized Vivaldi antenna with a bandwidth of 1 – 6 GHz. The backscattered signals (S-parameters) of the Vivaldi antenna used for scanning targets were measured in the lab using a VNA. An automated two-dimensional (2-D) scanner was employed for the 2-D movement of the transceiver to collect the measured scattering data from different positions. The targets consist of four metallic objects, each with a distinct shape. A similar setup was also simulated in Ansys HFSS. A high-resolution Back Propagation Algorithm (BPA) was applied to both the simulated and experimental backscattered signals. The BPA utilizes the phase and amplitude information recorded over a two-dimensional aperture of 50 cm × 50 cm with a discrete step size of 2 cm to reconstruct a focused image of the targets. The adoption of the BPA was demonstrated by coherently resolving and reconstructing reflection signals from conventional time-of-flight profiles. For both the simulation and experimental data, the BPA accurately reconstructed a high-resolution 2-D image of the targets in terms of shape and location. An improvement of the BPA, in terms of target resolution, was achieved by applying a filtering method in the frequency domain.
Keywords: back propagation, microwave imaging, monostatic, Vivaldi antenna, ultra wideband
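The coherent summation at the heart of the BPA can be sketched as follows. This is a minimal monostatic illustration, not the paper's implementation: the aperture positions, frequencies, depth, and point target below are invented and much smaller than the 50 cm × 50 cm, 2 cm step, 1-6 GHz setup described above, to keep the sketch fast.

```python
import cmath
import math

C = 3e8  # propagation speed (m/s); free-space assumption

def backproject(s11, positions, freqs, pixels, depth):
    """Back Propagation Algorithm sketch for a monostatic scan.

    s11[(pos, f)] holds the complex backscatter sample recorded with the
    antenna at 2-D aperture position `pos` and frequency `f`.  For every
    image pixel, the two-way phase delay to each antenna position is
    compensated and all samples are summed coherently; a target focuses
    where the compensated phases align.
    """
    image = {}
    for px in pixels:
        acc = 0j
        for pos in positions:
            # path length from this antenna position to the pixel
            r = math.sqrt((px[0] - pos[0]) ** 2
                          + (px[1] - pos[1]) ** 2 + depth ** 2)
            for f in freqs:
                k = 2.0 * math.pi * f / C
                # undo the two-way (monostatic) phase 2kr
                acc += s11[(pos, f)] * cmath.exp(2j * k * r)
        image[px] = abs(acc)
    return image
```

Feeding the function synthetic data for a single point scatterer, the image peak lands on the scatterer's position, which is the focusing behavior the abstract describes.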
Procedia PDF Downloads 17
5469 Surface Modified Quantum Dots for Nanophotonics, Stereolithography and Hybrid Systems for Biomedical Studies
Authors: Redouane Krini, Lutz Nuhn, Hicham El Mard, Cheol Woo Ha, Yoondeok Han, Kwang-Sup Lee, Dong-Yol Yang, Jinsoo Joo, Rudolf Zentel
Abstract:
To use Quantum Dots (QDs) in the two-photon initiated polymerization (TPIP) technique for 3D patterning, the QDs were surface-modified with photosensitive end groups that are able to undergo photopolymerization. We were able to fabricate fluorescent 3D lattice structures from the photopatternable QDs by TPIP for photonic devices such as photonic crystals and metamaterials. QDs of different diameters have different emission colors, and by mixing RGB QDs, white fluorescence from the polymeric structures was created. Metamaterials are capable of unique interactions with the electric and magnetic components of electromagnetic radiation, and for manipulating light it is crucial to have a negative refractive index. In combination with QDs, the TPIP technique allows polymeric structures to be designed with properties that cannot be found in nature. This gives these artificial materials huge importance for real-life applications in photonics and optoelectronics. Understanding the interactions between nanoparticles and biological systems is of huge interest in the biomedical research field. We developed a synthetic strategy for polymer-functionalized nanoparticles for biomedical studies to obtain hybrid systems of QDs and copolymers that have a strongly binding network in an inner shell and can be modified through their poly(ethylene glycol)-functionalized outer shell. These hybrid systems can be used as models for the investigation of cell penetration and drug delivery by using a combination of cryo-TEM and fluorescence measurements.
Keywords: biomedical study models, lithography, photo induced polymerization, quantum dots
Procedia PDF Downloads 524
5468 The Effect of Micro/Nano Structure of Poly(ε-caprolactone) (PCL) Film Using a Two-Step Process (Casting/Plasma) on Cellular Responses
Authors: JaeYoon Lee, Gi-Hoon Yang, JongHan Ha, MyungGu Yeo, SeungHyun Ahn, Hyeongjin Lee, HoJun Jeon, YongBok Kim, Minseong Kim, GeunHyung Kim
Abstract:
One of the important factors in tissue engineering is to design optimal biomedical scaffolds, which can be governed by topographical surface characteristics such as size, shape, and direction. Of these properties, we focused on the effects of a nano- to micro-sized hierarchical surface. To fabricate the hierarchical surface structure on poly(ε-caprolactone) (PCL) film, we employed a micro-casting technique, pressing the mold, and a nano-etching technique using a modified plasma process. The micro-sized topography of the PCL film was controlled by the sizes of the microstructures on a lotus leaf. The nano-sized topography and hydrophilicity of the PCL film were controlled by the modified plasma process. After the plasma treatment, the PCL film changed significantly from hydrophobic to hydrophilic, and the nano-sized structure was well developed. The surface properties of the modified PCL film were investigated in terms of initial cell morphology, attachment, and proliferation using osteoblast-like cells (MG63). In particular, initial cell attachment, proliferation, and osteogenic differentiation on the hierarchical structure were enhanced dramatically compared to those on the smooth surface. We believe these results stem from a synergistic effect between the hierarchical structure and the reactive functional groups introduced by the plasma process. Based on the results presented here, we propose a new biomimetic surface model that may be useful for effectively regenerating hard tissues.
Keywords: hierarchical surface, lotus leaf, nano-etching, plasma treatment
Procedia PDF Downloads 373
5467 Genomic Identification of Anisakis Simplex Larvae by PCR-RAPD
Authors: Fumiko Kojima, Shuji Fujimoto
Abstract:
Anisakiasis is a disease caused by infection with anisakid larvae, mostly Anisakis simplex. The larvae commonly infect marine fish, and the disease is frequently reported in areas of the world where fish is consumed raw, lightly pickled, or salted. In Japan, people have the habit of eating raw fish such as ‘sushi’ or ‘sashimi’, so they have more chance of infection with larvae of anisakid nematodes. There are three sibling species among A. simplex larvae, namely A. simplex sensu stricto (Asss), A. pegreffii (Ap), and A. simplex C. It has been revealed that Ap is dominant among larvae from fish (Scomber japonicus) on the Japan Sea side, whereas Asss is dominant on the Pacific Ocean side, although anisakiasis occurs in Japan in both the Japan Sea side area and the Pacific Ocean side area. The aim of this study was to investigate genetic variation between the siblings (Asss and Ap) and within the same sibling species by the random amplified polymorphic DNA (RAPD) technique. To investigate the genetic differences among individual A. simplex larvae, we used the RAPD technique to differentiate individuals of A. simplex obtained from Scomber japonicus caught in the Japan Sea (Goto Islands in Nagasaki Prefecture) and off the Pacific coast (Kanagawa Prefecture). The RAPD patterns of the control DNA (genus Raphidascaris) were markedly different from those of A. simplex. There were differences in amplification patterns between Asss and Ap. The RAPD patterns for larvae obtained from fish of the same sea were somewhat different, and variations were detected even among larvae from the same fish. These results suggest considerably high genetic variability between Asss and Ap and the possible existence of genetic variation within the sibling species.
Keywords: Anisakiasis in Japan, Anisakis simplex, genomic identification, PCR-RAPD
Procedia PDF Downloads 180
5466 Axillary Evaluation with Targeted Axillary Dissection Using Ultrasound-Visible Clips after Neoadjuvant Chemotherapy for Patients with Node-Positive Breast Cancer
Authors: Naomi Sakamoto, Eisuke Fukuma, Mika Nashimoto, Yoshitomo Koshida
Abstract:
Background: Selective localization of the metastatic lymph node with a clip and removal of the clipped node together with the sentinel lymph node (SLN), known as targeted axillary dissection (TAD), reduces the false-negative rate (FNR) of SLN biopsy (SLNB) after neoadjuvant chemotherapy (NAC). For patients who achieve nodal pathologic complete response (pCR), accurate staging of the axilla by TAD allows omission of axillary lymph node dissection (ALND), decreasing postoperative arm morbidity without a negative effect on overall survival. This study aimed to investigate the ultrasound (US) identification rate and successful removal rate of two kinds of ultrasound-visible clips placed in metastatic lymph nodes during the TAD procedure. Methods: This prospective study was conducted on patients with clinically T1-3, N1-2, M0 breast cancer undergoing NAC followed by surgery. A US-visible clip was placed in the suspicious lymph node under US guidance before neoadjuvant chemotherapy. Before surgery, US examination was performed to evaluate the detection rate of the clipped node. During surgery, the clipped node was removed using several localization techniques, including hook-wire localization, dye injection, or a fluorescence technique, followed by a dual-technique SLNB and resection of palpable nodes if present. For the fluorescence technique, after injection of 0.1-0.2 mL of indocyanine green dye (ICG) into the clipped node, ICG fluorescence imaging was performed using the Photodynamic Eye infrared camera (Hamamatsu Photonics K.K., Shizuoka, Japan). For the dye injection method, 0.1-0.2 mL of pyoktanin blue dye was injected into the clipped node. Results: A total of 29 patients were enrolled. Hydromark™ breast biopsy site markers (Hydromark, T3 shape; Devicor Medical Japan, Tokyo, Japan) were used in 15 patients, whereas an UltraCor™ Twirl™ breast marker (Twirl; C.R. Bard, Inc, NJ, USA) was placed in 14 patients.
US identified the clipped node marked with the UltraCor Twirl in 100% (14/14) of cases and with the Hydromark in 93.3% (14/15) (p = ns). Successful removal of the clipped node marked with the UltraCor Twirl was achieved in 100% (14/14), whereas the node marked with the Hydromark was removed in 80% (12/15) (p = ns). Conclusions: The ultrasound identification rate differed between the two types of ultrasound-visible clips, which also affected the successful removal rate of the clipped nodes. Labelling the positive node with a highly US-visible clip allowed successful TAD.
Keywords: breast cancer, neoadjuvant chemotherapy, targeted axillary dissection, breast tissue marker, clip
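The "p = ns" comparisons above involve very small counts, for which a two-sided Fisher's exact test is the usual choice. The abstract does not state which test was used; as an illustration, here is a stdlib sketch applied to the removal counts reported above (14/14 UltraCor Twirl vs. 12/15 Hydromark):

```python
from math import comb

def fisher_exact(a, b, c, d):
    """Two-sided Fisher's exact test for the 2x2 table [[a, b], [c, d]].

    Rows are the two groups, columns are success/failure.  The p-value is
    the total probability, with the margins held fixed, of every table
    whose point probability is no larger than that of the observed table.
    """
    n = a + b + c + d
    row1 = a + b          # size of group 1
    col1 = a + c          # total successes
    denom = comb(n, row1)

    def prob(x):
        # hypergeometric point mass: x successes fall in group 1
        if x < 0 or row1 - x < 0 or col1 - x < 0 or (n - col1) - (row1 - x) < 0:
            return 0.0
        return comb(col1, x) * comb(n - col1, row1 - x) / denom

    p_obs = prob(a)
    return sum(prob(x) for x in range(row1 + 1)
               if prob(x) <= p_obs * (1 + 1e-9))
```

Running it on the removal table gives a p-value well above 0.05, consistent with the abstract's "p = ns".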
Procedia PDF Downloads 65
5465 Finite Element Analysis of Human Tarsals, Metatarsals and Phalanges for Predicting Probable Locations of Fractures
Authors: Irfan Anjum Manarvi, Fawzi Aljassir
Abstract:
Human bones have long been a keen area of research in the field of biomechanical engineering. Medical professionals, as well as engineering academics and researchers, have investigated various bones using medical, mechanical, and materials approaches to build the available body of knowledge. Their major focus has been to establish the properties of these bones and ultimately to develop processes and tools either to prevent fracture or to recover from its damage. The literature shows that mechanical professionals conducted a variety of tests for hardness, deformation, and strain field measurement to arrive at their findings. However, they considered the accuracy of these results insufficient due to various limitations of tools and test equipment and difficulties in the availability of human bones. They proposed the need for further studies to first overcome inaccuracies in measurement methods, testing machines, and experimental errors and then carry out experimental or theoretical studies. Finite element analysis is a technique that was developed for the aerospace industry due to the complexity of designs and materials. Over time, it has found applications in many other industries due to its accuracy and its flexibility in the selection of materials and of the types of loading that can be theoretically applied to an object under study. In the past few decades, the field of biomechanical engineering has also started to see its applicability. However, the work done on tarsals, metatarsals, and phalanges using this technique is very limited. Therefore, the present research has focused on using this technique for the analysis of these critical bones of the human body. The technique requires a three-dimensional geometric computer model of the object to be analyzed. In the present research, a 3D laser scanner was used to make accurate geometric scans of individual tarsals, metatarsals, and phalanges from a typical human foot for these computer geometric models.
These were then imported into finite element analysis software, and a refining process was carried out prior to analysis to ensure the computer models were true representatives of the actual bones. This was followed by analysis of each bone individually. A number of constraints and load conditions were applied to observe the stress and strain distributions in these bones under compressive and tensile loads or their combination. Results were collected for deformations along various axes, and the stress and strain distributions were examined to identify critical locations where fracture could occur. A comparative analysis of the failure properties of all three types of bones was carried out to establish which of them would fail first, and it is presented in this research. The results of this investigation could be used for further experimental studies by academics and researchers, as well as by industrial engineers, for the development of foot protection devices or tools for surgical operations and for the recovery treatment of these bones. Researchers could build on these models to carry out analysis of a complete human foot through finite element analysis under various loading conditions such as walking, marching, running, and landing after a jump.
Keywords: tarsals, metatarsals, phalanges, 3D scanning, finite element analysis
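The finite element workflow described above (discretize, assemble a stiffness matrix, apply constraints and loads, solve, inspect stresses) can be illustrated with a toy one-dimensional axial bar. Real tarsal models are 3-D solid meshes built from the scanned geometry, so this sketch with invented material values only shows the mechanics of the method:

```python
def bar_fea(n_elems, length, area, e_mod, tip_load):
    """1-D bar fixed at node 0 with an axial load at the free tip.

    Assembles the global stiffness matrix from two-node elements, applies
    the boundary condition by eliminating the fixed degree of freedom,
    solves by Gaussian elimination, and returns (displacements, stresses).
    """
    le = length / n_elems
    ke = e_mod * area / le            # element stiffness EA/L
    n_nodes = n_elems + 1
    K = [[0.0] * n_nodes for _ in range(n_nodes)]
    for e in range(n_elems):          # assemble element contributions
        K[e][e] += ke
        K[e][e + 1] -= ke
        K[e + 1][e] -= ke
        K[e + 1][e + 1] += ke
    F = [0.0] * n_nodes
    F[-1] = tip_load
    # eliminate the fixed node (node 0) and solve the reduced system
    A = [row[1:] for row in K[1:]]
    b = F[1:]
    m = len(b)
    for i in range(m):                # forward elimination
        for j in range(i + 1, m):
            f = A[j][i] / A[i][i]
            for col in range(i, m):
                A[j][col] -= f * A[i][col]
            b[j] -= f * b[i]
    u = [0.0] * m
    for i in range(m - 1, -1, -1):    # back substitution
        u[i] = (b[i] - sum(A[i][j] * u[j] for j in range(i + 1, m))) / A[i][i]
    disp = [0.0] + u
    # element stress = E * strain; the largest value flags the likely failure site
    stresses = [e_mod * (disp[e + 1] - disp[e]) / le for e in range(n_elems)]
    return disp, stresses
```

For this simple load case the analytic solution is known (tip displacement PL/EA, uniform stress P/A), which makes the sketch easy to verify.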
Procedia PDF Downloads 326
5464 Composing Method of Decision-Making Function for Construction Management Using Active 4D/5D/6D Objects
Authors: Hyeon-Seung Kim, Sang-Mi Park, Sun-Ju Han, Leen-Seok Kang
Abstract:
As BIM (Building Information Modeling) application continually expands, the visual simulation techniques used for facility design and construction process information are becoming increasingly advanced and diverse. For building structures, BIM application is design-oriented, utilizing 3D objects for conflict management, whereas for civil engineering structures, the usability of nD object-oriented construction stage simulation is important in construction management. Simulations of 5D and 6D objects, in which cost and resources are linked to the process simulation of 4D objects, are commonly used, but they do not provide a decision-making function for the process management problems that occur on site because they mostly focus on the visual representation of the current status of process information. In this study, an nD CAD system is constructed that facilitates an optimized schedule simulation minimizing process conflict, a construction duration reduction simulation according to execution progress status, an optimized process plan simulation according to project cost change by year, and an optimized resource simulation for field resource mobilization capability. Through this system, the usability of conventional simple simulation objects is expanded to that of active simulation objects with which decision-making is possible. Furthermore, to close the gap between field process situations and planned 4D process objects, a technique is developed to facilitate a comparative simulation through the coordinated synchronization of an actual video object acquired by an on-site web camera and a VR-concept 4D object. This synchronization and simulation technique can also be applied to smartphone video objects captured in the field in order to increase the usability of the 4D object.
Because yearly project costs change frequently in civil engineering construction, the annual process plan should be recomposed appropriately according to project cost decreases or increases relative to the plan. In the 5D CAD system provided in this study, an active 5D object utilization concept is introduced to perform a simulation in an optimized process planning state by finding a process optimized for the changed project cost, without changing the construction duration, through a technique such as a genetic algorithm. Furthermore, in resource management, an active 6D object utilization function is introduced that can analyze and simulate an optimized process plan within the scope of movable resources, by considering those resources that can be moved under a given field condition, instead of using a simple resource change simulation by schedule. The introduction of active BIM functions is expected to increase the field utilization of conventional nD objects.
Keywords: 4D, 5D, 6D, active BIM
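The budget-constrained plan optimization described above can be sketched with a small genetic algorithm. The activities, mode costs, durations, and budget below are invented for illustration, not taken from the study: each gene picks one execution mode (cost, duration) per activity, and a penalty term steers the search back inside the yearly budget.

```python
import random

def ga_schedule(modes, budget, pop_size=20, gens=60, seed=1):
    """Minimize total duration subject to a yearly cost budget.

    modes[i] is the list of (cost, duration) execution options for
    activity i; a chromosome selects one option per activity.  Budget
    violations incur a large penalty so feasible plans always win.
    """
    rng = random.Random(seed)
    n = len(modes)

    def fitness(ch):
        cost = sum(modes[i][g][0] for i, g in enumerate(ch))
        duration = sum(modes[i][g][1] for i, g in enumerate(ch))
        return duration + 1e3 * max(0.0, cost - budget)

    pop = [[rng.randrange(len(modes[i])) for i in range(n)]
           for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness)
        new_pop = [list(pop[0]), list(pop[1])]           # elitism
        while len(new_pop) < pop_size:
            p1, p2 = rng.sample(pop[:pop_size // 2], 2)  # mate the fitter half
            cut = rng.randrange(1, n)                    # one-point crossover
            child = p1[:cut] + p2[cut:]
            if rng.random() < 0.2:                       # mutation
                gene = rng.randrange(n)
                child[gene] = rng.randrange(len(modes[gene]))
            new_pop.append(child)
        pop = new_pop
    return min(pop, key=fitness)
```

On a four-activity example the GA stays within budget and lands at or near the exhaustive optimum, which is the behavior the active 5D object relies on.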
Procedia PDF Downloads 275
5463 Estimation of the Mean of the Selected Population
Authors: Kalu Ram Meena, Aditi Kar Gangopadhyay, Satrajit Mandal
Abstract:
Two normal populations with different means and the same known variance are considered. The population with the smaller sample mean is selected. Various estimators are constructed for the mean of the selected normal population. Finally, they are compared with respect to bias and MSE risks by the method of Monte Carlo simulation, and their performances are analysed with the help of graphs.
Keywords: estimation after selection, Brewster-Zidek technique, estimators, selected populations
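The selection step is what makes this estimation problem non-trivial: the naive estimator (the smaller sample mean itself) is biased downward for the selected population's mean. A small Monte Carlo sketch of that bias, with invented parameters rather than the paper's study settings:

```python
import random
import statistics

def estimate_selection_bias(mu1, mu2, sigma, n, reps, seed=0):
    """Monte Carlo bias of the naive estimator of the selected mean.

    Draw samples of size n from N(mu1, sigma^2) and N(mu2, sigma^2),
    select the population with the smaller sample mean, and average the
    error of using that sample mean as an estimate of the selected
    population's true mean.
    """
    rng = random.Random(seed)
    errors = []
    for _ in range(reps):
        m1 = statistics.fmean([rng.gauss(mu1, sigma) for _ in range(n)])
        m2 = statistics.fmean([rng.gauss(mu2, sigma) for _ in range(n)])
        selected_true_mean = mu1 if m1 <= m2 else mu2
        errors.append(min(m1, m2) - selected_true_mean)
    return statistics.fmean(errors)
```

When the two true means are equal, the expected bias of the naive estimator works out to -sigma/sqrt(pi*n) (about -0.18 for sigma = 1, n = 10), which the simulation reproduces; comparing improved estimators against this baseline is exactly the kind of study the abstract describes.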
Procedia PDF Downloads 510
5462 Optimization Technique for the Contractor’s Portfolio in the Bidding Process
Authors: Taha Anjamrooz, Sareh Rajabi, Salwa Bheiry
Abstract:
Selection among the available projects in the bidding process is one of the essential areas for a contractor to concentrate on. It is important for the contractor to choose the right projects for its portfolio during the tendering stage based on certain criteria. First, it should align the bidding process with its organization strategies and goals, as a screening process, to start with the right portfolio pool. Second, it should set a proper framework and use a suitable technique to optimize its selection process, allowing concentration and higher effort during the tender stage with the goal of winning. In this research paper, a two-step framework is proposed to increase the efficiency of the contractor’s bidding process and the chance of winning new project awards. In this framework, all projects initially pass through a first-stage screening process, in which the portfolio basket is evaluated and adjusted in accordance with the organization strategies, yielding a reduced portfolio pool that is in line with the organization's activities. In the second stage, the contractor uses linear programming to optimize the portfolio pool based on available resources such as manpower, light equipment, heavy equipment, financial capability, return on investment, and the success rate of winning bids. This optimization model will assist the contractor in utilizing its internal resources to the maximum and increase its winning chances for new projects, considering past experience with clients, the relationship built between the two parties, and the complexity of executing the projects. The objective of this research is to increase the contractor's winning chance in the bidding process based on the success rate and expected return on investment.
Keywords: bidding process, internal resources, optimization, contracting portfolio management
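The second-stage resource-constrained selection can be illustrated with a tiny model. The paper uses linear programming; since choosing which bids to pursue is a yes/no decision, the sketch below simply enumerates subsets (fine for a handful of candidate bids) and scores each by win-probability-weighted return under resource caps. All project names, numbers, and resource limits are invented:

```python
from itertools import product

def select_bids(projects, limits):
    """Exhaustively pick the subset of candidate bids that maximizes
    expected return (ROI weighted by win probability) while staying
    within every resource limit (manpower, finance, ...).
    """
    best_set, best_value = [], 0.0
    for mask in product([0, 1], repeat=len(projects)):
        chosen = [p for pick, p in zip(mask, projects) if pick]
        within = all(sum(p[r] for p in chosen) <= cap
                     for r, cap in limits.items())
        value = sum(p["roi"] * p["win_prob"] for p in chosen)
        if within and value > best_value:
            best_set, best_value = chosen, value
    return best_set, best_value
```

For larger pools this enumeration would be replaced by the linear (or integer) programming formulation the abstract refers to, but the objective and constraints are the same.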
Procedia PDF Downloads 141
5461 The Fabrication of Stress Sensing Based on Artificial Antibodies to Cortisol by Molecular Imprinted Polymer
Authors: Supannika Klangphukhiew, Roongnapa Srichana, Rina Patramanon
Abstract:
Cortisol has been used as a well-known commercial stress biomarker. A homeostatic response to psychological stress is indicated by an increased level of cortisol produced by the hypothalamic-pituitary-adrenal (HPA) axis. Chronic psychological stress contributing to a high level of cortisol relates to several health problems. In this study, a cortisol biosensor was fabricated that mimics the natural receptors. The artificial antibodies were prepared using a molecular imprinting technique that can imitate the performance of a natural anti-cortisol antibody with high stability. Cortisol-molecularly imprinted polymer (cortisol-MIP) was obtained using a multi-step swelling and polymerization protocol with cortisol as the target molecule, combining methacrylic acid:acrylamide (2:1) with bisacryloyl-1,2-dihydroxy-1,2-ethylenediamine and ethylenedioxy-N-methylamphetamine as cross-linkers. The cortisol-MIP was integrated into the sensor: it was coated on a disposable screen-printed carbon electrode (SPCE) for portable electrochemical analysis. The physical properties of the cortisol-MIP were characterized by electron microscopy. The binding characteristics were evaluated via changes in covalent patterns in FTIR spectra, which were related to the voltammetric response. The performance of the cortisol-MIP-modified SPCE was investigated in terms of detection range and high selectivity, with a detection limit of 1.28 ng/ml. The disposable cortisol biosensor represents an application of the MIP technique to recognize steroids according to their structures; it is feasible and cost-effective and can be developed for point-of-care use.
Keywords: stress biomarker, cortisol, molecular imprinted polymer, screen-printed carbon electrode
Procedia PDF Downloads 272
5460 The Bayesian Premium Under Entropy Loss
Authors: Farouk Metiri, Halim Zeghdoudi, Mohamed Riad Remita
Abstract:
Credibility theory is an experience rating technique in actuarial science; it can be seen as one of the quantitative tools that allow insurers to perform experience rating, that is, to adjust future premiums based on past experience. It is usually used in automobile insurance, workers' compensation premiums, and IBNR (incurred but not reported) claims, where credibility theory can be used to estimate the claim size. In this study, we focus on a popular tool in credibility theory, the Bayesian premium estimator, considering the Lindley distribution as the claim distribution. We derive this estimator under entropy loss, which is asymmetric, and under squared error loss, which is symmetric, with informative and non-informative priors. In a purely Bayesian setting, the prior distribution represents the insurer’s prior belief about the insured’s risk level after collection of the insured’s data at the end of the period. However, the explicit form of the Bayesian premium when the prior is not a member of the exponential family can be quite difficult to obtain, as it involves a number of integrations that are not analytically solvable. The paper solves this problem by deriving the estimator using a numerical approximation (the Lindley approximation), which is one of the suitable approximation methods for such problems: it approximates the ratio of the integrals as a whole and produces a single numerical result. A simulation study using the Monte Carlo method is then performed to evaluate this estimator, and the mean squared error technique is used to compare the Bayesian premium estimator under the above loss functions.
Keywords: Bayesian estimator, credibility theory, entropy loss, Monte Carlo simulation
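The ratio-of-integrals problem the abstract mentions can be made concrete with a numerical sketch. Instead of the paper's Lindley approximation, the code below uses direct quadrature on a theta grid, with a hypothetical Gamma(a, b) prior and invented claim data. Under squared error loss the Bayes estimate of the risk premium is its posterior mean; under an entropy (Stein-type) loss it is the posterior harmonic mean, which can never exceed the posterior mean:

```python
import math

def lindley_mean(theta):
    """Mean of a Lindley(theta) claim distribution."""
    return (theta + 2.0) / (theta * (theta + 1.0))

def bayes_premiums(claims, a=2.0, b=1.0, grid_n=2000, theta_max=20.0):
    """Bayesian premiums for Lindley claims with a Gamma(a, b) prior,
    computed by midpoint quadrature on a theta grid.

    Returns (premium_squared_error, premium_entropy): the posterior mean
    of the risk premium and its posterior harmonic mean.
    """
    n, s = len(claims), sum(claims)

    def log_post(theta):
        # log posterior up to an additive constant (the (1 + x_i)
        # likelihood factors do not depend on theta and are dropped)
        log_lik = n * (2.0 * math.log(theta) - math.log(1.0 + theta)) - theta * s
        log_prior = (a - 1.0) * math.log(theta) - b * theta
        return log_lik + log_prior

    thetas = [theta_max * (i + 0.5) / grid_n for i in range(grid_n)]
    peak = max(log_post(t) for t in thetas)
    w = [math.exp(log_post(t) - peak) for t in thetas]  # stable weights
    z = sum(w)
    post_mean = sum(wi * lindley_mean(t) for wi, t in zip(w, thetas)) / z
    post_inv_mean = sum(wi / lindley_mean(t) for wi, t in zip(w, thetas)) / z
    return post_mean, 1.0 / post_inv_mean
```

The ordering of the two premiums (entropy-loss premium below the squared-error premium) follows from Jensen's inequality, which gives a quick sanity check on any implementation.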
Procedia PDF Downloads 333
5459 Enhancement of 2,4-Dichlorophenoxyacetic Acid Solubility via Solid Dispersion Technique
Authors: Tamer M. Shehata, Heba S. Elsewedy, Mashel Al Dosary, Alaa Elshehry, Mohamed A. Khedr, Maged E. Mohamed
Abstract:
Objective: 2,4-Dichlorophenoxyacetic acid (2,4-D) is a well-known herbicide widely used as a weed killer. Recently, 2,4-D was rediscovered as a new anti-inflammatory agent through in silico as well as in vivo experiments. However, the poor solubility of 2,4-D can pose problems during pharmaceutical development, in addition to lowering bioavailability. Solid dispersion (SD) refers to a group of solid products consisting of at least two different components, usually a hydrophobic drug and a hydrophilic matrix. It is a well-known technique for enhancing drug solubility; therefore, selecting SD as a tool for enhancing 2,4-D solubility is of great interest to the formulator. Method: In our project, several polymers were investigated (such as PEG, HPMC, and citric acid), together with the effect of drug:polymer ratios on solubility. Drug-polymer interactions were evaluated through both Fourier Transform Infrared (FTIR) spectroscopy and Differential Scanning Calorimetry (DSC). Finally, in vivo evaluation of the best selected preparation was performed using the inflammatory response of the rat hind paw. Results: The results indicated that citric acid and 2,4-D in a ratio of 0.75:1 modified the dissolution profile of the drug. The FTIR results indicated no significant chemical interaction; however, DSC showed a shift in the drug's melting point. Finally, carrageenan-induced rat hind paw edema showed a significant reduction with the drug solid dispersion in comparison to the pure drug, indicating rapid and complete absorption of the drug in solid dispersion form. Conclusion: Solid dispersion technology can be utilized efficiently to enhance the solubility of 2,4-D.
Keywords: solid dispersion, 2,4-D solubility, carrageenan induced edema
Procedia PDF Downloads 451
5458 Efficient Implementation of Finite Volume Multi-Resolution WENO Scheme on Adaptive Cartesian Grids
Authors: Yuchen Yang, Zhenming Wang, Jun Zhu, Ning Zhao
Abstract:
An easy-to-implement and robust finite volume multi-resolution Weighted Essentially Non-Oscillatory (WENO) scheme is proposed on adaptive Cartesian grids in this paper. This multi-resolution WENO scheme is combined with the ghost-cell immersed boundary method (IBM) and a wall-function technique to solve the Navier-Stokes equations. Unlike k-exact finite volume WENO schemes, which involve large amounts of extra storage, repeatedly solving the matrix generated in a least-squares method, or the process of calculating optimal linear weights on adaptive Cartesian grids, the present methodology adds only very small overhead and can be easily implemented in existing edge-based computational fluid dynamics (CFD) codes with minor modifications. Furthermore, the linear weights of this adaptive finite volume multi-resolution WENO scheme can be any positive numbers, on the condition that their sum is one. This bypasses the calculation of the optimal linear weights and avoids dealing with negative linear weights on adaptive Cartesian grids. Some benchmark viscous problems are numerically solved to show the efficiency and good performance of this adaptive multi-resolution WENO scheme. Compared with a second-order edge-based method, the presented method can be implemented on an adaptive Cartesian grid with slight modification for high Reynolds number problems.
Keywords: adaptive mesh refinement method, finite volume multi-resolution WENO scheme, immersed boundary method, wall-function technique
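The paper's multi-resolution reconstruction uses a different stencil hierarchy, but the common WENO ingredients (candidate polynomials, smoothness indicators, nonlinear weights) can be illustrated with the classic fifth-order scheme below. The multi-resolution variant's selling point is precisely that the fixed optimal linear weights d_k in this sketch could be replaced by any positive numbers summing to one:

```python
def weno5_right_face(v0, v1, v2, v3, v4, eps=1e-6):
    """Classic WENO5 reconstruction of the value at the right face of
    the central cell, from five cell averages v0..v4 (cell of v2 is
    central).

    Three third-order candidate stencils are blended with nonlinear
    weights built from smoothness indicators, so the blend falls back
    to a one-sided stencil near a discontinuity.
    """
    # candidate reconstructions at the right interface of the central cell
    p0 = (2.0 * v0 - 7.0 * v1 + 11.0 * v2) / 6.0
    p1 = (-v1 + 5.0 * v2 + 2.0 * v3) / 6.0
    p2 = (2.0 * v2 + 5.0 * v3 - v4) / 6.0
    # Jiang-Shu smoothness indicators
    b0 = 13.0 / 12.0 * (v0 - 2.0 * v1 + v2) ** 2 \
        + 0.25 * (v0 - 4.0 * v1 + 3.0 * v2) ** 2
    b1 = 13.0 / 12.0 * (v1 - 2.0 * v2 + v3) ** 2 + 0.25 * (v1 - v3) ** 2
    b2 = 13.0 / 12.0 * (v2 - 2.0 * v3 + v4) ** 2 \
        + 0.25 * (3.0 * v2 - 4.0 * v3 + v4) ** 2
    # optimal linear weights for fifth-order accuracy on smooth data
    d = (0.1, 0.6, 0.3)
    alphas = [dk / (eps + bk) ** 2 for dk, bk in zip(d, (b0, b1, b2))]
    total = sum(alphas)
    w = [ak / total for ak in alphas]
    return w[0] * p0 + w[1] * p1 + w[2] * p2
```

Because every candidate stencil reproduces linear data exactly, the reconstruction is exact for linear cell averages regardless of the weights, which is one reason relaxing the linear-weight requirement is viable.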
Procedia PDF Downloads 146
5457 The Application of Lesson Study Model in Writing Review Text in Junior High School
Authors: Sulastriningsih Djumingin
Abstract:
This study has several objectives. First, it aims at describing the ability of second-grade students to write review text without applying the Lesson Study model at SMPN 18 Makassar. Second, it seeks to describe the ability of second-grade students to write review text when applying the Lesson Study model at SMPN 18 Makassar. Third, it aims at testing the effectiveness of the Lesson Study model for writing review text at SMPN 18 Makassar. This research used a true experimental, posttest-only group design involving two groups: one control class and one experimental class. The research population was all second-grade students at SMPN 18 Makassar, amounting to 250 students in 8 classes, and the sampling technique was purposive sampling. The control class was VIII2, consisting of 30 students, while the experimental class was VIII8, consisting of 30 students. The research instruments were observations and tests. The collected data were analyzed using descriptive statistical techniques and inferential statistical techniques of the t-test type, processed using SPSS 21 for Windows. The results show that: (1) of the 30 students in the control class, only 14 (47%) obtained a score above 7.5, categorized as inadequate; (2) in the experimental class, 26 (87%) students obtained a score of 7.5, categorized as adequate; (3) the Lesson Study model is effective when applied to writing review text. The comparison of the abilities of the control class and the experimental class indicates that the value of t-count is greater than the value of t-table (2.411 > 1.667). This means that the alternative hypothesis (H1) proposed by the researcher is accepted.
Keywords: application, lesson study, review text, writing
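The decision rule above (t-count 2.411 against a t-table value of 1.667) comes from an independent two-sample t test on the two class means. A stdlib sketch of the pooled-variance statistic that SPSS computes for such a posttest-only comparison (the toy scores below are invented, not the study's data):

```python
import statistics

def independent_t(sample1, sample2):
    """Pooled-variance independent two-sample t statistic.

    Returns (t, df); the null hypothesis is rejected when t exceeds the
    critical t-table value for df degrees of freedom.
    """
    n1, n2 = len(sample1), len(sample2)
    m1, m2 = statistics.fmean(sample1), statistics.fmean(sample2)
    v1, v2 = statistics.variance(sample1), statistics.variance(sample2)
    # pool the two sample variances, weighted by degrees of freedom
    pooled = ((n1 - 1) * v1 + (n2 - 1) * v2) / (n1 + n2 - 2)
    se = (pooled * (1.0 / n1 + 1.0 / n2)) ** 0.5
    return (m1 - m2) / se, n1 + n2 - 2
```

With two classes of 30 students, df = 58, which matches the one-tailed critical value of roughly 1.667 cited above.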
Procedia PDF Downloads 200
5456 Flexural Behavior of Eco-Friendly Prefabricated Low Cost Bamboo Reinforced Wall Panels
Authors: Vishal Puri, Pradipta Chakrabortty, Swapan Majumdar
Abstract:
Precast concrete construction is the most commonly used technique for rapid construction and is used very frequently in developed countries. However, the guidelines required to utilize the potential of prefabricated construction are still not available in developing countries. This causes overdependence on in-situ construction procedures, which further affects the quality, scheduling, and duration of construction. Also, with the ever-increasing costs of building materials and their negative impact on the environment, it has become imperative to look for alternative construction materials that are cheap and sustainable. Bamboo and fly ash are alternative construction materials with great potential in the construction industry, so there is a great need to develop prefabricated components that utilize the potential of these materials. Bamboo-reinforced beams, bamboo-reinforced columns, and bamboo arches, as researched previously, have shown great prospects for the prefabricated construction industry, but many other prefabricated components still need to be studied and widely tested before their use in that industry. In the present study, the authors showcase a prefabricated bamboo-reinforced wall panel for the prefabricated construction industry. The study presents a detailed methodology for the development of such prefabricated panels and their flexural behavior as tested under flexural loads following ASTM guidelines. It was observed that these wall panels are much more flexible and do not show the brittle failure observed in traditional brick walls. It was observed that the prefabricated walls are about 42% cheaper compared to conventional brick walls. It was also observed that the prefabricated walls are considerably lighter in weight and are environmentally friendly.
It was thus concluded that this type of wall panel is an excellent alternative to partition brick walls.
Keywords: bamboo, prefabricated walls, reinforced structure, sustainable infrastructure
Procedia PDF Downloads 309
5455 Antibacterial Wound Dressing Based on Metal Nanoparticles Containing Cellulose Nanofibers
Authors: Mohamed Gouda
Abstract:
Antibacterial wound dressings based on cellulose nanofibers containing different metal nanoparticles (CMC-MNPs) were synthesized using an electrospinning technique. First, composites of carboxymethyl cellulose containing different metal nanoparticles (CMC/MNPs), such as copper nanoparticles (CuNPs), iron nanoparticles (FeNPs), zinc nanoparticles (ZnNPs), cadmium nanoparticles (CdNPs), and cobalt nanoparticles (CoNPs), were synthesized, and these composites were then transferred to the electrospinning process. The synthesized CMC-MNPs were characterized using scanning electron microscopy (SEM) coupled with energy-dispersive X-ray (EDX) analysis, and UV-visible spectroscopy was used to confirm nanoparticle formation. The SEM images clearly showed regular flat shapes with semi-porous surfaces. All MNPs were well distributed inside the backbone of the cellulose without aggregation. The average particle diameters were 29-39 nm for ZnNPs, 29-33 nm for CdNPs, 25-33 nm for CoNPs, 23-27 nm for CuNPs, and 22-26 nm for FeNPs. Surface morphology, water uptake, release of MNPs from the nanofibers in water, and antimicrobial efficacy were studied. SEM images revealed that the electrospun CMC-MNPs nanofibers are smooth and uniformly distributed without bead formation, with average fiber diameters in the range of 300 to 450 nm. Fiber diameters were not affected by the presence of MNPs. TEM images showed that MNPs are present in/on the electrospun CMC-MNPs nanofibers and are spherical in shape. The CMC-MNPs nanofibers showed good hydrophilic properties and excellent antibacterial activity against the Gram-negative bacterium Escherichia coli and the Gram-positive bacterium Staphylococcus aureus.
Keywords: electrospinning technique, metal nanoparticles, cellulosic nanofibers, wound dressing
Procedia PDF Downloads 327
5454 A Vehicle Monitoring System Based on the LoRa Technique
Authors: Chao-Linag Hsieh, Zheng-Wei Ye, Chen-Kang Huang, Yeun-Chung Lee, Chih-Hong Sun, Tzai-Hung Wen, Jehn-Yih Juang, Joe-Air Jiang
Abstract:
Air pollution and climate warming are becoming more and more intense in many areas, especially urban areas. Environmental parameters are critical information for air pollution and weather monitoring, so it is necessary to develop a suitable air pollution and weather monitoring system for urban areas. In this study, a vehicle monitoring system (VMS) based on the IoT technique is developed. Cars are selected as the research tool because they can reach a greater number of streets to collect data. The VMS can monitor different environmental parameters, including ambient temperature and humidity, and air quality parameters, including PM2.5, NO2, CO, and O3. The VMS can also provide other information, including GPS signals and vibration information, collected while driving a car on the street. Different sensor modules are used to measure the parameters, and the measured data are collected and transmitted to a cloud server through the LoRa protocol. A user interface shows the sensing data stored at the cloud server. To examine the performance of the system, a researcher drove a Nissan X-Trail 1998 to the area close to the Da’an District office in Taipei to collect monitoring data. The collected data are instantly shown on the user interface, which provides four kinds of information: GPS positions, weather parameters, vehicle information, and air quality information. With the VMS, users can obtain information regarding air quality and weather conditions when they drive their car in an urban area. Also, government agencies can make decisions on traffic planning based on the information provided by the proposed VMS.
Keywords: LoRa, monitoring system, smart city, vehicle
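The sensor-to-cloud path described above hinges on keeping each uplink frame small, since LoRa payloads and duty cycles are tightly constrained. A minimal sketch of how one VMS sample might be packed into a fixed-point binary frame before transmission (the field layout, scale factors, and function name are our own illustration, not the authors' actual frame format):

```python
import struct

def encode_vms_payload(lat, lon, temp_c, humidity, pm25, no2, co, o3):
    """Pack one VMS sample into a compact fixed-point binary frame.

    LoRa uplinks favor small payloads, so floats are scaled to integers.
    (This layout is a hypothetical illustration, not the authors' format.)
    """
    return struct.pack(
        "<iihHHHHH",
        round(lat * 1e5),     # latitude, 1e-5 degree resolution
        round(lon * 1e5),     # longitude, 1e-5 degree resolution
        round(temp_c * 10),   # signed, 0.1 degC resolution
        round(humidity * 10), # 0.1 %RH resolution
        round(pm25 * 10),     # 0.1 ug/m3 resolution
        round(no2 * 10),
        round(co * 10),
        round(o3 * 10),
    )

frame = encode_vms_payload(25.02605, 121.54353, 28.4, 61.2, 35.1, 21.0, 0.6, 30.2)
print(len(frame))  # 20 bytes per sample
```

Scaling to fixed-point integers keeps each sample at 20 bytes, comfortably inside typical LoRa payload limits even at the slowest spreading factors.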
Procedia PDF Downloads 414
5453 Detection of Glyphosate Using Disposable Sensors for Fast, Inexpensive and Reliable Measurements by Electrochemical Technique
Authors: Jafar S. Noori, Jan Romano-deGea, Maria Dimaki, John Mortensen, Winnie E. Svendsen
Abstract:
Pesticides have been used intensively in agriculture to control weeds, insects, fungi, and pests. One of the most commonly used pesticides is glyphosate. Glyphosate attaches to soil colloids and is degraded by soil microorganisms. As glyphosate led to the appearance of resistant species, the pesticide was used more intensively. As a consequence of the heavy use of glyphosate, residues of this compound are increasingly observed in food and water. Recent studies reported a direct link between glyphosate and chronic effects, such as teratogenic, tumorigenic, and hepatorenal effects, although the exposure was below the lowest regulatory limit. Today, pesticides are detected in water by complicated and costly manual procedures conducted by highly skilled personnel. It can take up to several days to get an answer regarding the pesticide content in water. An alternative to this demanding procedure is offered by electrochemical measuring techniques. Electrochemistry is an emerging technology that has the potential to identify and quantify several compounds in a few minutes. It is currently not possible to detect glyphosate directly in water samples, and intensive research is underway to enable direct, selective, and quantitative detection of glyphosate in water. This study focuses on developing and modifying a sensor chip that can selectively measure glyphosate and minimize the signal interference from other compounds. The sensor is a silicon-based chip fabricated in a cleanroom facility, with dimensions of 10×20 mm. The chip comprises a three-electrode configuration. The deposited electrodes consist of a 20 nm chromium layer and 200 nm of gold. The working electrode is 4 mm in diameter. The working electrodes are modified by creating molecularly imprinted polymers (MIP) using an electrodeposition technique that allows the chip to selectively measure glyphosate at low concentrations.
The modification included gold nanoparticles with a diameter of 10 nm functionalized with 4-aminothiophenol. This configuration allows the nanoparticles to bind to the working electrode surface and create the template for the glyphosate. The chip was modified using an electrodeposition technique. An initial potential for the identification of glyphosate was estimated to be around -0.2 V. The developed sensor was tested on 6 different concentrations, and it was able to detect glyphosate down to 0.5 mgL⁻¹. This value is below the accepted pesticide limit of 0.7 mgL⁻¹ set by US regulation. The current focus is to optimize the functionalization procedure in order to achieve glyphosate detection at the EU regulatory limit of 0.1 µgL⁻¹. To the best of our knowledge, this is the first attempt to modify miniaturized sensor electrodes with functionalized nanoparticles for glyphosate detection.
Keywords: pesticides, glyphosate, rapid, detection, modified, sensor
Procedia PDF Downloads 176
5452 Three-Dimensional Vibration Characteristics of Piezoelectric Semi-Spherical Shell
Authors: Yu-Hsi Huang, Ying-Der Tsai
Abstract:
Piezoelectric circular plates can provide out-of-plane vibrational displacements at low frequencies and in-plane vibrational displacements at high frequencies. A piezoelectric semi-spherical shell, which is a double-curvature structure, can induce three-dimensional vibrational displacements over a large frequency range. In this study, the three-dimensional vibrational characteristics of piezoelectric semi-spherical shells with free boundary conditions are investigated using three experimental methods and finite element numerical modeling. For the experimental measurements, amplitude-fluctuation electronic speckle pattern interferometry (AF-ESPI) is used to obtain resonant frequencies and radial and azimuthal mode shapes. This optical technique utilizes a full-field, non-contact optical system that measures both the natural frequency and the corresponding vibration mode shape simultaneously in real time. The second experimental technique, a laser displacement meter, is a point-wise displacement measurement method that determines the resonant frequencies of the piezoelectric shell. An impedance analyzer is used to determine the in-plane resonant frequencies of the piezoelectric semi-spherical shell. The experimental results for the resonant frequencies and mode shapes of the piezoelectric shell are verified against the results from finite element analysis. Excellent agreement between the experimental measurements and the numerical calculation is presented for the three-dimensional vibrational characteristics of the piezoelectric semi-spherical shell.
Keywords: piezoelectric semi-spherical shell, mode shape, resonant frequency, electronic speckle pattern interferometry, radial vibration, azimuthal vibration
Procedia PDF Downloads 232
5451 Homogenization of a Non-Linear Problem with a Thermal Barrier
Authors: Hassan Samadi, Mustapha El Jarroudi
Abstract:
In this work, we consider the homogenization of a non-linear problem in a periodic medium with two periodic connected media exchanging a heat flux through their common interface. The interfacial exchange coefficient λ is assumed to tend to zero or to infinity at a rate λ=λ(ε) as the size ε of the basic cell tends to zero. Three homogenized problems are determined according to a critical value depending on λ and ε. Our method is based on Γ-convergence techniques.
Keywords: variational methods, epiconvergence, homogenization, convergence technique
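For readers unfamiliar with this setting, the thermal barrier is usually modeled by an interface condition coupling flux continuity with a jump in temperature (the following is our illustrative sketch of the standard form used in this literature, with conductivities σ₁, σ₂ and interface normal ν; it is not reproduced from the abstract):

```latex
\sigma_1 \nabla u_1^{\varepsilon} \cdot \nu
  = \sigma_2 \nabla u_2^{\varepsilon} \cdot \nu
  = \lambda(\varepsilon)\,\bigl(u_2^{\varepsilon} - u_1^{\varepsilon}\bigr)
  \quad \text{on } \Gamma_{\varepsilon}.
```

The three homogenized problems then correspond to subcritical, critical, and supercritical rates of λ(ε) relative to ε, consistent with the abstract's statement that the limit is selected by a critical value depending on λ and ε.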
Procedia PDF Downloads 523
5450 An Effective Approach to Knowledge Capture in Whole Life Costing in Construction Projects
Authors: Ndibarafinia Young Tobin, Simon Burnett
Abstract:
Whole life costing is a valuable technique for comparing alternative building designs, allowing operational cost benefits to be evaluated against any initial cost increases, and it also forms part of procurement in the construction industry. In spite of these benefits, its adoption has been relatively slow due to the lack of tangible evidence and of ‘know-how’ skills and knowledge of the practice, i.e., the lack of professionals in many establishments with knowledge of and training in the use of the whole life costing technique. This situation is compounded by the absence of available data on whole life costing from relevant projects, the lack of data collection mechanisms, and so on. This has proved very challenging to those who have shown some willingness to employ the technique in a construction project. The knowledge generated from a project can be considered as best practices learned on how to carry out tasks in a more efficient way, or as negative lessons learned which have led to losses and slowed down the progress and performance of the project. Knowledge management in whole life costing practice can enhance whole life costing analysis execution in a construction project, as lessons learned from one project can be carried over to future projects, resulting in continuous improvement and providing knowledge that can be used in the operation and maintenance phases of an asset's life span. Purpose: The purpose of this paper is to report an effective approach which can be utilised in capturing knowledge in whole life costing practice in a construction project. Design/methodology/approach: An extensive literature review was first conducted on the concepts of knowledge management and whole life costing. This was followed by semi-structured interviews to explore existing and good-practice knowledge management in whole life costing practice in a construction project.
The data gathered from the semi-structured interviews were analyzed using content analysis and used to structure an effective knowledge capturing approach. Findings: The results obtained in the study show that project review is the common method used for capturing knowledge; it should be undertaken in an organized and accurate manner, and the results should be presented in the form of instructions or in a checklist format, forming short and precise insights. The approach developed advises that irrespective of how effective the approach to knowledge capture is, the absence of an environment for sharing knowledge would render the approach ineffective. An open culture and resources are critical for providing a knowledge sharing setting, and leadership has to sustain whole life costing knowledge capture, giving full support for its implementation. The knowledge capturing approach has been evaluated by practitioners who are experts in the area of whole life costing practice. The results have indicated that the approach to knowledge capture is suitable and efficient.
Keywords: whole life costing, knowledge capture, project review, construction industry, knowledge management
Procedia PDF Downloads 259
5449 A Comparative Study of the Effects of Vibratory Stress Relief and Thermal Aging on the Residual Stress of Explosives Materials
Authors: Xuemei Yang, Xin Sun, Cheng Fu, Qiong Lan, Chao Han
Abstract:
Residual stresses, which can be produced during the manufacturing process of plastic bonded explosives (PBX), play an important role in weapon system security and reliability, and they can and do change in service. This paper mainly studies the influence of vibratory stress relief (VSR) and thermal aging on the residual stress of explosives. Firstly, the residual stress relaxation of PBX under different physical conditions of VSR, such as vibration time, amplitude, and dynamic strain, was studied by the drill-hole technique. The results indicated that the vibratory amplitude, time, and dynamic strain had a significant influence on the residual stress relief of PBX. The rate of residual stress relief of PBX increases first and then decreases with increasing dynamic strain, amplitude, and time, because at first the activation energy is too small to make the PBX yield plastic deformation. When the dynamic strain, time, and amplitude exceed a certain threshold, the residual stress changes follow the same rule and decrease sharply; this sharp drop in the residual stress relief rate may have been caused by over-vibration. Meanwhile, a comparison between VSR and thermal aging was also studied. The conclusion is that the reduction in residual stress after a VSR process with suitable vibratory parameters can be equivalent to 73% of that achieved by 7 days of thermal aging. In addition, the density attenuation rate, mechanical properties, and dimensional stability 3 months after the VSR process were almost the same as those after thermal aging. However, compared with traditional thermal aging, VSR takes only a very short time, which greatly improves the efficiency of aging treatment for explosive materials. Therefore, VSR could be a potential alternative technique in the industry for residual stress relaxation of PBX explosives.
Keywords: explosives, residual stresses, thermal aging, vibratory stress relief, VSR
Procedia PDF Downloads 158
5448 Laser Corneoplastique™: A Refractive Surgery for Corneal Scars
Authors: Arun C. Gulani, Aaishwariya A. Gulani, Amanda Southall
Abstract:
Background: Laser Corneoplastique™, a minimally interventional, visually promising technique for patients with vision disability from corneal scars of varied causes, has been retrospectively reviewed and represents a paradigm shift in mindset and approach towards corneal scars, treating them with refractive surgery aiming for emmetropic, unaided vision of 20/20 in most cases. Three decades of work on this technique have been compiled in this 15-year study. Subjects and Methods: The objective of this study was to determine the success of Laser Corneoplastique™ surgery as a treatment for corneal scar cases. Corneal scar cases of varied medical histories that had undergone Laser Corneoplastique™ surgery over the past twenty years by a single surgeon, Arun C. Gulani, M.D., were retrospectively reviewed. The details of each case were retrieved from the medical records and analyzed. Each patient had been examined thoroughly at their preoperative appointments for stability of refraction and vision, depth of scar, pachymetry, topography, pattern of the scar, and uncorrected and best corrected vision potential, which were all taken into account in the patients' treatment plans. Results: 64 eyes of 53 patients were investigated for scar etiology, keratometry, visual acuity, and complications. There were 25 different etiologies seen, the most common being a herpetic scar. The average post-op visual acuity was 20/23.55 (±7.05). The Laser parameters used were depth and pulses. Overall, the mean Laser ablation depth was 30.67 (±19.05) µm, ranging from 2 to 73 µm. The number of Laser pulses averaged 191.85 (±112.02). Conclusion: Refractive Laser Corneoplastique™ surgery, when practiced as an art, can address all levels of ametropia while reversing complex corneas and scars from refractive surgery complications back to 20/20 vision.
Keywords: corneal scar, refractive surgery, corneal transplant, laser corneoplastique
Procedia PDF Downloads 187
5447 A Relative Entropy Regularization Approach for Fuzzy C-Means Clustering Problem
Authors: Ouafa Amira, Jiangshe Zhang
Abstract:
Clustering is an unsupervised machine learning technique; its aim is to extract the data structures, in which similar data objects are grouped in the same cluster, whereas dissimilar objects are grouped in different clusters. Clustering methods are widely utilized in different fields, such as image processing, computer vision, and pattern recognition. Fuzzy c-means clustering (FCM) is one of the most well-known fuzzy clustering methods. It is based on solving an optimization problem in which a given cost function is minimized. This minimization aims to decrease the dissimilarity inside clusters, where dissimilarity is measured by the distances between data objects and cluster centers. The degree of belonging of a data point to a cluster is measured by a membership function whose values lie in the interval [0, 1]. In FCM clustering, the membership degrees are constrained by the condition that the sum of a data object's memberships in all clusters must equal one. This constraint can cause several problems, especially when the data objects lie in a noisy space. Regularization approaches have been applied to the fuzzy c-means clustering technique; this process introduces additional information in order to solve an ill-posed optimization problem. In this study, we focus on regularization by a relative entropy approach, where in our optimization problem we aim to minimize the dissimilarity inside clusters. Finding an appropriate membership degree for each data object is our objective, because an appropriate membership degree leads to an accurate clustering result. Our clustering results on synthetic data sets, Gaussian-based data sets, and real world data sets show that our proposed model achieves good accuracy.
Keywords: clustering, fuzzy c-means, regularization, relative entropy
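To make the optimization concrete, here is a minimal sketch of an entropy-regularized fuzzy c-means iteration (a common variant in the literature; the paper's relative-entropy penalty may differ in its exact form, and the function name, parameters, and initialization below are our own illustration):

```python
import numpy as np

def entropy_fcm(X, init_centers, gamma=1.0, n_iter=50):
    """Entropy-regularized fuzzy c-means.

    Minimizes  sum_{i,k} u_ik * ||x_i - c_k||^2 + gamma * sum u_ik * log(u_ik)
    subject to sum_k u_ik = 1, which yields a closed-form softmax-style
    membership update instead of FCM's usual inverse-distance weights.
    """
    centers = np.asarray(init_centers, dtype=float)
    for _ in range(n_iter):
        # squared distances, shape (n_samples, n_clusters)
        d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=-1)
        # stabilized softmax of -d2/gamma: the closed-form membership update
        d2 = d2 - d2.min(axis=1, keepdims=True)
        u = np.exp(-d2 / gamma)
        u /= u.sum(axis=1, keepdims=True)
        # membership-weighted centroid update
        centers = (u.T @ X) / u.sum(axis=0)[:, None]
    return u, centers

# two well-separated toy blobs
X = np.array([[0.0, 0.0], [0.0, 1.0], [1.0, 0.0], [1.0, 1.0],
              [8.0, 8.0], [8.0, 9.0], [9.0, 8.0], [9.0, 9.0]])
u, centers = entropy_fcm(X, init_centers=[[0.5, 0.5], [8.5, 8.5]])
```

Here γ plays the role of the regularization weight: small γ drives the memberships toward hard assignments, while large γ yields fuzzier partitions.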
Procedia PDF Downloads 258
5446 X-Ray Diffraction, Microstructure, and Mössbauer Studies of Nanostructured Materials Obtained by High-Energy Ball Milling
Authors: N. Boudinar, A. Djekoun, A. Otmani, B. Bouzabata, J. M. Greneche
Abstract:
High-energy ball milling is a solid-state powder processing technique that allows synthesizing a variety of equilibrium and non-equilibrium alloy phases starting from elemental powders. The advantage of this process technology is that the powder can be produced in large quantities and the processing parameters can be easily controlled; thus, it is a suitable method for commercial applications. It can also be used to produce amorphous and nanocrystalline materials in commercially relevant amounts and is amenable to the production of a variety of alloy compositions. Mechanical alloying (high-energy ball milling) provides an inter-dispersion of elements through repeated cold welding and fracture of free powder particles; the grain size decreases to the nanometric scale, and the elements mix together. Progressively, the concentration gradients disappear, and eventually the elements are mixed at the atomic scale. The end products depend on many parameters, such as the milling conditions and the thermodynamic properties of the milled system. Here, the mechanical alloying technique has been used to prepare nanocrystalline Fe-50 and Fe-64 wt.% Ni alloys from powder mixtures. Scanning electron microscopy (SEM) with energy-dispersive X-ray analysis and Mössbauer spectroscopy were used to study the mixing at the nanometric scale. Mössbauer spectroscopy confirmed the ferromagnetic ordering and was used to calculate the distribution of the hyperfine field. The Mössbauer spectra of both alloys show the existence of a ferromagnetic phase attributed to a γ-Fe-Ni solid solution.
Keywords: nanocrystalline, mechanical alloying, X-ray diffraction, Mössbauer spectroscopy, phase transformations
Procedia PDF Downloads 436
5445 The Real Ambassador: How Hip Hop Culture Connects and Educates across Borders
Authors: Frederick Gooding
Abstract:
This paper explores how many Hip Hop artists have intentionally and strategically invoked the sustainability principles of people, planet, and profits as a means to create community and to compensate for and cope with structural inequalities in society. These themes not only create community within one's country; the powerful display and demonstration of these narratives create community on a global plane. Listeners of Hip Hop are therefore able to learn about the political events occurring in another country free of censure and establish solidarity worldwide. Hip Hop can therefore be an ingenious tool to create self-worth, recycle positive imagery, and serve as a defense mechanism against institutional and structural forces that conspire to make an upward economic and social trajectory difficult, if not impossible, for many people of color all across the world. Although the birthplace of Hip Hop, the United States of America, is still predominantly White, it has undoubtedly grown more diverse at a breathtaking pace in recent decades. Yet whether American mainstream media will fully reflect America’s newfound diversity remains to be seen. As it stands, American mainstream media is seen and enjoyed by diverse audiences not just in America but all over the world. Thus, it is imperative that further inquiry be conducted into one of the fastest growing genres within one of the world’s largest and most influential media industries, generating upwards of $10 billion annually. More importantly, hip hop and its music and associated culture collectively represent a shared social experience of significant value. They are important tools used both to inform and to influence economic, social, and political identity. Conversely, principles of American exceptionalism often prioritize American political issues over those of others, thereby rendering a myopic political view within the mainstream.
This paper will therefore engage in an international contextualization of the global phenomenon of Hip Hop by exploring its creative genius and marketing appeal within the global context of information technology, political expression, and social change, in addition to taking a critical look at historically racialized imagery within mainstream media. Many artists the world over have been able to freely express themselves and connect with broader communities outside of their own borders, all through the sound practice of the craft of Hip Hop. An empirical understanding of political, social, and economic forces within the United States will serve as a bridge for identifying and analyzing transnational themes of commonality for typically marginalized or disaffected communities facing similar struggles for survival and respect. The sharing of commonalities among marginalized cultures not only serves as a source of education outside of typically myopic mainstream sources; it also creates transnational bonds globally, to the extent that practicing artists resonate with many of the original themes of (now mostly underground) Hip Hop, as with many of the African American artists responsible for creating and fostering Hip Hop's powerful outlet of expression. Hip Hop's power of connectivity and culture-sharing transnationally across borders provides a key source of education to be taken seriously by academics.
Keywords: culture, education, global, hip hop, mainstream music, transnational
Procedia PDF Downloads 100
5444 Multi-Stage Classification for Lung Lesion Detection on CT Scan Images Applying Medical Image Processing Technique
Authors: Behnaz Sohani, Sahand Shahalinezhad, Amir Rahmani, Aliyu Aliyu
Abstract:
Recently, medical imaging, and specifically medical image processing, has become one of the most dynamically developing areas of medical science. It has led to the emergence of new approaches to the prevention, diagnosis, and treatment of various diseases. In the process of diagnosing lung cancer, medical professionals rely on computed tomography (CT) scans, in which failure to correctly identify masses can lead to incorrect diagnosis or sampling of lung tissue. Identification and demarcation of masses for detecting cancer within lung tissue are critical challenges in diagnosis. In this work, a segmentation system based on image processing techniques has been applied for detection purposes. In particular, the use and validation of a novel lung cancer detection algorithm have been presented through simulation, performed on CT images using multilevel thresholding. The proposed technique consists of segmentation, feature extraction, and feature selection and classification. More in detail, the features with useful information are selected after feature extraction. Eventually, the output image of lung cancer is obtained with 96.3% accuracy and 87.25%. The purpose of feature extraction in the proposed approach is to transform the raw data into a more usable form for subsequent statistical processing. Future steps will involve employing the current feature extraction method to achieve more accurate resulting images, including further details available to machine vision systems to recognise objects in lung CT scan images.
Keywords: lung cancer detection, image segmentation, lung computed tomography (CT) images, medical image processing
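The multilevel thresholding step that drives the segmentation can be sketched with an exhaustive multi-level Otsu search, which picks the thresholds maximizing the between-class variance of the intensity histogram (an illustrative implementation of the general technique, not the authors' code; the bin and class counts below are our assumptions):

```python
import numpy as np
from itertools import combinations

def multilevel_otsu(image, classes=3, bins=64):
    """Exhaustive multi-level Otsu thresholding.

    Maximizing sum_k w_k * mu_k^2 over all threshold placements is
    equivalent to maximizing the between-class variance, since the
    global mean of the histogram is constant.
    """
    hist, edges = np.histogram(image, bins=bins)
    p = hist / hist.sum()                 # normalized histogram
    mids = (edges[:-1] + edges[1:]) / 2   # bin-center intensities
    best_score, best_thresholds = -1.0, None
    for cuts in combinations(range(1, bins), classes - 1):
        bounds = (0,) + cuts + (bins,)
        score = 0.0
        for lo, hi in zip(bounds[:-1], bounds[1:]):
            w = p[lo:hi].sum()            # class probability mass
            if w > 0:
                mu = (p[lo:hi] * mids[lo:hi]).sum() / w
                score += w * mu * mu
        if score > best_score:
            best_score = score
            best_thresholds = [mids[c] for c in cuts]
    return best_thresholds
```

Pixels are then labeled by comparing each intensity against the returned thresholds; under the simplifying assumption of a well-separated histogram, the darkest class roughly tracks air, the middle class parenchyma, and the brightest class denser masses.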
Procedia PDF Downloads 99
5443 Investigating the Prevalence of HCV from Laboratory Centers in Tehran City - Iran by Electrochemiluminescence (ECL) and PCR Techniques
Authors: Zahra Rakhshan Masoudi, Sona Rostampour Yasouri
Abstract:
Blood transfusion is often the only way to save the lives of patients and of healthy people who have suffered sudden accidents, and what matters is the presence of HCV, the most important cause of disease after blood transfusion. HCV is one of the major global health problems, and its transmission through blood causes life-threatening complications and extensive legal, social, and economic consequences. Unfortunately, there is still no effective vaccine available to prevent HCV. In Iran, exact statistics on the prevalence of this disease have not yet been fully announced. The main purpose of this study is to investigate the prevalence rate and enable rapid diagnosis of HCV among those referred to laboratory centers in Tehran. From spring to winter of 1401 (2022-2023), 2166 blood samples were collected from laboratory centers in Tehran. The blood samples were evaluated for the presence of HCV by Electrochemiluminescence (ECL) and PCR techniques along with specific HCV primers. In total, 36 samples (1.6%) tested positive by the mentioned techniques. The results indicated that the ECL technique is a sensitive and specific diagnostic method for detecting HCV in the early stages of the disease; it can be very helpful, making it possible to start treatment earlier and prevent exacerbation of the disease. The results of the PCR technique also showed that PCR is an accurate, sensitive, and fast method for definitive diagnosis of HCV. The incidence of this disease appears to be increasing in Iran, and investigating its spread throughout Iran over a longer period, as a continuation of this research, would help in taking the necessary measures to prevent transmission to people and in starting treatment promptly for patients with HCV.
Keywords: electrochemiluminescence, HCV, PCR, prevalence
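The reported point prevalence follows directly from the counts above (36 positives out of 2166 samples); a short sketch that reproduces it and adds a Wilson 95% confidence interval (the interval is our illustrative addition, not a figure reported in the abstract):

```python
import math

def prevalence_wilson_ci(positives, n, z=1.96):
    """Point prevalence plus a Wilson score 95% confidence interval."""
    p = positives / n
    denom = 1 + z * z / n
    centre = (p + z * z / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z * z / (4 * n * n)) / denom
    return p, centre - half, centre + half

p, lo, hi = prevalence_wilson_ci(36, 2166)
print(f"{100 * p:.2f}% (95% CI {100 * lo:.2f}-{100 * hi:.2f}%)")  # 1.66%, rounded to 1.6% in the abstract
```

The Wilson interval behaves better than the naive normal approximation at low prevalence like this, which is why it is a common choice for seroprevalence estimates.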
Procedia PDF Downloads 67
5442 Use of Cold In-Place Asphalt Mixtures Technique in Road Maintenance in Egypt
Authors: Mohammed Mamdouh Mohammed Hussein, Ali Zain Elabdeen Heikal, Hassan Abdel Zaher Hassan Mahdy, Sherif Masoud Ahmed El Badawy
Abstract:
The main purpose of this research is to assess the effectiveness of the Cold In-Place Recycling (CIR) technique in asphalt maintenance by analyzing performance outcomes. To achieve this, fifteen CIR mixtures were prepared using slow-setting emulsified asphalt as the recycling agent, with percentages ranging from 2% to 4% in 0.5% increments. Additionally, pure water was incorporated in percentages ranging from 2% to 4% in 1% increments, and Portland cement was added at a constant content of 1%. The components were mixed at room temperature and subsequently compacted using a gyratory compactor with 150 gyrations. Prior to testing, the samples underwent a two-stage treatment process: initially, they were placed in an oven at 60°C for 48 hours, followed by a 24-hour period of air curing. The Hamburg wheel tracking test was performed to evaluate the samples’ resistance to rutting. Additionally, the Indirect Tensile Strength (ITS) test and the Semi-Circular Beam (SCB) test were conducted to assess their resistance to cracking. Upon analyzing the test results, it was observed that the samples’ resistance to rutting decreased with higher asphalt and moisture content. In contrast, ITS and SCB tests revealed that the samples’ resistance to cracking initially increased with higher asphalt and moisture content, peaking at a certain point, and then decreased, forming a bell-curve pattern.
Keywords: cold in-place, indirect tensile strength, recycling, emulsified asphalt, semi-circular beam
Procedia PDF Downloads 7
5441 Precoding-Assisted Frequency Division Multiple Access Transmission Scheme: A Cyclic Prefixes-Available Modulation-Based Filter Bank Multi-Carrier Technique
Authors: Ying Wang, Jianhong Xiang, Yu Zhong
Abstract:
The Offset Quadrature Amplitude Modulation-based Filter Bank Multi-Carrier (FBMC) system provides superior spectral properties over Orthogonal Frequency Division Multiplexing. However, because it is seriously affected by imaginary interference, its performance is hampered in many areas. In this paper, we propose a Precoding-Assisted Frequency Division Multiple Access (PA-FDMA) modulation scheme. By spreading FBMC symbols into the frequency domain and transmitting them with a precoding matrix, the impact of imaginary interference can be eliminated. Specifically, we first generate the coding pre-solution matrix with a nonuniform Fast Fourier Transform and pick the best columns by introducing auxiliary factors. Secondly, according to the column indexes, we obtain the precoding matrix for one symbol and impose scaling factors to ensure that the power is approximately constant throughout the transmission time. Finally, we map the precoding matrix of one symbol to multiple symbols and transmit multiple data frames, thus achieving frequency-division multiple access. Additionally, observing the interference between adjacent frames, we mitigate it by adding frequency-domain Cyclic Prefixes (CP) and evaluate it with a signal-to-interference ratio. Note that PA-FDMA can be considered a CP-available FBMC technique because the underlying strategy is FBMC. Simulation results show that the proposed scheme has better performance than Single Carrier Frequency Division Multiple Access (SC-FDMA) and similar schemes.
Keywords: PA-FDMA, SC-FDMA, FBMC, non-uniform fast fourier transform
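The frequency-domain spreading idea can be illustrated with the simpler DFT-spread precoding used by SC-FDMA, the baseline the scheme is compared against (a toy sketch only; the actual PA-FDMA precoder is built from a nonuniform FFT with auxiliary-factor column selection and scaling, which is not reproduced here):

```python
import numpy as np

def dft_spread_precode(symbols, n_subcarriers):
    """Spread a symbol block over the frequency domain (DFT precoding),
    map it onto a localized set of subcarriers, and return the
    time-domain transmit signal."""
    m = len(symbols)
    spread = np.fft.fft(symbols) / np.sqrt(m)      # DFT-spread the block
    grid = np.zeros(n_subcarriers, dtype=complex)  # full subcarrier grid
    grid[:m] = spread                              # localized mapping
    return np.fft.ifft(grid) * np.sqrt(n_subcarriers)

def dft_spread_recover(tx, m):
    """Invert the mapping at the receiver (ideal, noiseless channel)."""
    grid = np.fft.fft(tx) / np.sqrt(len(tx))
    return np.fft.ifft(grid[:m]) * np.sqrt(m)
```

On an ideal channel the round trip is exact, which is a quick sanity check that the precoder and its inverse match; PA-FDMA replaces the plain DFT here with its NUFFT-derived precoding matrix.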
Procedia PDF Downloads 62