Search results for: computational methods
15656 Auditing of Building Information Modeling Application in Decoration Engineering Projects in China
Authors: Lan Luo
Abstract:
In China’s construction industry, it is normal practice to subcontract the decoration engineering part separately from the construction engineering, and Building Information Modeling (BIM) is likewise done separately. Application of BIM in decoration engineering should be integrated with other disciplines, but current Chinese practice makes this very difficult and complicated. Currently, there are three barriers to the auditing of BIM application in decoration engineering in China: heavy workload, scarcity of qualified professionals, and a lack of literature concerning audit contents, standards, and methods. It is therefore important to investigate what (contents) should be evaluated, in which phase, and by whom (professional qualifications) in BIM application in decoration construction, so that the application of BIM can be better promoted. Based on this consideration, four principles of BIM auditing are proposed: comprehensiveness of information, accuracy of data, aesthetic attractiveness of appearance, and scheme optimization. In the model audit, three methods should be used: collision, observation, and contrast. In addition, BIM auditing at six stages is discussed, and a checklist of work items and results to be submitted is proposed. This checklist can be used as a reference by decoration project participants.
Keywords: audit, evaluation, dimensions, methods, standards, BIM application in decoration engineering projects
Procedia PDF Downloads 343
15655 Experimental Analysis for the Inlet of the Brazilian Aerospace Vehicle 14-X B
Authors: João F. A. Martos, Felipe J. Costa, Sergio N. P. Laiton, Bruno C. Lima, Israel S. Rêgo, Paulo P. G. Toro
Abstract:
Nowadays, the scramjet is a topic that has attracted the attention of several scientific communities (USA, Australia, Germany, France, Japan, India, China, Russia), which have invested in this type of propulsion system due to its potential to facilitate access to space and reach hypersonic speeds. The Brazilian hypersonic scramjet aerospace vehicle 14-X B is a technological demonstrator of a hypersonic airbreathing propulsion system based on supersonic combustion (scramjet), intended to be tested in flight in the Earth's atmosphere at 30 km altitude and Mach number 7. The 14-X B has been designed at the Prof. Henry T. Nagamatsu Laboratory of Aerothermodynamics and Hypersonics of the Institute for Advanced Studies (IEAv) in Brazil. The IEAv Hypersonic Shock Tunnel, named T3, is a ground-test facility able to reproduce flight conditions, such as Mach number as well as pressure and temperature, in the test section close to those encountered during the test flight of the vehicle 14-X B at design conditions. A 1-m long stainless steel 14-X B model was experimentally investigated in the T3 Hypersonic Shock Tunnel at freestream Mach number 7. Static pressure measurements along the lower surface of the 14-X B model, along with high-speed schlieren photographs of the 5.5° leading edge and the 14.5° deflection compression ramp, provided experimental data that were compared to analytical-theoretical solutions and computational fluid dynamics (CFD) simulations. The results show good qualitative agreement, demonstrating the importance of these methods in the design of the 14-X B hypersonic aerospace vehicle.
Keywords: 14-X, CFD, hypersonic, hypersonic shock tunnel, scramjet
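The analytical-theoretical solutions referred to above are, for the ramp surfaces, the classical oblique-shock relations. As an illustrative sketch (not the authors' code), the weak-solution shock angle and static pressure jump for the quoted 5.5° leading edge at freestream Mach 7 can be estimated from the θ-β-M relation:

```python
import math

def theta_from_beta(beta, mach, gamma=1.4):
    """theta-beta-M relation: flow deflection angle for a given oblique shock angle."""
    num = 2.0 / math.tan(beta) * (mach**2 * math.sin(beta)**2 - 1.0)
    den = mach**2 * (gamma + math.cos(2.0 * beta)) + 2.0
    return math.atan(num / den)

def weak_shock_angle(theta, mach, gamma=1.4):
    """Find the weak-solution shock angle by marching from the Mach angle, then bisecting."""
    lo = math.asin(1.0 / mach) + 1e-6   # Mach angle, where theta -> 0
    hi = lo
    step = math.radians(0.1)
    while theta_from_beta(hi + step, mach, gamma) < theta:  # march until bracketed
        hi += step
    hi += step
    for _ in range(60):                  # bisection on the monotone branch
        mid = 0.5 * (lo + hi)
        if theta_from_beta(mid, mach, gamma) < theta:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Conditions quoted in the abstract: Mach 7 freestream, 5.5 deg leading edge
beta = weak_shock_angle(math.radians(5.5), 7.0)
m1n = 7.0 * math.sin(beta)                         # normal Mach number ahead of shock
p_ratio = 1.0 + 2.0 * 1.4 / 2.4 * (m1n**2 - 1.0)   # static pressure jump p2/p1
print(f"shock angle = {math.degrees(beta):.2f} deg, p2/p1 = {p_ratio:.2f}")
```

For these conditions the weak solution lies near β of roughly 12°, i.e., a shock lying close to the body, which is the kind of feature the schlieren photographs in such tests visualize.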
Procedia PDF Downloads 360
15654 Improving Efficiencies of Planting Configurations on Draft Environment of Town Square: The Case Study of Taichung City Hall in Taichung, Taiwan
Authors: Yu-Wen Huang, Yi-Cheng Chiang
Abstract:
With urban development, many buildings are built around the city, and they affect the urban wind environment. The wind acceleration caused by buildings often makes pedestrians uncomfortable and can even cause accidents and dangers. Factors influencing pedestrian-level wind include the atmospheric boundary layer, wind direction, wind velocity, planting, building volume, geometric shape of the buildings, and adjacent interference effects. Planting has many functions, including mitigating and slowing the urban heat island effect, creating a good visual landscape, increasing urban green area, and improving pedestrian-level wind. On the other hand, the urban square is an important space element supporting the entrance to buildings, city landmarks, and activity gatherings. The appropriateness of the urban square environment usually determines its success. This research focuses on the effect of tree planting on the wind environment of an urban square, taking the square belt of Taichung City Hall as a case study. Taichung City Hall is a cuboid building with a large mass opening. The square belt connects the front square, the central opening, and the back square, and wind drafts often occur along it; this phenomenon decreases the activities on the squares. This research applies tree planting to improve the wind environment and evaluates the effects of two types of planting configuration. Computational Fluid Dynamics (CFD) simulation analysis and extensive field measurements are applied to explore the improvement efficiency of planting configurations on the wind environment. This research compares the efficiencies of different planting configurations, including the clustering array configuration and the dispersion configuration, and evaluates them by the SET*.
Keywords: micro-climate, wind environment, planting configuration, comfortableness, computational fluid dynamics (CFD)
Procedia PDF Downloads 311
15653 Parallel Multisplitting Methods for DAE’s
Authors: Ahmed Machmoum, Malika El Kyal
Abstract:
We consider an iterative parallel multi-splitting method for differential algebraic equations. The main feature of the proposed idea is the use of the asynchronous form. We prove that the multi-splitting technique can effectively accelerate the convergence of the iterative process. The main characteristic of an asynchronous mode is that the local algorithm does not have to wait at predetermined points for messages to become available. We allow some processors to communicate more frequently than others, and we allow the communication delays to be substantial and unpredictable. Note that synchronous algorithms, in the computer science sense, are particular cases of our formulation of asynchronous ones.
Keywords: computer, multi-splitting methods, asynchronous mode, differential algebraic systems
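The asynchronous idea described above can be sketched minimally as follows, assuming a small diagonally dominant linear system in place of a DAE and emulating asynchrony by delayed publication of iterates between two "processors" (the system, ownership split, and delay pattern are invented for illustration):

```python
# Asynchronous block-Jacobi multisplitting sketch (pure Python, illustrative only).
# Two "processors" each own half of the unknowns of a diagonally dominant system
# A x = b and exchange iterates only intermittently, emulating communication delay.

A = [[4.0, -1.0, 0.0, 0.0],
     [-1.0, 4.0, -1.0, 0.0],
     [0.0, -1.0, 4.0, -1.0],
     [0.0, 0.0, -1.0, 4.0]]
b = [3.0, 2.0, 2.0, 3.0]          # exact solution is x = (1, 1, 1, 1)

owned = {0: (0, 1), 1: (2, 3)}    # unknowns owned by each processor
x_seen = [[0.0] * 4, [0.0] * 4]   # each processor's (possibly stale) view

for sweep in range(60):
    for p in (0, 1):
        view = x_seen[p]
        for i in owned[p]:        # Jacobi-type local update on owned unknowns
            s = sum(A[i][j] * view[j] for j in range(4) if j != i)
            view[i] = (b[i] - s) / A[i][i]
    # processor 1 publishes to processor 0 only every 3rd sweep (stale data between)
    if sweep % 3 == 0:
        for i in owned[1]:
            x_seen[0][i] = x_seen[1][i]
    for i in owned[0]:            # processor 0 publishes every sweep
        x_seen[1][i] = x_seen[0][i]

x = [x_seen[0][0], x_seen[0][1], x_seen[1][2], x_seen[1][3]]
print([round(v, 6) for v in x])   # close to [1, 1, 1, 1]
```

Despite the stale exchanges, the iteration still converges because the delays are bounded and the local updates are contractive, which is the essence of the asynchronous convergence argument.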
Procedia PDF Downloads 549
15652 Improving the Penalty-free Multi-objective Evolutionary Design Optimization of Water Distribution Systems
Authors: Emily Kambalame
Abstract:
Water distribution networks require large investments for construction, prompting researchers to seek cost reduction and efficient design solutions. Optimization techniques are employed to address these challenges. In this context, the penalty-free multi-objective evolutionary algorithm (PFMOEA) coupled with pressure-dependent analysis (PDA) was utilized to develop a multi-objective evolutionary search for the optimization of water distribution systems (WDSs). The aim of this research was to find out whether the computational efficiency of the PFMOEA for WDS optimization could be enhanced. This was done by applying real coding representation and retaining different percentages of feasible and infeasible solutions close to the Pareto front in the elitism step of the optimization. Two benchmark network problems, namely the Two-looped and Hanoi networks, were utilized in the study. A comparative analysis was then conducted to assess the performance of the real-coded PFMOEA in relation to other approaches described in the literature. The algorithm demonstrated competitive performance for the two benchmark networks by implementing real coding. The real-coded PFMOEA achieved the best-known solutions ($419,000 and $6.081 million) and a zero pressure deficit for the two networks, requiring fewer function evaluations than the binary-coded PFMOEA. In previous PFMOEA studies, elitism applied a default retention of 30% of the least-cost feasible solutions while excluding all infeasible solutions. It was found in this study that by replacing 10% and 15% of the feasible solutions with infeasible ones that are close to the Pareto front with minimal pressure deficit violations, the computational efficiency of the PFMOEA was significantly enhanced.
The configuration of 15% feasible and 15% infeasible solutions outperformed other retention allocations by identifying the optimal solution with the fewest function evaluations.
Keywords: design optimization, multi-objective evolutionary, penalty-free, water distribution systems
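The modified elitism step described above can be sketched as follows; the tuple encoding, function names, and sample population are assumptions made for illustration, not the authors' implementation:

```python
# Illustrative sketch of the modified elitism step: retain a share of the cheapest
# feasible designs plus a share of near-feasible ones (small pressure deficit).

def elitism(population, frac_feasible=0.15, frac_infeasible=0.15):
    """population: list of (cost, pressure_deficit) tuples; deficit 0 => feasible."""
    n = len(population)
    feasible = sorted((p for p in population if p[1] == 0), key=lambda p: p[0])
    # near-Pareto infeasible designs: smallest deficit first, then cheapest
    infeasible = sorted((p for p in population if p[1] > 0),
                        key=lambda p: (p[1], p[0]))
    elites = feasible[:max(1, int(frac_feasible * n))]
    elites += infeasible[:max(1, int(frac_infeasible * n))]
    return elites

pop = [(419_000, 0), (455_000, 0), (430_000, 0), (398_000, 2.5),
       (405_000, 0.4), (520_000, 0), (401_000, 1.1), (470_000, 0),
       (525_000, 5.0), (600_000, 0)]
print(elitism(pop))  # cheapest feasible design plus the least-violating infeasible one
```

The intent is that infeasible designs with only marginal pressure-deficit violations keep useful genetic material near the Pareto front, which is what the study credits for the reduced number of function evaluations.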
Procedia PDF Downloads 63
15651 Effect of Brewing on the Bioactive Compounds of Coffee
Authors: Ceyda Dadali, Yeşim Elmaci
Abstract:
Coffee was introduced as an economic crop during the fifteenth century; nowadays it is among the most important food commodities, ranking second in traded value after crude oil. Desirable sensory properties make coffee one of the most often consumed and most popular beverages in the world. The coffee preparation method has a significant effect on the flavor and composition of coffee brews. Three different extraction methodologies, namely decoction, infusion, and pressure methods, have been used for coffee brew preparation. Each of these methods is related to a specific granulation (coffee grind) of the coffee powder, water-coffee ratio, temperature, and brewing time. Coffee is a mixture of 1500 chemical compounds. The chemical composition of coffee highly depends on the brewing method, coffee bean species, and roasting time and temperature. Coffee contains a number of very important bioactive compounds, such as diterpenes (cafestol and kahweol), alkaloids (caffeine, theobromine, and trigonelline), melanoidins, and phenolic compounds. The phenolic compounds of coffee include chlorogenic acids (quinyl esters of hydroxycinnamic acids) and caffeic, ferulic, and p-coumaric acids. In coffee, caffeoylquinic acids, feruloylquinic acids, and di-caffeoylquinic acids are the three main groups of chlorogenic acids, constituting 6%-10% of the dry weight of coffee. The bioavailability of chlorogenic acids in coffee depends on their absorption and metabolization to biomarkers in individuals. The interaction of coffee polyphenols with other compounds, such as dietary proteins, also affects the biomarkers. Since the bioactive composition of coffee depends on the brewing method, the effect of the brewing method on the bioactive compounds of coffee is discussed in this study.
Keywords: bioactive compounds of coffee, biomarkers, coffee brew, effect of brewing
Procedia PDF Downloads 196
15650 Modelling of Heat Transfer during Controlled Cooling of Thermo-Mechanically Treated Rebars Using Computational Fluid Dynamics Approach
Authors: Rohit Agarwal, Mrityunjay K. Singh, Soma Ghosh, Ramesh Shankar, Biswajit Ghosh, Vinay V. Mahashabde
Abstract:
Thermo-mechanical treatment (TMT) of rebars is a critical process to impart sufficient strength and ductility to the rebar. TMT rebars are produced by the Tempcore process, which involves an 'in-line' heat treatment in which the hot rolled bar (at around 1080°C) is passed through water boxes, where it is quenched under high-pressure water jets (at around 25°C). The quenching rate dictates a composite structure consisting of four non-homogeneously distributed phases of the rebar microstructure: pearlite, ferrite, bainite, and tempered martensite (from core to rim). The ferrite and pearlite phases present at the core give the rebar ductility, while the martensitic rim provides the appropriate strength. The TMT process is difficult to model, as it involves a multitude of complex physics, such as heat transfer, highly turbulent fluid flow, and multicomponent and multiphase flow in the control volume. Additionally, the presence of a film boiling regime (above the Leidenfrost point) due to steam formation adds complexity to the domain. A coupled heat transfer and fluid flow model based on computational fluid dynamics (CFD) has been developed at the product technology division of Tata Steel, India, which efficiently predicts the temperature profile and percentage martensite rim thickness of the rebar during the quenching process. The model has been validated against 16 mm rolling at the New Bar Mill (NBM) plant of Tata Steel Limited, India. Furthermore, based on scenario analyses, an optimal configuration of nozzles was found, which helped in a subsequent increase in rolling speed.
Keywords: boiling, critical heat flux, nozzles, thermo-mechanical treatment
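The steep core-to-rim temperature gradient that produces the martensitic rim can be illustrated with a much simpler model than the coupled CFD described above. The sketch below is a 1D explicit finite-difference conduction calculation with a convective surface; the material data, heat transfer coefficient, and slab idealization are rough assumptions, and the film-boiling physics of the real process is deliberately omitted:

```python
# Minimal 1D explicit finite-difference sketch of quenching: conduction through a
# steel slab of 8 mm half-thickness with a symmetric core and a convective
# water-jet surface. All property values are assumed round numbers.

k, rho, cp = 30.0, 7850.0, 500.0          # conductivity, density, heat capacity
alpha = k / (rho * cp)                    # thermal diffusivity, ~7.6e-6 m^2/s
h, t_water, t0 = 20_000.0, 25.0, 1080.0   # jet heat transfer coeff., temperatures
dx, dt, n = 0.001, 0.02, 9                # 1 mm grid over the 8 mm half-thickness

T = [t0] * n
for step in range(50):                    # simulate 1 s of quenching
    Tn = T[:]
    Tn[0] = T[0] + alpha * dt / dx**2 * 2.0 * (T[1] - T[0])       # symmetry at core
    for i in range(1, n - 1):
        Tn[i] = T[i] + alpha * dt / dx**2 * (T[i-1] - 2*T[i] + T[i+1])
    # surface node: conduction from inside plus convection to the water jet
    Tn[-1] = T[-1] + alpha * dt / dx**2 * 2.0 * (T[-2] - T[-1]) \
             + 2.0 * h * dt / (rho * cp * dx) * (t_water - T[-1])
    T = Tn

print(f"core {T[0]:.0f} C, surface {T[-1]:.0f} C")  # steep core-to-rim gradient
```

After one second the surface has dropped by several hundred degrees while the core is barely affected, which is the mechanism behind the tempered-martensite rim over a ferrite-pearlite core.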
Procedia PDF Downloads 217
15649 A Review on Application of Waste Tire in Concrete
Authors: M. A. Yazdi, J. Yang, L. Yihui, H. Su
Abstract:
The application of recycled waste tires in civil engineering practice, namely in asphalt paving mixtures and cement-based materials, has been gaining ground across the world. This review summarizes and compares recent achievements in the area of plain rubberized concrete (PRC) in detail. Different treatment methods are discussed for improving the performance of rubberized Portland cement concrete. The review also includes the effects of the size and amount of tire rubber on the mechanical and durability properties of PRC. The microstructural behaviour of rubberized concrete is also detailed.
Keywords: waste rubber aggregates, microstructure, treatment methods, size and content effects
Procedia PDF Downloads 335
15648 Optical Whitening of Textiles: Teaching and Learning Materials
Authors: C. W. Kan
Abstract:
This study examines the results of the optical whitening process on different textiles, such as cotton, wool, and polyester. The optical whitening agents used are commercially available products, and they were applied to the textiles following the manufacturers’ suggested methods. The aim of this study is to illustrate the proper application methods of optical whitening agents for different textiles and hence to provide a guidance note for students learning this topic. Acknowledgment: The authors would like to thank the Hong Kong Polytechnic University for its financial support of this work.
Keywords: learning materials, optical whitening agent, wool, cotton, polyester
Procedia PDF Downloads 427
15647 Linguistic Cyberbullying, a Legislative Approach
Authors: Simona Maria Ignat
Abstract:
Bullying online has been an increasingly studied topic in recent years. Different approaches, psychological, linguistic, or computational, have been applied. To the best of our knowledge, a definition and a set of characteristics of the phenomenon agreed upon internationally as a common framework are still lacking. Thus, the objectives of this paper are the identification of bullying utterances on Twitter and of their algorithms. This research is focused on the identification of words or groups of words, categorized as "utterances", with bullying effect, from the Twitter platform, extracted according to a set of legislative criteria. This set is the result of analysis, followed by synthesis, of law documents on (online) bullying from the United States of America, the European Union, and Ireland. The outcome is a linguistic corpus with approximately 10,000 entries. The methods applied to the first objective were the following. Discourse analysis was applied to identify keywords with bullying effect in texts from the Google search engine (Images link). Transcription and anonymization were applied to texts grouped in CL1 (Corpus Linguistics 1). The keyword search method and the legislative criteria were used to identify bullying utterances on Twitter. Texts with at least 30 representations on Twitter were grouped; they form the second corpus, Bullying Utterances from Twitter (CL2). The entries were identified by applying the legislative criteria on the principle of the BoW (bag-of-words) method, a method of extracting words or groups of words with the same meaning in any context. The method applied to the second objective was the conversion of parts of speech to alphabetical and numerical symbols and the writing of the bullying utterances as algorithms. The converted form of the parts of speech was chosen on the criterion of relevance within the bullying message.
The inductive reasoning approach was applied in sampling and identifying the algorithms. The results are groups with interchangeable elements. The outcomes convey two aspects of bullying: the form, and the content or meaning. The form conveys the intentional intimidation against somebody, expressed at the level of texts by grammatical and lexical marks. This outcome has applicability in forensic linguistics for establishing the intentionality of an action. Another outcome concerning form is a complex of graphemic variations essential in detecting harmful texts online. This research enriches the lexicon already known on the topic. The second aspect, the content, revealed topics like threat, harassment, assault, or suicide. They are subcategories of a broader harmful content which is a constant concern for task forces and legislators at national and international levels. These topic outcomes of the dataset are a valuable source of detection. The analysis of content revealed algorithms and lexicons which could be applied to other harmful contents. A third outcome of content concerns stylistics, which is a rich source for discourse analysis of social media platforms. In conclusion, this linguistic corpus is structured on legislative criteria and could be used in various fields.
Keywords: corpus linguistics, cyberbullying, legislation, natural language processing, twitter
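A minimal bag-of-words sketch of the kind used to group utterances that share meaning across contexts is shown below. The tiny lexicon, threshold, and example sentences are invented for illustration; the study's actual corpus (CL2) was built from legislative criteria:

```python
# Bag-of-words flagging sketch: count occurrences of lexicon words in a text and
# flag the text when the count reaches a threshold. Illustrative only.

from collections import Counter
import re

bullying_lexicon = {"loser", "ugly", "stupid", "nobody", "hate"}

def bow(text):
    """Lowercase, tokenize, and count word occurrences."""
    return Counter(re.findall(r"[a-z']+", text.lower()))

def flag(text, threshold=2):
    """Flag an utterance when at least `threshold` lexicon-word occurrences appear."""
    counts = bow(text)
    hits = sum(counts[w] for w in bullying_lexicon)
    return hits >= threshold

print(flag("you are such a loser, everyone hates you, stupid"))  # True
print(flag("great game last night, see you tomorrow"))           # False
```

Real detection needs far more than exact-word matching (inflections such as "hates", graphemic variations, context), which is exactly the gap the corpus of algorithmic utterance patterns described above aims to fill.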
Procedia PDF Downloads 86
15646 A Critical Reflection of Ableist Methodologies: Approaching Interviews and Go-Along Interviews
Authors: Hana Porkertová, Pavel Doboš
Abstract:
Based on a research project studying the experience of visually disabled people with urban space in the Czech Republic, this conference contribution discusses the limits of social-science methodologies used in sociology and human geography. It draws on actor-network theory, assuming that science does not describe reality but produces it. Methodology connects theory, research questions, the ways to answer them (methods), and results. A research design utilizing ableist methodologies can produce ableist realities. Therefore, it was necessary to adjust the methods so that they could mediate blind experience to the scientific community without reproducing ableism. The researchers faced multiple challenges, ranging from questionable validity to the question of how to research an experience that differs from that of the able-bodied researchers. Finding a suitable theory that could serve as an analytical tool demonstrating space and blind experience as multiple, dynamic, and mutually constructed was the first step; it could offer a range of potentially productive methods and research questions, as well as bring critically reflected results. Poststructural theory, mainly Deleuze-Guattarian philosophy, was chosen, and two methods were used: interviews and go-along interviews, which had to be adjusted to be able to explore blind experience. Despite thorough preparation of these methods, new difficulties kept emerging, which exposed the ableist character of scientific knowledge. From the beginning of data collection, there was an agreement to work in teams with slightly different roles for each researcher, which was significant especially during the go-along interviews. In some cases, the anticipations of the researchers and participants differed, which led to unexpected and potentially dangerous situations. These were caused not only by the differences between scientific and lay communities but also by those between able-bodied and disabled people.
Researchers were sometimes assigned the role of assistants, and this new position, doing research together, required further negotiations, which also opened various ethical questions.
Keywords: ableist methodology, blind experience, go-along interviews, research ethics, scientific knowledge
Procedia PDF Downloads 166
15645 Integrating Natural Language Processing (NLP) and Machine Learning in Lung Cancer Diagnosis
Authors: Mehrnaz Mostafavi
Abstract:
The assessment and categorization of incidental lung nodules present a considerable challenge in healthcare, often necessitating resource-intensive multiple computed tomography (CT) scans for growth confirmation. This research addresses this issue by introducing a distinct computational approach leveraging radiomics and deep-learning methods. However, understanding local services is essential before implementing these advancements. With diverse tracking methods in place, there is a need for efficient and accurate identification approaches, especially in the context of managing lung nodules alongside pre-existing cancer scenarios. This study explores the integration of text-based algorithms in medical data curation, indicating their efficacy in conjunction with machine learning and deep-learning models for identifying lung nodules. Combining medical images with text data has demonstrated superior data retrieval compared to using each modality independently. While deep learning and text analysis show potential in detecting previously missed nodules, challenges persist, such as increased false positives. The presented research introduces a Structured-Query-Language (SQL) algorithm designed for identifying pulmonary nodules in a tertiary cancer center, externally validated at another hospital. Leveraging natural language processing (NLP) and machine learning, the algorithm categorizes lung nodule reports based on sentence features, aiming to facilitate research and assess clinical pathways. The hypothesis posits that the algorithm can accurately identify lung nodule CT scans and predict concerning nodule features using machine-learning classifiers. Through a retrospective observational study spanning a decade, CT scan reports were collected, and an algorithm was developed to extract and classify data. Results underscore the complexity of lung nodule cohorts in cancer centers, emphasizing the importance of careful evaluation before assuming a metastatic origin. 
The SQL and NLP algorithms demonstrated high accuracy in identifying lung nodule sentences, indicating potential for local service evaluation and research dataset creation. Machine-learning models exhibited strong accuracy in predicting concerning changes in lung nodule scan reports. While limitations include variability in disease group attribution, the potential for correlation rather than causality in clinical findings, and the need for further external validation, the algorithm's accuracy and its potential to support clinical decision-making and healthcare automation represent a significant stride in lung nodule management and research.
Keywords: lung cancer diagnosis, structured query language (SQL), natural language processing (NLP), machine learning, CT scans
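The sentence-level triage described above can be sketched in miniature: extract sentences mentioning nodules from report text and flag concerning features by pattern matching. The real pipeline used SQL extraction plus trained machine-learning classifiers; the patterns and the example report below are invented for illustration:

```python
# Rule-based sketch of sentence-level triage of a radiology report: keep
# sentences mentioning nodules, then flag those with "concerning" wording.

import re

CONCERN = re.compile(r"\b(enlarg\w*|increas\w*|spiculat\w*|growth)\b", re.I)
NODULE = re.compile(r"\bnodule(s)?\b", re.I)

def triage(report):
    """Return (sentences mentioning nodules, the subset with concerning features)."""
    sentences = re.split(r"(?<=[.!?])\s+", report)
    nodule_sents = [s for s in sentences if NODULE.search(s)]
    concerning = [s for s in nodule_sents if CONCERN.search(s)]
    return nodule_sents, concerning

report = ("Heart size is normal. A 6 mm nodule in the right upper lobe has "
          "increased in size compared with the prior CT. No pleural effusion.")
found, flagged = triage(report)
print(len(found), len(flagged))  # 1 1
```

In the study, features like these feed a classifier rather than hard rules, which is what allows the reported accuracy on variably worded reports.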
Procedia PDF Downloads 103
15644 Simulation of Bird Strike on Airplane Wings by Using SPH Methodology
Authors: Tuğçe Kiper Elibol, İbrahim Uslan, Mehmet Ali Guler, Murat Buyuk, Uğur Yolum
Abstract:
According to the FAA report, 142,603 bird strikes were reported over a period of 24 years, between 1990 and 2013. Bird strikes on aerospace structures not only threaten flight safety but also cause financial loss and put lives in danger. The statistics show that most bird strikes happen on the nose and the leading edge of the wings. A substantial number of bird strikes are also absorbed by the jet engines, causing damage to the blades and engine body. Crash-proof designs are required to overcome the possibility of catastrophic failure of the airplane. Using computational methods for bird strike analysis during the product development phase has considerable importance in terms of cost saving. Clearly, using simulation techniques to reduce the number of reference tests can dramatically affect the total cost of an aircraft, since for bird strikes full-scale tests are often considered. Therefore, the development of validated numerical models that can replace preliminary tests and accelerate the design cycle is required. In this study, to verify the simulation parameters for a bird strike analysis, several different numerical options are studied for an impact case against a primitive structure. Then, a representative bird model is generated with the verified parameters and collided against the leading edge of a training aircraft wing, where each structural member of the wing is explicitly modeled. A nonlinear explicit dynamics finite element code, LS-DYNA, was used for the bird impact simulations. The SPH methodology was used to model the behavior of the bird. The dynamic behavior of the wing superstructure was observed and will be used for further design optimization purposes.
Keywords: bird impact, bird strike, finite element modeling, smoothed particle hydrodynamics
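At the core of the SPH methodology used above is a compactly supported smoothing kernel that weights neighboring particles. The sketch below evaluates the standard cubic B-spline kernel; this is the common textbook choice, not necessarily the formulation used by the solver in the study, and the smoothing length is arbitrary:

```python
# Cubic B-spline smoothing kernel W(r, h) commonly used in SPH, with dimensional
# normalization constants; compactly supported on r < 2h.

import math

def cubic_spline_w(r, h, dim=3):
    """Kernel value at separation r for smoothing length h."""
    sigma = {1: 2.0 / 3.0, 2: 10.0 / (7.0 * math.pi), 3: 1.0 / math.pi}[dim]
    q = r / h
    if q < 1.0:
        val = 1.0 - 1.5 * q**2 + 0.75 * q**3
    elif q < 2.0:
        val = 0.25 * (2.0 - q) ** 3
    else:
        val = 0.0                       # outside the support radius
    return sigma / h**dim * val

h = 0.01                                # smoothing length, 10 mm (demo value)
print(cubic_spline_w(0.0, h))           # peak value at the particle position
print(cubic_spline_w(0.025, h))         # 0.0: outside the 2h support
```

Because the kernel support is finite, each particle interacts only with nearby neighbors, which is what makes SPH well suited to the large deformations and fragmentation of a bird impact.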
Procedia PDF Downloads 328
15643 CFD Study for Normal and Rifled Tube with a Convergence Check
Authors: Sharfi Dirar, Shihab Elhaj, Ahmed El Fatih
Abstract:
Computational fluid dynamics was used to simulate and study a heated water boiler tube, for both a normal and a rifled tube, with refinement of the mesh to check convergence. The operating conditions were taken from the GARRI power station and applied as boundary conditions accordingly. The results indicate that the rifled tube has higher heat transfer efficiency than the normal tube.
Keywords: boiler tube, convergence check, normal tube, rifled tube
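A mesh convergence check of the kind referred to above is commonly formalized with Richardson extrapolation and the grid convergence index (GCI) from solutions on three systematically refined meshes. The sketch below shows the standard calculation with invented sample values; this is the textbook procedure, not necessarily the authors' exact method:

```python
# Richardson extrapolation and fine-grid GCI from three mesh solutions
# (coarse, medium, fine), each refined by ratio r; fs is a safety factor.

import math

def gci(f_coarse, f_medium, f_fine, r=2.0, fs=1.25):
    """Return observed order p, extrapolated value, and fine-grid GCI."""
    p = math.log(abs((f_coarse - f_medium) / (f_medium - f_fine))) / math.log(r)
    f_exact = f_fine + (f_fine - f_medium) / (r**p - 1.0)   # Richardson estimate
    e_fine = abs((f_medium - f_fine) / f_fine)              # relative difference
    return p, f_exact, fs * e_fine / (r**p - 1.0)

# e.g. a heat transfer coefficient from three meshes, each refined by a factor 2
p, f_ext, gci_fine = gci(1040.0, 1010.0, 1000.0)
print(f"order ~ {p:.2f}, extrapolated ~ {f_ext:.1f}, GCI ~ {100*gci_fine:.2f} %")
```

A small GCI on the fine mesh indicates that further refinement would change the monitored quantity only marginally, which is the criterion behind a convergence check.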
Procedia PDF Downloads 335
15642 Enhancing Scalability in Ethereum Network Analysis: Methods and Techniques
Authors: Stefan K. Behfar
Abstract:
The rapid growth of the Ethereum network has brought forth the urgent need for scalable analysis methods to handle the increasing volume of blockchain data. In this research, we propose efficient methodologies for making Ethereum network analysis scalable. Our approach leverages a combination of graph-based data representation, probabilistic sampling, and parallel processing techniques to achieve unprecedented scalability while preserving critical network insights. Data Representation: We develop a graph-based data representation that captures the underlying structure of the Ethereum network. Each block transaction is represented as a node in the graph, while the edges signify temporal relationships. This representation ensures efficient querying and traversal of the blockchain data. Probabilistic Sampling: To cope with the vastness of the Ethereum blockchain, we introduce a probabilistic sampling technique. This method strategically selects a representative subset of transactions and blocks, allowing for concise yet statistically significant analysis. The sampling approach maintains the integrity of the network properties while significantly reducing the computational burden. Graph Convolutional Networks (GCNs): We incorporate GCNs to process the graph-based data representation efficiently. The GCN architecture enables the extraction of complex spatial and temporal patterns from the sampled data. This combination of graph representation and GCNs facilitates parallel processing and scalable analysis. Distributed Computing: To further enhance scalability, we adopt distributed computing frameworks such as Apache Hadoop and Apache Spark. By distributing computation across multiple nodes, we achieve a significant reduction in processing time and enhanced memory utilization. Our methodology harnesses the power of parallelism, making it well-suited for large-scale Ethereum network analysis. 
Evaluation and Results: We extensively evaluate our methodology on real-world Ethereum datasets covering diverse time periods and transaction volumes. The results demonstrate its superior scalability, outperforming traditional analysis methods. Our approach successfully handles the ever-growing Ethereum data, empowering researchers and developers with actionable insights from the blockchain. Case Studies: We apply our methodology to real-world Ethereum use cases, including detecting transaction patterns, analyzing smart contract interactions, and predicting network congestion. The results showcase the accuracy and efficiency of our approach, emphasizing its practical applicability in real-world scenarios. Security and Robustness: To ensure the reliability of our methodology, we conduct thorough security and robustness evaluations. Our approach demonstrates high resilience against adversarial attacks and perturbations, reaffirming its suitability for security-critical blockchain applications. Conclusion: By integrating graph-based data representation, GCNs, probabilistic sampling, and distributed computing, we achieve network scalability without compromising analytical precision. This approach addresses the pressing challenges posed by the expanding Ethereum network, opening new avenues for research and enabling real-time insights into decentralized ecosystems. Our work contributes to the development of scalable blockchain analytics, laying the foundation for sustainable growth and advancement in the domain of blockchain research and application.
Keywords: Ethereum, scalable network, GCN, probabilistic sampling, distributed computing
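The graph representation and probabilistic sampling described above can be sketched as follows; the toy transaction list, the uniform 1% sampling rate, and the degree statistic are invented stand-ins for the paper's actual blockchain data and stratified sampling scheme:

```python
# Sketch of the described representation: transactions as nodes, temporal edges
# between consecutive transactions, and a random subsample for scalable analysis.

import random

transactions = [f"tx{i}" for i in range(10_000)]        # stand-in for block data
edges = list(zip(transactions, transactions[1:]))       # temporal relationships

random.seed(42)                                         # reproducible sample
sample = random.sample(transactions, k=len(transactions) // 100)  # 1% sample

# degree of each sampled node in the temporal chain graph
adjacency = {}
for a, b in edges:
    adjacency.setdefault(a, set()).add(b)
    adjacency.setdefault(b, set()).add(a)
degrees = [len(adjacency[tx]) for tx in sample]
print(len(sample), min(degrees), max(degrees))
```

Statistics computed on the sample (degree distributions, motif counts, and so on) approximate those of the full graph at a fraction of the cost, which is the point of the sampling step before the GCN and distributed-computing stages.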
Procedia PDF Downloads 78
15641 Structural Behavior of Subsoil Depending on Constitutive Model in Calculation Model of Pavement Structure-Subsoil System
Authors: M. Kadela
Abstract:
The load caused by traffic movement should be transferred through the road construction in a harmless way to the subsoil, as follows: onto the stiff upper layers of the structure (e.g., the asphalt layers: wearing and binding courses), then through the layers of the principal and secondary substructure, and onto the subsoil, directly or through an improved subsoil layer. A reliable description of the interaction proceeding in the system "road construction – subsoil" should therefore be one of the basic requirements for assessing the internal forces of the structure and its durability. Analyses of road constructions are based on elements of mechanics, which allow computational models to be created, and on results of experiments included in the criteria of fatigue life analyses. This approach is a fundamental feature of the commonly used mechanistic methods, which allow arbitrarily complex numerical computational models to be used in evaluations of the fatigue life of structures. Considering the work of the system "road construction – subsoil", it is commonly accepted that, as a result of repetitive loads on the subsoil under the pavement, a relatively small deformation grows in the initial phase; this increase then disappears, and the deformation becomes completely reversible. The reliability of a calculation model is tied to the appropriate use (for a given type of analysis) of constitutive relationships. Phenomena occurring in the initial stage of the system "road construction – subsoil" are unfortunately difficult to interpret in the modeling process. The classic interpretation of the behavior of the material in the elastic-plastic model (e-p) is that the elastic phase of the work (e) passes into the phase (e-p) with increasing load (or with growth of deformation in the damaged structure).
The paper presents the essence of the calibration process of the cooperating subsystem in the calculation model of the system "road construction – subsoil", created for mechanistic analysis. The calibration process was directed at showing the impact of the applied constitutive models on its deformation and stress response. The proper comparative base for assessing the reliability of the created models should be, however, the actual, monitored system "road construction – subsoil". The paper also presents the behavior of subsoil under cyclic load transmitted by the pavement layers. The response of the subsoil to cyclic load is recorded in situ by an observation system (sensors) installed on a testing ground prepared for this purpose, forming part of a test road near Katowice, Poland. A different behavior of the homogeneous subsoil under the pavement is observed for different seasons of the year: the pavement construction works as a flexible structure in summer and as a rigid plate in winter. Although the observed character of the subsoil response is the same regardless of the applied load and area values, this response can be divided into: a zone of indirect action of the applied load, extending to a depth of 1.0 m under the pavement, and a zone of small strain, extending to about 2.0 m. Acknowledgment: This work was supported by the on-going research project "Stabilization of weak soil by application of layer of foamed concrete used in contact with subsoil" (LIDER/022/537/L-4/NCBR/2013) financed by The National Centre for Research and Development within the LIDER Programme. M. Kadela is with the Department of Building Construction Elements and Building Structures on Mining Areas, Building Research Institute, Silesian Branch, Katowice, Poland (phone: +48 32 730 29 47; fax: +48 32 730 25 22; e-mail: m.kadela@itb.pl).
Keywords: road structure, constitutive model, calculation model, pavement, soil, FEA, response of soil, monitored system
Procedia PDF Downloads 357
15640 Design and Computational Fluid Dynamics Analysis of Aerodynamic Package of a Formula Student Car
Authors: Aniketh Ravukutam, Rajath Rao M., Pradyumna S. A.
Abstract:
In the past few decades there has been great advancement in the use of aerodynamics in cars, from commercial cars to race cars, for achieving higher speeds, stability, and efficiency. This paper focuses on studying the effects of aerodynamics in a Formula Student car. These cars weigh around 200 kg with an average speed of 60 km/h. With increasing competition every year, developing a competitive car is a herculean task. The race track comprises mostly tight corners and little or no straights, thus testing the car’s cornering capabilities. Higher cornering speeds can be achieved by increasing traction at the tires, and studying the aerodynamics helps achieve higher traction without much addition to the overall weight of the car. The main focus is to develop an aerodynamic package involving front wing, undertray, and body to obtain an optimum value of downforce. The initial process involves a detailed study of the geometrical constraints mentioned in the rule book and calculation of the limiting value of drag as per the engine specifications. The subsequent steps involve iterations in ANSYS for selection of airfoils, deciding the number of elements, designing the nose for low drag, channelizing the flow under the body, and obtaining an optimum value of downforce within the limits defined in the initial process. The final step involves modeling these results in the virtual environment OptimumLap® for a detailed study of performance with and without aerodynamics. The CFD analysis showed an overall downforce of 377.44 N with a drag of 164.08 N. The corresponding parameters of the final model were applied in OptimumLap®, and an improvement of 3.5 seconds in lap times was observed. Keywords: aerodynamics, formula student, traction, front wing, undertray, body, rule book, drag, downforce, virtual environment, computational fluid dynamics (CFD)
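The reported force values lend themselves to a quick back-of-envelope check using the standard relation F = ½ρv²(C·A). A sketch assuming sea-level air density and taking the stated 60 km/h average as the reference speed (an assumption, since the CFD reference speed is not quoted in the abstract):

```python
RHO = 1.225  # air density, kg/m^3 (sea-level standard; assumed)

def force_coefficient_area(force_n, speed_ms, rho=RHO):
    """Solve F = 0.5 * rho * v^2 * (C*A) for the coefficient-area product."""
    return 2.0 * force_n / (rho * speed_ms ** 2)

downforce, drag = 377.44, 164.08   # N, from the reported CFD results
v = 60 / 3.6                       # 60 km/h in m/s (assumed reference speed)

efficiency = downforce / drag      # aerodynamic efficiency, ~2.3
cla = force_coefficient_area(downforce, v)   # downforce coefficient * area
cda = force_coefficient_area(drag, v)        # drag coefficient * area
```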
Procedia PDF Downloads 241
15639 Disease Characteristics of Neurofibromatosis Type II and Cochlear Implantation
Authors: Boxiang Zhuang
Abstract:
This study analyzes the clinical manifestations, hearing rehabilitation methods and outcomes of a complex case of neurofibromatosis type II (NF2). Methods: The clinical manifestations, medical history, clinical data, surgical methods and postoperative hearing rehabilitation outcomes of an NF2 patient were analyzed to determine the hearing reconstruction method and postoperative effect for a special type of NF2 acoustic neuroma. Results: The patient had bilateral acoustic neuromas with profound sensorineural hearing loss in both ears. Peripheral blood genetic testing did not reveal pathogenic gene mutations, suggesting mosaicism. The patient had an intracochlear schwannoma in the right ear and severely impaired vision in both eyes. Cochlear implantation with tumor retention was performed in the right ear. After 2 months of family-based auditory and speech rehabilitation, the Categories of Auditory Performance (CAP) score improved from 0 to 5. Conclusion: NF2 has complex clinical manifestations and poor prognosis. For NF2 patients with intracochlear tumors, cochlear implantation with tumor retention can be used to reconstruct hearing.Keywords: NF2, intracochlear schwannoma, hearing reconstruction, cochlear implantation
Procedia PDF Downloads 14
15638 Refined Edge Detection Network
Authors: Omar Elharrouss, Youssef Hmamouche, Assia Kamal Idrissi, Btissam El Khamlichi, Amal El Fallah-Seghrouchni
Abstract:
Edge detection is one of the most challenging tasks in computer vision, due to the complexity of detecting edges or boundaries in real-world images that contain objects of different types and scales, such as trees and buildings, as well as varied backgrounds. It is also a key task for many computer vision applications. Using a set of backbones as well as attention modules, deep-learning-based methods have improved the detection of edges compared with traditional methods like Sobel and Canny. However, images of complex scenes still represent a challenge for these methods, and the edges detected by existing approaches suffer from non-refined results, with the output images containing many erroneous edges. To overcome this, in this paper, using the mechanism of residual learning, a refined edge detection network (RED-Net) is proposed. By maintaining the high resolution of edges during the training process and conserving the resolution of the edge image throughout the network stages, we connect the pooling outputs at each stage with the output of the previous layer. Also, after each layer, we use an affine batch normalization layer as an erosion operation for the homogeneous regions in the image. The proposed method is evaluated on the most challenging datasets, including BSDS500, NYUD, and Multicue. The obtained results outperform existing edge detection networks in terms of performance metrics and quality of output images. Keywords: edge detection, convolutional neural networks, deep learning, scale-representation, backbone
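For contrast with the deep-learning approach, the Sobel operator named above as a traditional baseline can be sketched in a few lines: convolve the image with two 3×3 kernels and take the gradient magnitude (pure-Python illustration on a list-of-lists grayscale image):

```python
# Classic Sobel edge detection: horizontal and vertical gradient kernels,
# edge-replicating border handling, and gradient-magnitude output.
KX = [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]
KY = [[-1, -2, -1], [0, 0, 0], [1, 2, 1]]

def sobel_edges(img):
    h, w = len(img), len(img[0])
    get = lambda i, j: img[min(max(i, 0), h - 1)][min(max(j, 0), w - 1)]
    out = [[0.0] * w for _ in range(h)]
    for i in range(h):
        for j in range(w):
            gx = gy = 0.0
            for di in (-1, 0, 1):
                for dj in (-1, 0, 1):
                    v = get(i + di, j + dj)
                    gx += KX[di + 1][dj + 1] * v
                    gy += KY[di + 1][dj + 1] * v
            out[i][j] = (gx * gx + gy * gy) ** 0.5
    return out

# A vertical step edge yields strong responses along the boundary columns
img = [[0] * 4 + [1] * 4 for _ in range(8)]
edges = sobel_edges(img)
```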
Procedia PDF Downloads 103
15637 The Impact of Ultrasonic Field to Increase the Biodegradability of Leachate from The Landfill
Authors: Kwarciak-Kozlowska A., Slawik-Dembiczak L., Galwa-Widera M.
Abstract:
The complex composition of landfill leachate, which varies during the operation of a landfill, prevents the use of a single universal purification method. Due to the presence of poorly biodegradable substances in this wastewater, its cleaning often requires biological methods (activated sludge or anaerobic digestion), frequently supported by physicochemical processes. Currently, more attention is being paid to the development of unconventional methods of wastewater treatment, including advanced oxidation methods and, among them, the use of ultrasonic waves. It was assumed that ultrasonic waves induce changes in the structure of organic compounds and accelerate the biodegradability of refractory substances in the leachate, thereby increasing the effectiveness of their treatment in biological processes. A marked increase in the BOD of the leachate was observed when it was subjected to the ultrasonic field: the BOD/COD ratio was 27% higher compared to non-sonicated leachate. It was found that the sonication process clearly influenced the formation and release of aliphatic compounds. These changes suggest an alteration of the chemical structure of organic compounds in the leachate, yielding compounds with a chemical structure more susceptible to biodegradation. Keywords: IR spectra, landfill leachate, organic pollutants, ultrasound
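The BOD/COD ratio used here as the biodegradability measure is a simple quotient; a sketch with illustrative concentrations (the abstract reports only the 27% relative increase, not absolute values):

```python
# Sketch: the BOD5/COD ratio as a biodegradability index for leachate.
# Concentrations below are illustrative placeholders, not study data.

def biodegradability_index(bod5, cod):
    """BOD5/COD ratio; values above ~0.3 are often read as readily biodegradable."""
    if cod <= 0:
        raise ValueError("COD must be positive")
    return bod5 / cod

raw = biodegradability_index(bod5=300.0, cod=1500.0)  # 0.20 (illustrative)
sonicated = raw * 1.27                                # 27 % higher ratio, as reported
```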
Procedia PDF Downloads 429
15636 Determination of Metalaxyl Efficacy in Controlling Phytophthora palmivora Infection of Durian Using Bioassay
Authors: Supawadee Phetkhajone, Wisuwat Songnuan
Abstract:
Metalaxyl is one of the most common and effective fungicides used to control Phytophthora palmivora infection in durian (Durio zibethinus L.). The efficacy of metalaxyl residue in durian under greenhouse conditions was evaluated using a bioassay. Durian seedlings were treated with two methods of application, spraying and soil drenching, at the recommended concentration (1000 mg/L). Mock-treated samples received 0.1% Tween20 and water for the spraying and soil drenching methods, respectively. The experiment was performed in triplicate. Leaves were detached from treated plants at 0, 1, 7, 15, 20, 30, and 60 days after application, inoculated with metalaxyl-resistant and metalaxyl-sensitive isolates of P. palmivora, and incubated in a high-humidity chamber for 5 days at room temperature. Metalaxyl efficacy was determined by measuring the lesion size on metalaxyl-treated and mock-treated samples. The results showed that metalaxyl can control the metalaxyl-sensitive isolate of P. palmivora for at least 30 days after application with both methods; the metalaxyl-resistant isolate was not inhibited in any treatment. Leaf samples from the spraying method showed larger lesions compared to the soil drenching method. These results demonstrate that metalaxyl application, especially by soil drenching, has high efficacy against metalaxyl-sensitive isolates of P. palmivora, although it cannot control metalaxyl-resistant isolates. These qualitative data indicate that metalaxyl may be suitable for controlling infection by metalaxyl-sensitive isolates of P. palmivora. Keywords: bioassay, degradation, durian, metalaxyl
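A common way to express efficacy in a detached-leaf bioassay of this kind is percent inhibition of lesion growth relative to the mock-treated control; a sketch with hypothetical lesion sizes (not data from the study):

```python
# Sketch: fungicide efficacy as percent inhibition of lesion size
# relative to the mock-treated control. Sizes are illustrative.

def percent_inhibition(lesion_treated, lesion_control):
    if lesion_control <= 0:
        raise ValueError("control lesion size must be positive")
    return 100.0 * (lesion_control - lesion_treated) / lesion_control

# e.g. a 4 mm lesion on a treated leaf vs 20 mm on the mock-treated leaf
assert percent_inhibition(4.0, 20.0) == 80.0
```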
Procedia PDF Downloads 126
15635 The Anatomy and Characteristics of Online Romance Scams
Authors: Danuvasin Charoen
Abstract:
Online romance scams are conducted by criminals using social networks and dating sites. These criminals use love to deceive victims into sending them money; the victims not only lose money to the criminals but are also heartbroken. This study investigates how online romance scams work and why people fall victim to them. The researcher also identifies the characteristics of the perpetrators and the victims. The data were collected from in-depth interviews with former victims and with police officers responsible for the cases. By studying the methods and characteristics of the online romance scam, we can develop effective methods and policies to reduce the rates of such crimes. Keywords: romance scam, online scam, phishing, cybercrime
Procedia PDF Downloads 159
15634 Cannabis Sativa L as Natural Source of Promising Anti-Alzheimer Drug Candidates: A Comprehensive Computational Approach Including Molecular Docking, Molecular Dynamics, Admet and MM-PBSA Studies
Authors: Hassan Nour, Nouh Mounadi, Oussama Abchir, Belaidi Salah, Samir Chtita
Abstract:
Cholinesterase enzymes are biological catalysts essential for the transformation of acetylcholine, a neurotransmitter implicated in memory and learning, into acetic acid and choline, altering the neurotransmission process in Alzheimer’s disease patients. Therefore, inhibition of cholinesterase enzymes is a relevant strategy for the symptomatic treatment of Alzheimer’s disease. The current investigation aims to explore potential cholinesterase (ChE) inhibitors through a comprehensive computational approach. Forty-nine phytoconstituents extracted from Cannabis sativa L. were screened in silico using molecular docking and pharmacokinetic and toxicological analysis to evaluate their possible inhibitory effect on the cholinesterase enzymes. Two phytoconstituents belonging to the cannabinoid family were revealed to be promising candidates for Alzheimer therapy by acting as cholinesterase inhibitors. They exhibited high binding affinities towards the cholinesterase enzymes and showed the ability to interact with key residues involved in cholinesterase enzymatic activity. In addition, they presented good ADMET profiles, allowing them to be promising oral drug candidates. Furthermore, molecular dynamics (MD) simulations were executed to explore the stability of their interactions under mimetic biological conditions and thus support our findings. To corroborate the docking results, the binding free energy corresponding to the more stable ligand-ChE complexes was re-estimated by applying the MM-PBSA method. The MD and MM-PBSA studies affirmed that ligand-ChE recognition is a spontaneous reaction leading to stable complexes. The conducted investigations have led to findings that should strongly guide the pharmaceutical industry towards the rational development of potent anti-Alzheimer agents. Keywords: Alzheimer’s disease, molecular docking, Cannabis sativa L., cholinesterase inhibitors
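The screening funnel described (docking affinity plus ADMET filtering) can be sketched as a simple rank-and-filter step; the compound names, scores, and the −8.0 kcal/mol cutoff below are hypothetical placeholders, not values from the study:

```python
# Illustrative rank-and-filter step of a virtual screening funnel:
# keep compounds with sufficiently strong predicted binding that also
# pass an ADMET flag, then sort from strongest to weakest binder.

CUTOFF = -8.0  # kcal/mol; more negative = stronger predicted binding (hypothetical)

candidates = [
    {"name": "compound_A", "docking_score": -9.4, "admet_ok": True},
    {"name": "compound_B", "docking_score": -8.6, "admet_ok": True},
    {"name": "compound_C", "docking_score": -9.1, "admet_ok": False},
    {"name": "compound_D", "docking_score": -6.2, "admet_ok": True},
]

def screen(cands, cutoff=CUTOFF):
    hits = [c for c in cands if c["docking_score"] <= cutoff and c["admet_ok"]]
    return sorted(hits, key=lambda c: c["docking_score"])

hits = screen(candidates)  # compound_A (strongest), then compound_B
```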
Procedia PDF Downloads 74
15633 Environmental Restoration Science in New York Harbor - Community Based Restoration Science Hubs, or “STEM Hubs”
Authors: Lauren B. Birney
Abstract:
The project utilizes the Billion Oyster Project (BOP-CCERS) place-based “restoration through education” model to promote computational thinking in NYC high school teachers and their students. Key learning standards such as Next Generation Science Standards and the NYC CS4All Equity and Excellence initiative are used to develop a computer science curriculum that connects students to their Harbor through hands-on activities based on BOP field science and educational programming. Project curriculum development is grounded in BOP-CCERS restoration science activities and data collection, which are enacted by students and educators at two Restoration Science STEM Hubs or conveyed through virtual materials. New York City Public School teachers with relevant experience are recruited as consultants to provide curriculum assessment and design feedback. The completed curriculum units are then conveyed to NYC high school teachers through professional learning events held at the Pace University campus and led by BOP educators. In addition, Pace University educators execute the Summer STEM Institute, an intensive two-week computational thinking camp centered on applying data analysis tools and methods to BOP-CCERS data. Both qualitative and quantitative analyses were performed throughout the five-year study. STEM+C – Community Based Restoration STEM Hubs. STEM Hubs are active scientific restoration sites capable of hosting school and community groups of all grade levels and professional scientists and researchers conducting long-term restoration ecology research. The STEM Hubs program has grown to include 14 STEM Hubs across all five boroughs of New York City and focuses on bringing in-field monitoring experience as well as coastal classroom experience to students. Restoration Science STEM Hubs activities resulted in: the recruitment of 11 public schools, 6 community groups, 12 teachers, and over 120 students receiving exposure to BOP activities. 
Field science protocols were designed exclusively around the use of the Oyster Restoration Station (ORS), a small-scale in situ experimental platform suspended from a dock or pier. The ORS is intended to be used and “owned” by an individual school, teacher, class, or group of students, whereas the STEM Hub is explicitly designed as a collaborative space for large-scale community-driven restoration work and in situ experiments. The ORS is also an essential tool in gathering Harbor data from disparate locations and instilling ownership of the research process among students, and it will continue to be used in that way. New and previously participating students will continue to deploy and monitor their own ORS, uploading data to the digital platform and conducting analysis of their own harbor-wide datasets. Programming the STEM Hub will necessitate establishing working relationships between schools and local research institutions. NYHF will provide introductions and the facilitation of initial workshops in school classrooms. However, once a particular STEM Hub has been established as a space for collaboration, each partner group, school, university, or CBO will schedule its own events at the site using the digital platform’s scheduling and registration tool. Monitoring of research collaborations will be accomplished through the platform’s research publication tool and has thus far provided valuable information on the project’s trajectory, strategic plan, and pathway. Keywords: environmental science, citizen science, STEM, technology
Procedia PDF Downloads 98
15632 DNA-Polycation Condensation by Coarse-Grained Molecular Dynamics
Authors: Titus A. Beu
Abstract:
Many modern gene-delivery protocols rely on condensed complexes of DNA with polycations to introduce the genetic payload into cells by endocytosis. In particular, polyethyleneimine (PEI) stands out for its high buffering capacity (enabling the efficient condensation of DNA) and relatively simple fabrication. Realistic computational studies can offer essential insights into the formation process of DNA-PEI polyplexes, providing hints for efficient designs and engineering routes. We present comprehensive computational investigations of solvated PEI and DNA-PEI polyplexes involving calculations at three levels: ab initio, all-atom (AA), and coarse-grained (CG) molecular mechanics. In the first stage, we developed a rigorous AA CHARMM (Chemistry at Harvard Macromolecular Mechanics) force field (FF) for PEI on the basis of accurate ab initio calculations on protonated model pentamers. We validated this atomistic FF by matching the results of extensive molecular dynamics (MD) simulations of structural and dynamical properties of PEI against experimental data. In the second stage, we developed a CG MARTINI FF for PEI by Boltzmann inversion techniques from bead-based probability distributions obtained from AA simulations, ensuring an optimal match between the AA and CG structural and dynamical properties. In the third stage, we combined the developed CG FF for PEI with the standard MARTINI FF for DNA and performed comprehensive CG simulations of DNA-PEI complex formation and condensation. Various technical aspects crucial for the realistic modeling of DNA-PEI polyplexes, such as the treatment of electrostatics and the relevance of polarizable water models, are discussed in detail. Massive CG simulations (with up to 500 000 beads) shed light on the mechanism of DNA polyplex formation and provide its time scales in dependence on PEI chain size and protonation pattern.
The DNA-PEI condensation mechanism is shown to rely primarily on the formation of DNA bundles, rather than on changes of the DNA-strand curvature. The gained insights are expected to be of significant help in designing effective gene-delivery applications. Keywords: DNA condensation, gene-delivery, polyethyleneimine, molecular dynamics
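The Boltzmann inversion step mentioned for deriving the CG force field maps a bead-level probability distribution to a potential of mean force via U(x) = −k_B·T·ln P(x); a sketch on a toy histogram (illustrative values, not study data):

```python
import math

# Sketch of direct Boltzmann inversion: a normalized histogram of a CG
# degree of freedom (e.g. a bead-bead distance) is inverted into a
# potential of mean force. The toy distribution below is illustrative.

KB = 0.0019872  # Boltzmann constant in kcal/(mol*K)

def boltzmann_invert(probabilities, temperature=300.0):
    """Map a normalized probability histogram to U(x) = -kB*T*ln P(x)."""
    return [-KB * temperature * math.log(p) for p in probabilities]

p = [0.1, 0.6, 0.3]      # toy histogram over three distance bins
u = boltzmann_invert(p)  # the most probable bin gets the lowest potential
```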
Procedia PDF Downloads 120
15631 A Combinatorial Approach of Treatment for Landfill Leachate
Authors: Anusha Atmakuri, R. D. Tyagi, Patrick Drogui
Abstract:
Landfilling is the most familiar and easiest way to dispose of solid waste. A landfill generally receives waste from the municipalities near it, collected from commercial, industrial, residential, and many other areas. Landfill leachate (LFL) is formed when rainwater passes through the waste placed in landfills; it contains several dissolved organic materials, for instance aquatic humic substances (AHS), volatile fatty acids (VFAs), heavy metals, inorganic macro-components, and xenobiotic organic matter, and is highly toxic to the environment. These components put a load on the leachate, necessitating its treatment prior to discharge into the environment. Various methods have been used to treat LFL over the years: physical, chemical, biological, physicochemical, electrical, and advanced oxidation methods. This study focuses on the combination of biological and electrochemical methods: extracellular polymeric substances (EPS) and electrocoagulation (EC). Coupling the electrocoagulation process with EPS (as flocculant) as a pre- and/or post-treatment strategy provides an efficient and economical process for the decontamination of landfill leachate contaminated with suspended matter, metals (e.g., Fe, Mn), and ammoniacal nitrogen. The electrocoagulation and EPS-mediated coagulation approach could thus be economically viable for the treatment of landfill leachate, along with possessing several other advantages over other methods. This study utilized waste substrates such as activated sludge, crude glycerol, and waste cooking oil for the production of EPS using fermentation technology. A comparison of different treatment scenarios is presented: EPS alone as bioflocculant, EPS followed by EC, and EC followed by EPS. The work establishes the use of crude EPS as a bioflocculant for the treatment of landfill leachate and of wastewater from a site near a landfill, with EC successfully removing major pollutants such as COD, turbidity, and total suspended solids. The combination of these two methods is to be explored further for the complete removal of all pollutants from landfill leachate. Keywords: landfill leachate, extracellular polymeric substances, electrocoagulation, bioflocculant
Procedia PDF Downloads 86
15630 CMMI Key Process Areas and FDD Practices
Authors: Rituraj Deka, Nomi Baruah
Abstract:
The development of information technology during the past few years has resulted in the design of more and more complex software, and the outsourcing of software development imposes higher requirements on the management of software development projects. Software enterprises follow various paths in their pursuit of excellence, applying various principles, methods, and techniques along the way. New research shows that CMMI and Agile methodologies can benefit from each other, and that using both within an organization has the potential to dramatically improve business performance. The paper describes a mapping between CMMI key process areas (KPAs) and Feature-Driven Development (FDD) from a communication perspective, so as to increase the understanding of how improvements can be made in the software development process. Keywords: Agile, CMMI, FDD, KPAs
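A KPA-to-practice mapping of the kind the paper proposes is naturally represented as a lookup table; the pairings below are hypothetical examples chosen to show the data structure, not the mapping established by the authors:

```python
# Illustrative KPA -> FDD-practice lookup table. The KPA names are
# standard CMMI process areas and the right-hand items are standard FDD
# elements, but these particular pairings are hypothetical examples.

KPA_TO_FDD = {
    "Requirements Management": ["Build a Features List"],
    "Project Planning": ["Plan by Feature"],
    "Project Monitoring and Control": ["Reporting/Visibility of Results"],
    "Product Integration": ["Regular Builds"],
}

def fdd_practices_for(kpa):
    """FDD practices mapped to a CMMI KPA (empty list if unmapped)."""
    return KPA_TO_FDD.get(kpa, [])
```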
Procedia PDF Downloads 459
15629 Historical Studies on Gilt Decorations on Glazed Surfaces
Authors: Sabra Saeidi
Abstract:
This research focuses on the historical techniques associated with the Lajevardina and Haft-Rangi tile production methods, with emphasis on identifying the techniques used to apply gold sheets to the surface of such historical glazed tiles. In this regard, the history of the production of enamel, gilded, and Lajevardina glazed pottery made during the Khwarazmshahid and Mongol eras (eleventh to thirteenth centuries) is first assessed to reach a better understanding of the background of historical glazing methods. After this historical overview of the production techniques of glazed pottery and an introduction to the civilizations using them, we focus on the production methods of enamel and Lajevardina glazing, two categories of decoration usually found in tiles. Next, a general classification method for the various types of gilt tiles is introduced, applicable to tile work up to the Safavid period (sixteenth to seventeenth centuries). The categories include gilt Lajevardina glazed tiles, gilt Haft-Rangi tiles, monolithic glazed gilt tiles, and gilt mosaic tiles. Keywords: gilt tiles, Islamic art, Iranian art, historical studies, gilding
Procedia PDF Downloads 123
15628 Graphical Modeling of High Dimension Processes with an Environmental Application
Authors: Ali S. Gargoum
Abstract:
Graphical modeling plays an important role in providing efficient probability calculations in high-dimensional problems (computational efficiency). In this paper, we address one such problem, discussing fragmenting puff models and some distributional assumptions concerning models for the instantaneous emission readings and for the fragmenting process. A graphical representation, in terms of a junction tree, of the conditional probability breakdown of puffs and puff fragments is proposed. Keywords: graphical models, influence diagrams, junction trees, Bayesian nets
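The computational efficiency claim can be illustrated on a toy chain X → Y → Z: factorizing the joint distribution lets a marginal be computed by local summations (message passing), rather than by enumerating the full joint table. The probabilities below are illustrative:

```python
# Toy chain X -> Y -> Z with binary variables. P(z) is obtained by
# passing a local "message" P(y) forward instead of building the
# 2x2x2 joint table, which is the essence of junction-tree efficiency.

P_X = {0: 0.7, 1: 0.3}
P_Y_GIVEN_X = {0: {0: 0.9, 1: 0.1}, 1: {0: 0.2, 1: 0.8}}
P_Z_GIVEN_Y = {0: {0: 0.6, 1: 0.4}, 1: {0: 0.1, 1: 0.9}}

def marginal_z(z):
    """P(z) = sum_y [ sum_x P(x) P(y|x) ] P(z|y)."""
    p_y = {y: sum(P_X[x] * P_Y_GIVEN_X[x][y] for x in P_X) for y in (0, 1)}
    return sum(p_y[y] * P_Z_GIVEN_Y[y][z] for y in (0, 1))

# The computed marginal is a proper distribution
assert abs(marginal_z(0) + marginal_z(1) - 1.0) < 1e-12
```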
Procedia PDF Downloads 396
15627 Evaluating Structural Crack Propagation Induced by Soundless Chemical Demolition Agent Using an Energy Release Rate Approach
Authors: Shyaka Eugene
Abstract:
The efficient and safe demolition of structures is a critical challenge in civil engineering and construction. This study focuses on the development of optimal demolition strategies by investigating the crack propagation behavior in beams induced by soundless cracking agents, which are commonly used in controlled demolition and have gained prominence due to their non-explosive and environmentally friendly nature. This research employs a comprehensive experimental and computational approach to analyze crack initiation, propagation, and eventual failure in beams subjected to soundless cracking agents. Experimental testing involves the application of various cracking agents under controlled conditions to understand their effects on the structural integrity of beams; high-resolution imaging and strain measurements are used to capture the crack propagation process. In parallel, numerical simulations are conducted using advanced finite element analysis (FEA) techniques to model crack propagation in beams, considering parameters such as cracking agent composition, loading conditions, and beam properties. The FEA models are validated against experimental results, ensuring their accuracy in predicting crack propagation patterns. The findings of this study provide valuable insights into optimizing demolition strategies, allowing engineers and demolition experts to make informed decisions regarding the selection of cracking agents, their application techniques, and structural reinforcement methods. Ultimately, this research contributes to enhancing the safety, efficiency, and sustainability of demolition practices in the construction industry, reducing environmental impact and ensuring the protection of adjacent structures and the surrounding environment. Keywords: expansion pressure, energy release rate, soundless chemical demolition agent, crack propagation
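The energy release rate approach named in the title rests, for mode-I cracks in linear elasticity, on the relation G = K_I²/E′ (E′ = E in plane stress, E′ = E/(1 − ν²) in plane strain), with crack growth predicted when G reaches the critical value G_c. A sketch with generic illustrative material values (not data from the study):

```python
# Sketch of the mode-I energy release rate criterion: G = K_I^2 / E',
# crack growth when G >= G_c. Material values below are generic
# illustrative numbers, not results from the study.

def energy_release_rate(k_i, e_modulus, nu=0.0, plane_strain=False):
    """G in J/m^2 for K_I in Pa*sqrt(m) and E in Pa."""
    e_eff = e_modulus / (1.0 - nu ** 2) if plane_strain else e_modulus
    return k_i ** 2 / e_eff

def crack_grows(k_i, e_modulus, g_c, **kw):
    """Griffith-type growth criterion: G >= G_c."""
    return energy_release_rate(k_i, e_modulus, **kw) >= g_c

# e.g. concrete-like values: K_I = 0.5 MPa*sqrt(m), E = 30 GPa
G = energy_release_rate(0.5e6, 30e9)  # roughly 8.3 J/m^2
```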
Procedia PDF Downloads 63