Search results for: computer processing of large databases
11215 Impact of ICT on Efficient Service Provision to Users by LIPs in NCR India
Authors: Mani Gupta
Abstract:
This study addresses two questions: i) whether ICT plays a positive role in improving the efficiency of LIPs in providing services to users in LICs, and ii) the role of finance in providing the technological logistics and infrastructure required for ICT-based services that ease users' access to databases in LICs. The study is based on primary data collected from various libraries and information centers of NCR Delhi. The survey was conducted between December 15 and 31, 2010 on 496 respondents across 96 libraries and information centers in NCR Delhi through an electronic data collection method. There is a positive and emphatic relationship between ICT and its effect on improving the level of efficient services provided by LIPs in LICs in NCR Delhi. The paper is divided into six sub-headings, followed by the outcomes.
Keywords: modern globalization, linear correlation, efficient service, internet revolution, logistics
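As an illustration of the linear-correlation analysis the abstract mentions, the following Python sketch computes a Pearson correlation between ICT-usage scores and user-rated service efficiency. All variable names and values are hypothetical placeholders, not the study's survey data.

```python
# Hypothetical sketch: Pearson correlation between ICT-usage scores and
# service-efficiency ratings; the sample values are illustrative only.
from scipy.stats import pearsonr

ict_usage = [3, 5, 4, 2, 5, 1, 4, 3, 5, 2]           # e.g., Likert-scale ICT adoption per library
service_efficiency = [3, 4, 4, 2, 5, 2, 4, 3, 4, 1]  # e.g., user-rated service efficiency

r, p_value = pearsonr(ict_usage, service_efficiency)
print(f"Pearson r = {r:.2f}, p = {p_value:.4f}")      # r near +1 indicates a strong positive relationship
```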
Procedia PDF Downloads 357
11214 Seismic Behavior of Concrete Filled Steel Tube Reinforced Concrete Column
Authors: Raghabendra Yadav, Baochun Chen, Huihui Yuan, Zhibin Lian
Abstract:
The pseudo-dynamic test (PDT) is an advanced seismic test method that combines loading technology with computer technology. Large-scale models or full-scale seismic tests can be carried out using this method. CFST-RC columns are used in civil engineering structures because of their better seismic performance. A CFST-RC column is composed of four CFST limbs which are connected with RC webs in the longitudinal direction and with steel tubes in the transverse direction. For this study, a CFST-RC pier is tested under four different earthquake time histories, each scaled to a PGA of 0.05 g. From the experiment, acceleration, velocity, displacement and load time histories are obtained. The dynamic magnification factors for acceleration due to the El Centro, Chi-Chi, Imperial Valley and Kobe ground motions are observed to be 15, 12, 17 and 14, respectively. The natural frequency of the pier is found to be 1.40 Hz. The results show that this type of pier has excellent static and earthquake-resistant properties.
Keywords: bridge pier, CFST-RC pier, pseudo-dynamic test, seismic performance, time history
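The dynamic magnification factor reported above is the ratio of peak response acceleration to the scaled input PGA. A minimal sketch of that arithmetic follows; the response time history is synthetic (a decaying 1.40 Hz sinusoid), standing in for the recorded PDT data.

```python
# Illustrative DMF computation: peak response acceleration / scaled PGA (0.05 g).
# The response array is synthetic, not the experiment's measured record.
import numpy as np

g = 9.81
pga = 0.05 * g                                    # scaled input PGA used in the test
t = np.linspace(0, 30, 3000)
response_accel = 0.7 * g * np.exp(-0.05 * t) * np.sin(2 * np.pi * 1.40 * t)  # synthetic 1.40 Hz response

dmf = np.max(np.abs(response_accel)) / pga
print(f"DMF = {dmf:.1f}")                         # values of 12-17 were observed for the four records
```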
Procedia PDF Downloads 185
11213 Thermal Analysis and Computational Fluid Dynamics Simulation of Large-Scale Cryopump
Authors: Yue Shuai Zhao, Rong Ping Shao, Wei Sun, Guo Hua Ren, Yong Wang, Li Chen Sun
Abstract:
A large-scale cryopump (DN1250) used in a large vacuum leak detecting system was designed and its performance experimentally investigated by the Beijing Institute of Spacecraft Environment Engineering. The cryopump was cooled by four closed-cycle helium refrigerators (two dual-stage refrigerators and two single-stage refrigerators). Detailed numerical analyses of the heat transfer in the first-stage array and the second-stage array were performed using computational fluid dynamics (CFD). Several design parameters were considered to find their effect on the temperature distribution and the cooldown time. The variation of thermal conductivity and heat capacity with temperature was taken into account. A thermal analysis method based on numerical techniques was introduced in this study, and the heat transfer in the first-stage array and the second-stage cryopanel was carefully analyzed to determine important considerations in the thermal design of the cryopump. A performance test system according to the PNEUROP standards was built to test the main performance parameters of the cryopump. The experimental results showed that the structure of the first-stage array optimized by this method could meet the requirements of the cryopump well. The temperature of the cryopanel dropped to 10 K within 300 min, and the experimental result was consistent with the conclusions of the theoretical analysis. The test also showed that the pumping speed of the pump for N2 was up to 57,000 L/s, and the crossover was greater than 300,000 Pa·L.
Keywords: cryopump, temperature distribution, thermal analysis, CFD simulation
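A back-of-envelope version of the cooldown estimate, with temperature-dependent heat capacity, can be sketched with a lumped-capacitance model; the detailed CFD analysis refines this kind of calculation spatially. The mass, cooling-power curve, and property fit below are assumptions, not the pump's design data.

```python
# Lumped-capacitance cooldown sketch with a temperature-dependent heat capacity.
# Mass, refrigerator capacity curve, and the cp fit are assumed, copper-like values.
def cp(T):                        # assumed heat capacity fit, J/(kg K)
    return 385.0 * min(T / 300.0, 1.0) ** 1.5

mass = 40.0                       # kg, assumed cryopanel mass
def cooling_power(T):             # W, assumed refrigerator capacity curve
    return 20.0 + 0.5 * (T - 10.0)

T, t, dt = 300.0, 0.0, 1.0        # start at room temperature, 1 s time steps
while T > 10.0:
    T -= cooling_power(T) * dt / (mass * cp(T))
    t += dt
print(f"Estimated cooldown time: {t / 60:.0f} min")  # experiment: panel reached 10 K within 300 min
```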
Procedia PDF Downloads 304
11212 Enhancing Scalability in Ethereum Network Analysis: Methods and Techniques
Authors: Stefan K. Behfar
Abstract:
The rapid growth of the Ethereum network has brought forth the urgent need for scalable analysis methods to handle the increasing volume of blockchain data. In this research, we propose efficient methodologies for making Ethereum network analysis scalable. Our approach leverages a combination of graph-based data representation, probabilistic sampling, and parallel processing techniques to achieve unprecedented scalability while preserving critical network insights.
Data Representation: We develop a graph-based data representation that captures the underlying structure of the Ethereum network. Each block transaction is represented as a node in the graph, while the edges signify temporal relationships. This representation ensures efficient querying and traversal of the blockchain data.
Probabilistic Sampling: To cope with the vastness of the Ethereum blockchain, we introduce a probabilistic sampling technique. This method strategically selects a representative subset of transactions and blocks, allowing for concise yet statistically significant analysis. The sampling approach maintains the integrity of the network properties while significantly reducing the computational burden.
Graph Convolutional Networks (GCNs): We incorporate GCNs to process the graph-based data representation efficiently. The GCN architecture enables the extraction of complex spatial and temporal patterns from the sampled data. This combination of graph representation and GCNs facilitates parallel processing and scalable analysis.
Distributed Computing: To further enhance scalability, we adopt distributed computing frameworks such as Apache Hadoop and Apache Spark. By distributing computation across multiple nodes, we achieve a significant reduction in processing time and enhanced memory utilization. Our methodology harnesses the power of parallelism, making it well-suited for large-scale Ethereum network analysis.
Evaluation and Results: We extensively evaluate our methodology on real-world Ethereum datasets covering diverse time periods and transaction volumes. The results demonstrate its superior scalability, outperforming traditional analysis methods. Our approach successfully handles the ever-growing Ethereum data, empowering researchers and developers with actionable insights from the blockchain.
Case Studies: We apply our methodology to real-world Ethereum use cases, including detecting transaction patterns, analyzing smart contract interactions, and predicting network congestion. The results showcase the accuracy and efficiency of our approach, emphasizing its practical applicability in real-world scenarios.
Security and Robustness: To ensure the reliability of our methodology, we conduct thorough security and robustness evaluations. Our approach demonstrates high resilience against adversarial attacks and perturbations, reaffirming its suitability for security-critical blockchain applications.
Conclusion: By integrating graph-based data representation, GCNs, probabilistic sampling, and distributed computing, we achieve network scalability without compromising analytical precision. This approach addresses the pressing challenges posed by the expanding Ethereum network, opening new avenues for research and enabling real-time insights into decentralized ecosystems. Our work contributes to the development of scalable blockchain analytics, laying the foundation for sustainable growth and advancement in the domain of blockchain research and application.
Keywords: Ethereum, scalable network, GCN, probabilistic sampling, distributed computing
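A minimal sketch of the first two steps, the transaction graph with temporal edges and probabilistic node sampling, might look as follows. The toy transactions and 60% sample rate are placeholders; a real pipeline would read from an Ethereum node or dataset and feed the sampled subgraph to the GCN.

```python
# Sketch: build a transaction graph with temporal edges, then sample nodes
# probabilistically. Data shapes and the sample rate are assumptions.
import random
import networkx as nx

transactions = [("b1", "0xa1"), ("b1", "0xa2"), ("b2", "0xb1"), ("b2", "0xb2"), ("b3", "0xc1")]

G = nx.DiGraph()
prev = None
for block, tx in transactions:                 # nodes = block transactions
    G.add_node(tx, block=block)
    if prev is not None:
        G.add_edge(prev, tx)                   # edges = temporal succession
    prev = tx

sample_rate = 0.6                              # probabilistic sampling of nodes
sampled = [n for n in G.nodes if random.random() < sample_rate]
subgraph = G.subgraph(sampled)                 # representative subset for downstream analysis
print(subgraph.number_of_nodes(), subgraph.number_of_edges())
```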
Procedia PDF Downloads 76
11211 The Usage of Nitrogen Gas and Alum for Sludge Dewatering
Authors: Mamdouh Yousef Saleh, Medhat Hosny El-Zahar, Shymaa El-Dosoky
Abstract:
In most cases, the processing cost associated with dewatering sludge increases with the solid particle concentration. All experiments in this study were conducted on biological sludge. The experiments help reduce greenhouse gases; in addition, the technology used was faster and cheaper than other methods. First, bubbling pressure was used to dissolve N₂ gas into the sludge; second, alum was added to accelerate the coagulation of the sludge particles and facilitate their flotation; and third, nitrogen gas was used to help float the sludge particles and reduce the processing time, since nitrogen is an inert gas. The conclusions of this experiment were as follows. First, the best conditions were obtained at a bubbling pressure of 0.6 bar. Second, the best alum dose for helping the sludge agglomerate and float was determined; during the experiment, the best alum dose was 80 mg/L, which increased the sludge concentration by 7-8 times. Third, the economic dose of nitrogen gas was 60 mg/L, with a separation efficiency of 85%; the sludge concentration increased by about 8-9 times. This happened because the gas released tiny bubbles that adhere to the suspended matter, causing it to float to the surface of the water, where it can then be removed.
Keywords: nitrogen gas, biological treatment, alum, dewatering sludge, greenhouse gases
Procedia PDF Downloads 217
11210 The Impact of Recurring Events in Fake News Detection
Authors: Ali Raza, Shafiq Ur Rehman Khan, Raja Sher Afgun Usmani, Asif Raza, Basit Umair
Abstract:
Detection of fake news and missing information is gaining popularity, especially after the advancement of social media and online news platforms. Social media platforms are the main and speediest source of fake news propagation, whereas online news websites contribute to fake news dissemination. In this study, we propose a framework to detect fake news using the temporal features of text, and we consider user feedback to identify whether the news is fake or not. Recent studies have given valuable consideration to temporal features in text documents within Natural Language Processing and to user feedback, but they only try to classify the textual data as fake or true. This article examines the impact of recurring and non-recurring events on fake and true news. We use two models, BERT and Bi-LSTM, for the investigation; BERT yields better results, and we find that 70% of true news items are recurring while the remaining 30% are non-recurring.
Keywords: natural language processing, fake news detection, machine learning, Bi-LSTM
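A hedged sketch of the BERT classification step, using the Hugging Face transformers API, is shown below. The checkpoint name and the 0/1 label convention are assumptions, the classification head here is untrained (the study fine-tunes it), and the recurrence/temporal features described above are not reproduced.

```python
# Sketch of BERT-based fake/true classification; checkpoint and label
# convention are assumptions, and the head is untrained in this snippet.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained("bert-base-uncased", num_labels=2)

inputs = tokenizer("Officials confirm the annual budget review began today.",
                   return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits
label = logits.argmax(dim=-1).item()           # assumed convention: 0 = true, 1 = fake
print("fake" if label == 1 else "true")
```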
Procedia PDF Downloads 22
11209 The Use of Computers in Improving the Academic Performance of Students in Mathematics
Authors: Uwaruile Austin Obuh
Abstract:
This research work focuses on the use of computers in improving the academic performance of students in mathematics in Benin City, Edo State. To guide this study, two research questions were raised, and two corresponding hypotheses were formulated. A total of one hundred and twenty (120) respondents were randomly selected from four schools in the city (60 boys and 60 girls). The instrument employed for the collection of data was a set of multiple-choice test items on geometry (MCTIOG), drawn from past Senior School Certificate Examination (SSCE) questions. The instrument was validated by an expert in mathematics and in measurement and evaluation. The data obtained from the pre- and post-tests were analysed using the mean, standard deviation, and t-test. The study revealed a non-significant difference between the experimental and control groups in the pre-test: the two groups were equivalent before treatment began. The study also revealed that the experimental group performed better than the control group after treatment. One can, therefore, conclude that the use of computers for mathematics instruction has improved the performance of students in geometry, so the null hypothesis was rejected. The study finally revealed that there was no significant difference between the boys and girls taught mathematics using a computer; therefore, the hypothesis stating that there would be no significant difference in the performance of boys and girls taught mathematics using the computer was not rejected. Based on the findings of this study, a number of recommendations were made to enhance teachers' use of computer-aided instruction.
Keywords: computer, teaching, learning, mathematics
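The significance test described above can be sketched with scipy's independent-samples t-test; the score lists below are made up for illustration, not the study's data.

```python
# Sketch of the mean/SD/t-test analysis on illustrative post-test scores.
from statistics import mean
from scipy.stats import ttest_ind

experimental = [68, 72, 75, 70, 80, 74, 77, 69, 73, 76]   # illustrative post-test scores
control      = [61, 64, 60, 66, 58, 63, 65, 59, 62, 60]

t_stat, p_value = ttest_ind(experimental, control)
print(f"means: {mean(experimental):.1f} vs {mean(control):.1f}")
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")   # p < 0.05 would reject the null hypothesis
```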
Procedia PDF Downloads 124
11208 Power Iteration Clustering Based on Deflation Technique on Large Scale Graphs
Authors: Taysir Soliman
Abstract:
One of the currently popular clustering techniques is Spectral Clustering (SC) because of its advantages over conventional approaches such as hierarchical clustering and k-means, as well as other techniques. However, one of the disadvantages of SC is that it is time-consuming, because it requires computing the eigenvectors. To overcome this disadvantage, a number of techniques have been proposed, such as Power Iteration Clustering (PIC), a variant of SC. Some of PIC's advantages are: 1) scalability and efficiency, 2) finding one pseudo-eigenvector instead of computing the eigenvectors, and 3) forming a linear combination of the eigenvectors in linear time. However, its worst disadvantage is an inter-class collision problem, because it uses only one pseudo-eigenvector, which is not enough. Previous researchers developed Deflation-based Power Iteration Clustering (DPIC) to overcome PIC's inter-class collision problem while retaining PIC's efficiency. In this paper, we develop Parallel DPIC (PDPIC) to improve the time and memory complexity; it runs on the Apache Spark framework using sparse matrices. To test the performance of PDPIC, we compared it to the SC, ESCG and ESCALG algorithms on four small and nine large graph benchmark datasets, where PDPIC achieved higher accuracy and shorter runtimes than the other algorithms.
Keywords: spectral clustering, power iteration clustering, deflation-based power iteration clustering, Apache Spark, large graph
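The core PIC step can be sketched in a few lines of NumPy: repeatedly multiplying a row-normalized affinity matrix by a vector, with early stopping, yields a pseudo-eigenvector whose entries group by cluster. The toy affinity matrix is illustrative; DPIC/PDPIC add deflation and Spark parallelism on top of this basic iteration.

```python
# Minimal PIC-style power iteration on a toy affinity matrix. PIC stops early,
# before global convergence, while entries are still clustered.
import numpy as np

np.random.seed(0)
A = np.array([[0, 1, 1, 0, 0],                 # two rough groups: {0,1,2} and {3,4}
              [1, 0, 1, 0, 0],
              [1, 1, 0, 0, 1],
              [0, 0, 0, 0, 1],
              [0, 0, 1, 1, 0]], dtype=float)
W = A / A.sum(axis=1, keepdims=True)           # row-normalized affinity matrix

v = np.random.rand(A.shape[0])
for _ in range(10):                            # few iterations: stop before full mixing
    v = W @ v
    v /= np.abs(v).sum()                       # normalize to avoid under/overflow
print(np.round(v, 4))                          # entries are near-constant within clusters
```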
Procedia PDF Downloads 189
11207 Virtual Dimension Analysis of Hyperspectral Imaging to Characterize a Mining Sample
Authors: L. Chevez, A. Apaza, J. Rodriguez, R. Puga, H. Loro, Juan Z. Davalos
Abstract:
A Virtual Dimension (VD) procedure is used to analyze hyperspectral image (HSI) data in order to estimate the abundance of mineral components in a mining sample. Hyperspectral images derived from reflectance spectra (NIR region) are pre-treated using Standard Normal Variate (SNV) and Minimum Noise Fraction (MNF) methodologies. The endmember components are identified by the Simplex Growing Algorithm (SGA) and then fitted to the reflectance spectra of reference databases using a Simulated Annealing (SA) methodology. The obtained mineral abundances of the studied sample are very close to those obtained using XRD, with a total relative error of 2%.
Keywords: hyperspectral imaging, minimum noise fraction, MNF, simplex growing algorithm, SGA, standard normal variate, SNV, virtual dimension, XRD
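The SNV pre-treatment is a simple per-spectrum standardization: each spectrum is centred on its own mean and scaled by its own standard deviation. A minimal sketch, with random arrays standing in for real NIR spectra:

```python
# Standard Normal Variate (SNV): per-spectrum mean-centring and scaling.
# The random matrix stands in for real reflectance spectra.
import numpy as np

spectra = np.random.rand(4, 200)               # 4 spectra x 200 wavelength channels

def snv(X):
    return (X - X.mean(axis=1, keepdims=True)) / X.std(axis=1, keepdims=True)

corrected = snv(spectra)
print(corrected.mean(axis=1).round(6), corrected.std(axis=1).round(6))  # ~0 and ~1 per spectrum
```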
Procedia PDF Downloads 158
11206 AI Applications in Accounting: Transforming Finance with Technology
Authors: Alireza Karimi
Abstract:
Artificial Intelligence (AI) is reshaping various industries, and accounting is no exception. With the ability to process vast amounts of data quickly and accurately, AI is revolutionizing how financial professionals manage, analyze, and report financial information. In this article, we will explore the diverse applications of AI in accounting and its profound impact on the field.
Automation of Repetitive Tasks: One of the most significant contributions of AI in accounting is automating repetitive tasks. AI-powered software can handle data entry, invoice processing, and reconciliation with minimal human intervention. This not only saves time but also reduces the risk of errors, leading to more accurate financial records.
Pattern Recognition and Anomaly Detection: AI algorithms excel at pattern recognition. In accounting, this capability is leveraged to identify unusual patterns in financial data that might indicate fraud or errors. AI can swiftly detect discrepancies, enabling auditors and accountants to focus on resolving issues rather than hunting for them.
Real-Time Financial Insights: AI-driven tools, using natural language processing and computer vision, can process documents faster than ever. This enables organizations to have real-time insights into their financial status, empowering decision-makers with up-to-date information for strategic planning.
Fraud Detection and Prevention: AI is a powerful tool in the fight against financial fraud. It can analyze vast transaction datasets, flagging suspicious activities and reducing the likelihood of financial misconduct going unnoticed. This proactive approach safeguards a company's financial integrity.
Enhanced Data Analysis and Forecasting: Machine learning, a subset of AI, is used for data analysis and forecasting. By examining historical financial data, AI models can provide forecasts and insights, aiding businesses in making informed financial decisions and optimizing their financial strategies.
Artificial Intelligence is fundamentally transforming the accounting profession. From automating mundane tasks to enhancing data analysis and fraud detection, AI is making financial processes more efficient, accurate, and insightful. As AI continues to evolve, its role in accounting will only become more significant, offering accountants and finance professionals powerful tools to navigate the complexities of modern finance. Embracing AI in accounting is not just a trend; it's a necessity for staying competitive in the evolving financial landscape.
Keywords: artificial intelligence, accounting automation, financial analysis, fraud detection, machine learning in finance
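One way the anomaly-detection idea above is commonly realized is with an isolation-forest model; the sketch below uses scikit-learn on fabricated transaction amounts. The model choice and data are illustrative assumptions, not a specific accounting product's implementation.

```python
# Illustrative anomaly detection over transaction amounts with IsolationForest;
# the amounts are fabricated and the contamination rate is an assumption.
import numpy as np
from sklearn.ensemble import IsolationForest

amounts = np.array([[120], [95], [130], [110], [105], [99], [125], [9800], [115], [102]])
model = IsolationForest(contamination=0.1, random_state=0).fit(amounts)
flags = model.predict(amounts)                  # -1 marks suspected anomalies
print(amounts[flags == -1].ravel())             # the 9800 outlier is flagged for review
```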
Procedia PDF Downloads 63
11205 IIROC's Enforcement Performance: Funnel in, Funnel out, and Funnel away
Authors: Mark Lokanan
Abstract:
The paper analyzes the processing of complaints against investment brokers and dealer members by the Investment Industry Regulatory Organization of Canada (IIROC) from 2008 to 2017. IIROC is the self-regulatory organization (SRO) responsible for policing investment dealers and brokerage firms that trade in Canada's securities market. Data for the study came from IIROC's annual enforcement reports for the years examined. The case processing is evaluated based on the misconduct funnel, which was originally designed for street crime and is applied here to the enforcement of investment fraud. The misconduct funnel is used as a framework to examine IIROC's claim that it brought in more complaints (funnel in) than government regulators, and it shows how these complaints are funneled out and funneled away as they are processed through IIROC's enforcement system. The results indicate that IIROC is ineffective in disciplining its members and is unable to handle the more serious quasi-criminal and improper sales practice offenses. It is hard not to see the results of the paper being used by legislators in Ottawa to show the importance of a federal securities regulatory agency such as the Securities and Exchange Commission (SEC) in the United States.
Keywords: investment fraud, securities regulation, compliance, enforcement
Procedia PDF Downloads 160
11204 Expert System: Debugging Using MD5 Process Firewall
Authors: C. U. Om Kumar, S. Kishore, A. Geetha
Abstract:
An operating system (OS) is software that manages computer hardware and software resources by providing services to computer programs. One important user expectation of the operating system is the practice of defending information from unauthorized access, disclosure, modification, inspection, recording or destruction. An operating system is always vulnerable to attacks by malware such as computer viruses, worms, Trojan horses, backdoors, ransomware, spyware, adware, scareware and more. Anti-virus software was therefore created to ensure security against prominent computer viruses by applying a dictionary-based approach, but anti-virus programs cannot guarantee security against the new viruses proliferating every day. To address this issue and secure the computer system, our proposed expert system lets the administrator authorize processes as wanted or unwanted for execution. The expert system maintains a database consisting of the hash codes of the processes that are to be allowed. These hash codes are generated using the MD5 message-digest algorithm, a widely used cryptographic hash function. The administrator approves the wanted processes that are to be executed on clients in a Local Area Network by implementing a client-server architecture, and only the processes that match those in the database table will be executed, restricting many malicious processes from infecting the operating system. An add-on advantage of this proposed expert system is that it limits CPU usage and minimizes resource utilization. Thus, data and information security is ensured by our system, along with increased performance of the operating system.
Keywords: virus, worm, Trojan horse, backdoors, ransomware, spyware, adware, scareware, sticky software, process table, MD5, CPU usage, resource utilization
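The allowlist check at the heart of this scheme is straightforward to sketch: hash the executable with MD5 and permit execution only if the digest appears in the administrator-approved table. Paths and the table contents below are examples, not the paper's implementation.

```python
# Sketch of an MD5 allowlist check: hash the binary, compare against the
# administrator-approved table. Example hashes and paths are placeholders.
import hashlib

approved_hashes = {"5d41402abc4b2a76b9719d911017c592"}   # populated by the administrator

def md5_of_file(path, chunk_size=8192):
    digest = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

def is_allowed(path):
    return md5_of_file(path) in approved_hashes

# e.g., is_allowed("C:/Program Files/app/app.exe") -> True only if its hash was approved
```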
Procedia PDF Downloads 427
11203 Analysis of Genomics Big Data in Cloud Computing Using Fuzzy Logic
Authors: Mohammad Vahed, Ana Sadeghitohidi, Majid Vahed, Hiroki Takahashi
Abstract:
In the genomics field, huge amounts of data have been produced by next-generation sequencers (NGS). Data volumes are growing very rapidly; it has been postulated that more than one billion bases will be produced per year by 2020. The growth rate of the data produced is much faster than Moore's law in computer technology. This makes it more difficult to deal with genomics data, for tasks such as storing data, searching for information, and uncovering hidden information. An analysis platform for genomics big data is therefore required. Newly developed cloud computing enables us to deal with big data more efficiently. Hadoop is a distributed computing framework that forms the core of Big Data as a Service (BDaaS). Although many services, e.g. Amazon, have adopted this technology, there are few applications in the biology field. Here, we propose a new algorithm to deal more efficiently with genomics big data, e.g. sequencing data. Our algorithm consists of two parts: first, BDaaS is applied to handle the data more efficiently; second, a hybrid method of MapReduce and fuzzy logic is applied for data processing. This step can be parallelized in implementation. Our algorithm has great potential in the computational analysis of genomics big data, e.g. de novo genome assembly and sequence similarity search. We discuss our algorithm and its feasibility.
Keywords: big data, fuzzy logic, MapReduce, Hadoop, cloud computing
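A toy MapReduce-style sketch of one genomics task this kind of platform targets, counting k-mers across reads, is shown below; the reads and k value are placeholders, and Hadoop/Spark would distribute the map and reduce phases across nodes.

```python
# MapReduce-style k-mer counting: map emits per-read counts, reduce merges them.
# Reads and k are toy values; a cluster framework would distribute these phases.
from collections import Counter
from functools import reduce

reads = ["ACGTAC", "CGTACG", "TACGTA"]
k = 3

def map_phase(read):                          # emit k-mer counts for one read
    return Counter(read[i:i + k] for i in range(len(read) - k + 1))

def reduce_phase(c1, c2):                     # merge partial counts
    c1.update(c2)
    return c1

counts = reduce(reduce_phase, map(map_phase, reads))
print(counts.most_common(3))
```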
Procedia PDF Downloads 299
11202 Data Mining Meets Educational Analysis: Opportunities and Challenges for Research
Authors: Carla Silva
Abstract:
Recent developments in information and communication technology enable us to acquire, collect, and analyse data in various fields of socioeconomic-technological systems. Along with the increase of economic globalization and the evolution of information technology, data mining has become an important approach for economic data analysis. As a result, there has been a critical need for automated approaches to effective and efficient usage of massive amounts of educational data, in order to support institutions in strategic planning and investment decision-making. In this article, we address data from several different perspectives and define its application to the sciences. Many believe that 'big data' will transform business, government, and other aspects of the economy. We discuss how new data may impact educational policy and educational research. Large-scale administrative data sets and proprietary private-sector data can greatly improve the way we measure, track, and describe educational activity and educational impact. We also consider whether the big data predictive modeling tools that have emerged in statistics and computer science may prove useful in educational research and, furthermore, in economics. Finally, we highlight a number of challenges and opportunities for future research.
Keywords: data mining, research analysis, investment decision-making, educational research
Procedia PDF Downloads 358
11201 Detecting Paraphrases in Arabic Text
Authors: Amal Alshahrani, Allan Ramsay
Abstract:
Paraphrasing, i.e. expressing the same concept in alternative ways using different words or phrases, is one of the important tasks in natural language processing. Paraphrases can be used in many natural language applications, such as Information Retrieval, Machine Translation, Question Answering, Text Summarization, and Information Extraction. To obtain pairs of sentences that are paraphrases, we create a system that automatically extracts paraphrases from a corpus built from different sources of news articles, since these are likely to contain paraphrases when they report the same event on the same day. There are existing simple standard approaches (e.g. TF-IDF vector space with cosine similarity) and alignment techniques (e.g. Dynamic Time Warping (DTW)) for extracting paraphrases which have been applied to English. However, the performance of these approaches can be affected when they are applied to another language, for instance Arabic, due to the presence of phenomena absent in English, such as free word order, the zero copula, and pro-dropping. These phenomena affect the performance of these algorithms. Thus, if we can analyse how the existing algorithms for English fail for Arabic, then we can find a solution for Arabic. The results are promising.
Keywords: natural language processing, TF-IDF, cosine similarity, dynamic time warping (DTW)
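The standard TF-IDF/cosine-similarity baseline mentioned above can be sketched with scikit-learn; the two English sentences below are stand-ins for same-day news sentences (the paper's corpus is Arabic).

```python
# TF-IDF vector space + cosine similarity as a paraphrase-candidate baseline.
# Sentences are illustrative English stand-ins for the Arabic corpus.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

sentences = ["The president signed the trade agreement on Monday.",
             "On Monday the trade agreement was signed by the president."]

tfidf = TfidfVectorizer().fit_transform(sentences)
score = cosine_similarity(tfidf[0], tfidf[1])[0, 0]
print(f"cosine similarity = {score:.2f}")     # a high score flags a paraphrase candidate
```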
Procedia PDF Downloads 386
11200 Graphics Processing Unit-Based Parallel Processing for Inverse Computation of Full-Field Material Properties Based on Quantitative Laser Ultrasound Visualization
Authors: Sheng-Po Tseng, Che-Hua Yang
Abstract:
Motivation and Objective: Ultrasonic guided waves have become an important tool for the nondestructive evaluation of structures and components. Guided waves are used to identify defects or to evaluate material properties in a nondestructive way. When guided waves are applied to evaluate material properties, the properties are not known directly; instead, preliminary signals such as time-domain signals or frequency-domain spectra are first obtained. With the measured ultrasound data, inversion calculations can then be employed to obtain the desired mechanical properties.
Methods: This research develops a high-speed inversion calculation technique for obtaining full-field mechanical properties from the quantitative laser ultrasound visualization system (QLUVS). The QLUVS employs a mirror-controlled scanning pulsed laser to generate guided acoustic waves traveling in a two-dimensional target. Guided waves are detected with a piezoelectric transducer at a fixed location. With gyro-scanning of the generation source, the QLUVS has the advantage of fast, full-field, and quantitative inspection.
Results and Discussions: This research introduces two important tools to improve computation efficiency. First, a graphics processing unit (GPU) with a large number of cores is introduced. Second, combining CPU and GPU cores, a parallel processing scheme is developed for the inversion of full-field mechanical properties based on the QLUVS data. The newly developed inversion scheme is applied to investigate the computation efficiency for single-layered and double-layered plate-like samples. The computation is shown to be 80 times faster than the unparallelized scheme.
Conclusions: This research demonstrates a high-speed inversion technique for the characterization of full-field material properties based on a quantitative laser ultrasound visualization system. Significant computation efficiency is shown; however, the limit has not yet been reached, and further improvement can be achieved by refining the parallel computation. Using this full-field mechanical property inspection technology, full-field mechanical properties can be obtained nondestructively through high-speed, high-precision measurements, with both qualitative and quantitative results. The developed high-speed computation scheme is ready for applications where full-field mechanical properties are needed in a nondestructive and nearly real-time way.
Keywords: guided waves, material characterization, nondestructive evaluation, parallel processing
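The parallelization idea rests on the per-point inversions being independent tasks. A CPU-side sketch with Python's multiprocessing pool is shown below; the GPU version applies the same decomposition over many more cores, and the objective function here is a placeholder for the real guided-wave inversion.

```python
# CPU sketch of the decomposition: map an independent per-point inversion
# across workers. The 'inversion' is a placeholder transform.
from multiprocessing import Pool

def invert_point(measurement):
    # placeholder: pretend the inverted property is a simple transform
    return measurement ** 0.5

if __name__ == "__main__":
    measurements = [float(i) for i in range(10000)]   # one entry per scan point
    with Pool() as pool:
        properties = pool.map(invert_point, measurements)
    print(len(properties), properties[:3])
```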
Procedia PDF Downloads 202
11199 Leveraging Sentiment Analysis for Quality Improvement in Digital Healthcare Services
Authors: Naman Jain, Shaun Fernandes
Abstract:
With the increasing prevalence of online healthcare services, selecting the most suitable doctor has become a complex task, requiring careful consideration of both public sentiment and personal preferences. This paper proposes a sentiment analysis-driven method that integrates public reviews with user-specific criteria and correlated attributes to recommend online doctors. By leveraging Natural Language Processing (NLP) techniques, public sentiment is extracted from online reviews, which is then combined with user-defined preferences such as specialty, years of experience, location, and consultation fees. Additionally, correlated attributes like education and certifications are incorporated to enhance the recommendation accuracy. Experimental results demonstrate that the proposed system significantly improves user satisfaction by providing personalized doctor recommendations that align with both public opinion and individual needs.
Keywords: sentiment analysis, online doctors, personal preferences, correlated attributes, recommendation system, healthcare, natural language processing
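A minimal sketch of combining a review-derived sentiment score with user-preference matches into a single ranking score follows. The weights, fields, and doctor records are assumptions for illustration, not the paper's trained model.

```python
# Toy ranking: weighted blend of sentiment score and preference matches.
# Weights and records are illustrative assumptions.
doctors = [
    {"name": "Dr. A", "specialty": "cardiology", "fee": 40, "sentiment": 0.92},
    {"name": "Dr. B", "specialty": "cardiology", "fee": 25, "sentiment": 0.78},
    {"name": "Dr. C", "specialty": "dermatology", "fee": 30, "sentiment": 0.95},
]
prefs = {"specialty": "cardiology", "max_fee": 35}

def score(doc, w_sentiment=0.6, w_prefs=0.4):
    match = (doc["specialty"] == prefs["specialty"]) + (doc["fee"] <= prefs["max_fee"])
    return w_sentiment * doc["sentiment"] + w_prefs * (match / 2)

ranked = sorted(doctors, key=score, reverse=True)
print([d["name"] for d in ranked])            # Dr. B outranks Dr. C for this user
```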
Procedia PDF Downloads 5
11198 A Comprehensive Review on Structural Properties and Erection Benefits of Large Span Stressed-Arch Steel Truss Industrial Buildings
Authors: Anoush Saadatmehr
Abstract:
The design and construction of large clear-span structures has always been in demand in the construction industry for industrial and commercial buildings around the world. These spectacular structures serve distinctive building types such as aircraft and airship hangars, warehouses, bulk storage buildings, and sports and recreation facilities. From an engineering point of view, various steel structural systems are often adopted in large-span buildings, such as conventional trusses, space frames and cable-supported roofs. This paper, however, investigates and reviews an innovative, light, economical and quickly erected large-span steel structure known as the 'Stressed-Arch', which has several advantages over the other common types of structures. This patented system integrates cold-formed hollow-section steel with high-strength pre-stressing strands and concrete grout to form an arch-shaped truss frame wherever a cost-effective column-free space is required for spans in the range of 60 m to 180 m. In this study, the main structural properties of the Stressed-Arch system and its components are first discussed technically. These properties include the nonlinear behavior of truss chords during stress-erection; the effect of the erection method on members' compressive strength; the rigidity of pre-stressed trusses, which allows them to meet strict deflection criteria in cases with roof-suspended cranes or specialized front doors; and, more importantly, the prominent lightness of the steel structure. Then, the effects of utilizing pre-stressing strands in safeguarding a smooth installation of the main steel members, roof components and cladding are investigated. In conclusion, it is shown that the Stressed-Arch system not only provides an optimized steel structure up to 30% lighter than its conventional competitors but also streamlines the process of building erection and minimizes construction time while reducing the risks of working at height.
Keywords: large span structure, pre-stressed steel truss, stressed-arch building, stress-erection, steel structure
Procedia PDF Downloads 164
11197 Concentrations of Some Metallic Trace Elements in Twelve Sludge Incineration Ashes
Authors: Lotfi Khiari, Antoine Karam, Claude-Alla Joseph, Marc Hébert
Abstract:
The main objective of incinerating sludge generated by municipal or agri-food waste treatment plants is to reduce the volume of sludge to be disposed of as solid or liquid waste, whilst concentrating or destroying potentially harmful volatile substances. In some cities in Canada and the United States of America (USA), a large amount of sludge is incinerated, which entails a loss of organic matter and water, leading to the accumulation of phosphorus, potassium and some metallic trace elements (MTE) in the ashes. The purpose of this study was to evaluate the concentrations of potentially hazardous MTE, such as cadmium (Cd), lead (Pb) and mercury (Hg), in twelve sludge incineration ash samples obtained from municipal wastewater and other food-processing waste treatments in Canada and the USA. The average, maximum, and minimum values of MTE in the ashes were calculated for each city individually and for all cities together, and the trace metal concentrations were compared to values reported in the literature. The concentrations of MTE in the ashes vary widely depending on the sludge origin and treatment options, and were found in the ranges of 0.1-6.4 mg/kg for Cd, 13-286 mg/kg for Pb and 0.1-0.5 mg/kg for Hg. On average, the following order of metal concentration in the ashes was observed: Pb > Cd > Hg. The results show that metal contents in most ashes were similar to MTE levels in synthetic inorganic fertilizers and many fertilizing residual materials. Consequently, the environmental effects of the MTE content of these ashes would be low.
Keywords: biosolids, heavy metals, recycling, sewage sludge
Procedia PDF Downloads 380
11196 Magnesium Alloys Containing Y, Gd and Ca with Enhanced Ignition Temperature and Mechanical Properties for Aviation Applications
Authors: Jiří Kubásek, Peter Minárik, Klára Hosová, Stanislav Šašek, Jozef Veselý, Jitka Stráská, Drahomír Dvorský, Dalibor Vojtěch, Miloš Janeček
Abstract:
Mg-2Y-2Gd-1Ca and Mg-4Y-4Gd-2Ca alloys were processed by extrusion or equal channel angular pressing (ECAP) to analyse the effect of the microstructure on ignition temperature, mechanical properties and corrosion resistance. The alloys are characterized by good mechanical properties and an exceptionally high ignition temperature, which is a critical safety measure. The effect of extrusion and ECAP on the microstructure, mechanical properties and ignition temperature was studied. The obtained results indicated a substantial effect of the processing conditions on the average grain size, the recrystallized fraction and texture formation. Both alloys featured high strength, depending on the composition and processing conditions, and a high ignition temperature of ≈1100 °C (Mg-4Y-4Gd-2Ca) and ≈950 °C (Mg-2Y-2Gd-1Ca), which was attributed to the synergic effect of Y, Gd and Ca oxides, with the dominant effect of Y₂O₃. The achieved combination of enhanced mechanical properties and ignition temperature makes these alloys prominent candidates for aircraft applications.
Keywords: magnesium alloys, enhanced ignition temperature, mechanical properties, ECAP
Procedia PDF Downloads 109
11195 Study of the Influence of the Type of Cast Iron Chips on the Quality of Briquettes Obtained with Controlled Impact
Authors: Dimitar N. Karastoianov, Stanislav D. Gyoshev, Todor N. Penchev
Abstract:
Preparing briquettes of metal chips with good density and quality is of great importance for the efficiency of this process. This paper presents the results of impact briquetting of grey cast iron chips of rectangular shape and dimensions 15x25x1 mm. The density and quality of briquettes made from these chips are compared with those obtained in another work by the authors using cast iron chips of smaller sizes. It was found that using rectangular chips of large size produces briquettes with very low density and poor quality. From photographs taken by X-ray tomography, it is clear that the reason for this is the orientation of the chips in the peripheral wall of the briquettes, which does not allow the air to escape. It was concluded that in order to obtain briquettes from large cast iron chips, the chips must first be ground, for example in a small ball mill.
Keywords: briquetting, chips, impact, rocket engine
Procedia PDF Downloads 523
11194 Design of Speed Bump Recognition System Integrated with Adjustable Shock Absorber Control
Authors: Ming-Yen Chang, Sheng-Hung Ke
Abstract:
This research focuses on the development of a speed bump identification system for real-time control of adjustable shock absorbers in vehicular suspension systems. The study initially involved the collection of images of various speed bumps and rubber speed bump profiles found on roadways. These images were utilized for training and recognition through the deep learning object detection algorithm YOLOv5. Subsequently, the trained speed bump identification program was integrated with an in-vehicle camera system for live image capture during driving, with the images instantly transmitted to a computer for processing. Using the principles of monocular vision ranging, the distance between the vehicle and an approaching speed bump is determined. The appropriate control distance was established through both practical vehicle measurements and theoretical calculations. In combination with the vehicle's electronically adjustable shock absorbers, a shock absorber control system was devised to dynamically adapt the damping force just prior to encountering a speed bump. This system effectively mitigates passenger discomfort and enhances ride quality.
Keywords: adjustable shock absorbers, image recognition, monocular vision ranging, ride
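The monocular ranging principle reduces to similar triangles: with a known real-world speed-bump height and the camera's focal length in pixels, distance follows from the detected bounding-box height. A sketch with assumed numbers:

```python
# Monocular ranging by similar triangles; all numeric values are assumptions,
# not the paper's calibration data.
def monocular_distance(focal_px, real_height_m, bbox_height_px):
    return focal_px * real_height_m / bbox_height_px

focal_px = 1000.0          # assumed calibrated focal length, pixels
bump_height_m = 0.08       # assumed rubber speed-bump height, metres
bbox_px = 16.0             # detected bounding-box height from YOLOv5, pixels

d = monocular_distance(focal_px, bump_height_m, bbox_px)
print(f"distance to bump: {d:.1f} m")   # trigger the damping change inside the control distance
```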
Procedia PDF Downloads 66
11193 Overview and Pathophysiology of Radiation-Induced Breast Changes as a Consequence of Radiotherapy Toxicity
Authors: Monika Rezacova
Abstract:
Radiation-induced breast changes are a consequence of radiotherapy toxicity in the breast tissues, related either to targeted breast cancer treatment or to treatment of other thoracic malignancies (e.g. lung cancer). This study creates an overview of the different changes and their pathophysiology. The main conditions included were skin thickening, interstitial oedema, fat necrosis, dystrophic calcifications, skin retraction, glandular atrophy, breast fibrosis and radiation-induced breast cancer. A focused literature search was performed across multiple databases, including PubMed, MEDLINE and Embase, covering English as well as non-English publications. From this literature, the study provides a comprehensive overview of radiation-induced breast changes and their pathophysiology, with a small focus on new developments and prevention.
Keywords: radiotherapy toxicity, breast tissue changes, breast cancer treatment, radiation-induced breast changes
Procedia PDF Downloads 159
11192 [Keynote Talk]: Computer-Assisted Language Learning (CALL) for Teaching English to Speakers of Other Languages (TESOL/ESOL) as a Foreign Language (TEFL/EFL), Second Language (TESL/ESL), or Additional Language (TEAL/EAL)
Authors: Andrew Laghos
Abstract:
Computer-assisted language learning (CALL) is defined as the use of computers to help learn languages. In this study we look at several different types of CALL tools and applications and how they can assist adults and young learners in learning the English language as a foreign, second or additional language. It is important to identify the roles of the teacher and the learners, and what the learners' motivations are for learning the language. Audio, video, interactive multimedia games, online translation services, conferencing, chat rooms, discussion forums, social networks, social media, email communication, songs and music video clips are just some of the many ways computers are currently being used to enhance language learning. CALL may be used for classroom teaching as well as for online and mobile learning. Advantages and disadvantages of CALL are discussed, and the study ends with future predictions for CALL.
Keywords: computer-assisted language learning (CALL), teaching English as a foreign language (TEFL/EFL), adult learners, young learners
Procedia PDF Downloads 434
11191 Topology Optimisation for Reduction in Material Use for Precast Concrete Elements: A Case Study of a 3D-Printed Staircase
Authors: Dengyu You, Alireza Kashani
Abstract:
This study explores the potential of 3D concrete printing for manufacturing prefabricated staircases. Applying 3D concrete printing to large-scale construction could advance the industry's implementation of the Industry 4.0 concept. In addition, the current global challenge is to achieve net zero emissions by 2050, and innovation in the construction industry could potentially speed up achieving this target. 3D printing technology offers a possible solution that reduces cement usage, minimises formwork waste, and is capable of manufacturing complex structures. The performance of the 3D-concrete-printed lightweight staircase needs to be evaluated. In this study, the staircase is designed using computer-aided technologies, fabricated by 3D concrete printing, and tested in a laboratory environment in accordance with Australian Standard AS 1657-2018 (Fixed platforms, walkways, stairways, and ladders – design, construction, and installation). The experimental results will be further compared with FEM analysis. The results indicate that 3D concrete printing is capable of fast production, reduces material usage, and is highly automated, which meets the industry's future development goals.
Keywords: concrete 3D printing, staircase, sustainability, automation
Procedia PDF Downloads 105
11190 Ontology Expansion via Synthetic Dataset Generation and Transformer-Based Concept Extraction
Authors: Andrey Khalov
Abstract:
The rapid proliferation of unstructured data in IT infrastructure management demands innovative approaches for extracting actionable knowledge. This paper presents a framework for ontology-based knowledge extraction that combines relational graph neural networks (R-GNN) with large language models (LLMs). The proposed method leverages the DOLCE framework as the foundational ontology, extending it with concepts from ITSMO for domain-specific applications in IT service management and outsourcing. A key component of this research is the use of transformer-based models, such as DeBERTa-v3-large, for automatic entity and relationship extraction from unstructured texts. Furthermore, the paper explores how transfer learning techniques can be applied to fine-tune large language models (LLaMA) to generate synthetic datasets that improve precision in BERT-based entity recognition and ontology alignment. The resulting IT Ontology (ITO) serves as a comprehensive knowledge base that integrates domain-specific insights from ITIL processes, enabling more efficient decision-making. Experimental results demonstrate significant improvements in knowledge extraction and relationship mapping, offering a cutting-edge solution for enhancing cognitive computing in IT service environments.
Keywords: ontology expansion, synthetic dataset, transformer fine-tuning, concept extraction, DOLCE, BERT, taxonomy, LLM, NER
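The entity-extraction step can be sketched with the Hugging Face pipeline API. The checkpoint below is a generic public NER model, not the paper's fine-tuned DeBERTa-v3-large, and the sentence is illustrative.

```python
# Hedged sketch of transformer-based entity extraction; the checkpoint is a
# generic NER model standing in for the paper's fine-tuned one.
from transformers import pipeline

ner = pipeline("token-classification", model="dslim/bert-base-NER",
               aggregation_strategy="simple")

text = "Acme Corp outsources incident management to Globex under an ITIL-aligned contract."
for entity in ner(text):
    print(entity["entity_group"], entity["word"], round(entity["score"], 2))
```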
Procedia PDF Downloads 14
11189 Offline Signature Verification in Punjabi Based On SURF Features and Critical Point Matching Using HMM
Authors: Rajpal Kaur, Pooja Choudhary
Abstract:
Biometrics, which refers to identifying an individual based on his or her physiological or behavioral characteristics, has the capability to reliably distinguish between an authorized person and an imposter. Signature recognition systems can be categorized as offline (static) and online (dynamic). This paper presents a SURF-feature-based recognition system for offline signatures that is trained with low-resolution scanned signature images. The signature of a person is an important biometric attribute of a human being which can be used to authenticate human identity; a signature can be handled as an image and recognized using computer vision and HMM techniques. With modern computers, there is a need to develop fast algorithms for signature recognition, and multiple techniques have been defined for signature recognition, with much scope for research. In this paper, off-line (static) signature recognition and verification using SURF features with an HMM is proposed, where the signature is captured and presented to the user in an image format. Signatures are verified based on parameters extracted from the signature using various image processing techniques. The off-line signature verification and recognition system is implemented on the MATLAB platform. This work has been analyzed and tested and found suitable for its purpose. The proposed method performs better than other recently proposed methods.
Keywords: offline signature verification, offline signature recognition, signatures, SURF features, HMM
Procedia PDF Downloads 384
11188 When Messages Cause Distraction from Advertising: An Eye-Tracking Study
Authors: Nilamadhab Mohanty
Abstract:
It is essential to use message formats that make communication understandable and correct, because the information format can influence consumer decisions on the purchase of a product. This study combines information from qualitative inquiry, media trend analysis, an eye-tracking experiment, and questionnaire data to examine the impact of specific message formats and consumer perceived risk on attention to the information and risk retention. We investigated the influence of message framing (goal framing, attribute framing, and mixed framing) on consumer memory, study time, and decisional uncertainty while deciding on the purchase of drugs. Furthermore, we explored the impact of consumer perceived risk (the risk associated with use of the drug, RISK-AB, and the perceived risk associated with non-use of the drug, RISK-EB) on message format preference. The study used eye-tracking methods to understand the differences in message processing. The findings suggest that the message format influences information processing and that participants' risk perception impacts message format preference. Eye tracking can be used to understand these format differences and to design effective advertisements.
Keywords: message framing, consumer perceived risk, advertising, eye tracking
Procedia PDF Downloads 122
11187 A Strategy for Reducing Dynamic Disorder in Small Molecule Organic Semiconductors by Suppressing Large Amplitude Thermal Motions
Authors: Steffen Illig, Alexander S. Eggeman, Alessandro Troisi, Stephen G. Yeates, John E. Anthony, Henning Sirringhaus
Abstract:
Large-amplitude intermolecular vibrations in combination with complex-shaped transfer integrals generate a thermally fluctuating energetic landscape. The resulting dynamic disorder and its intrinsic presence in organic semiconductors is one of the most fundamental differences from their inorganic counterparts. Dynamic disorder is believed to govern many of the unique electrical and optical properties of organic systems. However, the low-energy nature of these vibrations makes it difficult to access them experimentally, and because of this we still lack clear molecular design rules to control and reduce dynamic disorder. Applying a novel technique based on electron diffraction, we encountered strong intermolecular thermal vibrations in every single organic material we studied (14 to date), indicating that a large degree of dynamic disorder is a universal phenomenon in organic crystals. In this paper a new molecular design strategy to avoid dynamic disorder is presented. We found that small molecules that have their side chains attached to the long axis of their conjugated core are less likely to suffer from dynamic disorder effects. In particular, we demonstrate that 2,7-dioctyl[1]benzothieno[3,2-b][1]benzothiophene (C8-BTBT) and 2,9-didecyl-dinaphtho[2,3-b:2',3'-f]thieno[3,2-b]thiophene (C10-DNTT) exhibit strongly reduced thermal vibrations in comparison to other molecules, and we relate their outstanding performance to their lower dynamic disorder. We rationalize the low degree of dynamic disorder in C8-BTBT and C10-DNTT by the better encapsulation of the conjugated cores in the crystal structure, which helps reduce large-amplitude thermal motions. The work presented in this paper provides a general strategy for the design of new classes of very high mobility organic semiconductors with low dynamic disorder.
Keywords: charge transport, C8-BTBT, C10-DNTT, dynamic disorder, organic semiconductors, thermal vibrations
Procedia PDF Downloads 399
11186 Calibration of the Discrete Element Method Using a Large Shear Box
Authors: C. J. Coetzee, E. Horn
Abstract:
One of the main challenges in using the Discrete Element Method (DEM) is to specify the correct input parameter values. In general, the models are sensitive to the input parameter values, and accurate results can only be achieved if the correct values are specified. For the linear contact model, micro-parameters such as the particle density, stiffness and coefficient of friction, as well as the particle size and shape distributions, are required. There is a need for a procedure to accurately calibrate these parameters before any attempt can be made to accurately model a complete bulk materials handling system. Since DEM is often used to model applications in the mining and quarrying industries, a calibration procedure was developed for materials that consist of relatively large particles (up to 40 mm in size). A coarse crushed aggregate was used as the test material. Using a specially designed large shear box with a diameter of 590 mm, the confined Young's modulus (bulk stiffness) and internal friction angle of the material were measured by means of the confined compression test and the direct shear test, respectively. DEM models of the experimental setup were developed, and the input parameter values were varied iteratively until a close correlation between the experimental and numerical results was achieved. The calibration process was validated by modelling the pull-out of an anchor from a bed of material; the model results compared well with experimental measurements.
Keywords: Discrete Element Method (DEM), calibration, shear box, anchor pull-out
Procedia PDF Downloads 291