Search results for: Analytic Hierarchy Process (AHP)
2041 Residual Modulus of Elasticity of Self-Compacting Concrete Incorporating Unprocessed Waste Fly Ash after Exposure to Elevated Temperature
Authors: Mohammed Abed, Rita Nemes, Salem Nehme
Abstract:
The present study experimentally investigated the impact of incorporating unprocessed waste fly ash (UWFA) on the residual mechanical properties of self-compacting concrete (SCC) after exposure to elevated temperature. Three SCC mixtures were produced by replacing 0%, 15% and 30% of the cement mass with UWFA. In general, the fire resistance of SCC was enhanced by replacing up to 15% of the cement with UWFA, especially in the case of the residual modulus of elasticity, which is considered more sensitive than other mechanical properties at elevated temperature. Moreover, a strong linear relationship was observed between the residual flexural strength and the modulus of elasticity, both of which are significantly affected by crack appearance and propagation as a result of elevated temperature. Sustainable products could be produced by incorporating unprocessed waste powder materials in concrete production, reducing waste materials, CO2 emissions, and the energy needed for processing.
Keywords: self-compacting high-performance concrete, unprocessed waste fly ash, fire resistance, residual modulus of elasticity
Procedia PDF Downloads 134
2040 Characterization and Degradation Analysis of Tapioca Starch Based Biofilms
Authors: R. R. Ali, W. A. W. A. Rahman, R. M. Kasmani, H. Hasbullah, N. Ibrahim, A. N. Sadikin, U. A. Asli
Abstract:
In this study, tapioca starch, which acts as a natural polymer, was added to the blend in order to produce a biodegradable product. Low-density polyethylene (LDPE) and tapioca starch blends were prepared by extrusion, and the test samples by an injection moulding process. Ethylene vinyl acetate (EVA) acts as a compatibilizer, while glycerol was added to the blend as a processing aid. The blends were characterized using the melt flow index (MFI), Fourier transform infrared spectroscopy (FTIR), and the effects of water absorption on the samples. As the starch content increased, the MFI of the blend decreased. Tensile testing showed that the tensile strength and elongation at break decreased while the modulus increased as the starch content increased. For biodegradation, a soil burial test was conducted, and the loss in weight was studied as the starch content increased. Morphology studies were conducted in order to show the distribution between LDPE and starch.
Keywords: biopolymers, degradable polymers, starch based polyethylene, injection moulding
Procedia PDF Downloads 285
2039 Efficient Layout-Aware Pretraining for Multimodal Form Understanding
Authors: Armineh Nourbakhsh, Sameena Shah, Carolyn Rose
Abstract:
Layout-aware language models have been used to create multimodal representations for documents in image form, achieving relatively high accuracy in document understanding tasks. However, the large number of parameters in the resulting models makes building and using them prohibitive without access to high-performing processing units with large memory capacity. We propose an alternative approach that can create efficient representations without the need for a neural visual backbone. This leads to an 80% reduction in the number of parameters compared to the smallest SOTA model, widely expanding applicability. In addition, our layout embeddings are pre-trained on spatial and visual cues alone and only fused with text embeddings in downstream tasks, which can facilitate applicability to low-resource or multi-lingual domains. Despite using 2.5% of the training data, we show competitive performance on two form understanding tasks: semantic labeling and link prediction.
Keywords: layout understanding, form understanding, multimodal document understanding, bias-augmented attention
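As a rough illustration of the fusion idea in the abstract above (not the authors' code: the embedding dimension, the untrained linear projection, and all values are assumptions), layout coordinates can be projected and added to pre-computed text embeddings without any visual backbone:

```python
import numpy as np

def fuse_layout(text_emb, boxes, seed=0):
    """Late-fusion sketch: project normalized token bounding boxes
    (x0, y0, x1, y1) through a small linear layer and add the result
    to pre-computed text embeddings, so no visual backbone is needed."""
    rng = np.random.default_rng(seed)
    # Untrained illustrative projection; in practice this would be learned
    W = rng.normal(0.0, 0.02, size=(4, text_emb.shape[1]))
    return text_emb + boxes @ W

rng = np.random.default_rng(1)
text_emb = rng.normal(size=(5, 8))   # 5 tokens, 8-dim text embeddings
boxes = rng.uniform(size=(5, 4))     # normalized layout coordinates
fused = fuse_layout(text_emb, boxes)
```

The fused representation keeps the text embedding shape, so downstream task heads need no changes.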
Procedia PDF Downloads 147
2038 Understanding Beginning Writers' Narrative Writing with a Multidimensional Assessment Approach
Authors: Huijing Wen, Daibao Guo
Abstract:
Writing is thought to be the most complex facet of language arts. Assessing writing is difficult and subjective, and few scientifically validated assessments exist. Research has proposed evaluating writing using a multidimensional approach, including both qualitative and quantitative measures of handwriting, spelling, and prose. Given that narrative writing has historically been a staple of literacy instruction in primary grades and is one of the three major genres the Common Core State Standards require students to acquire starting in kindergarten, it is essential for teachers to understand how to measure beginning writers' writing development and sources of writing difficulties through narrative writing. Guided by theoretical models of early written expression and using empirical data, this study examines ways teachers can enact a comprehensive approach to understanding beginning writers' narrative writing through three writing rubrics developed for a Curriculum-Based Measurement (CBM). The goal is to help classroom teachers structure a framework for assessing early writing in primary classrooms. Participants in this study included 380 first-grade students from 50 classrooms in 13 schools in three school districts in a Mid-Atlantic state. Three writing tests were used to assess first graders' writing skills in relation to both transcription (i.e., handwriting fluency and spelling tests) and translational skills (i.e., a narrative prompt). First graders were asked to respond to a narrative prompt in 20 minutes. Grounded in theoretical models of early written expression and empirical evidence of key contributors to early writing, all written samples in response to the narrative prompt were coded three ways for different dimensions of writing: length, quality, and genre elements. To measure the quality of the narrative writing, a traditional holistic rating rubric was developed by the researchers based on the CCSS and the general traits of good writing.
Students' genre knowledge was measured using a separate analytic rubric for narrative writing. Findings showed that first graders had emerging and limited transcriptional and translational skills, with a nascent knowledge of genre conventions. The findings of the study provided support for the Not-So-Simple View of Writing in that fluent written expression, measured by length, and other important linguistic resources, measured by the overall quality and genre knowledge rubrics, are fundamental in early writing development. Our study echoed previous research findings on children's narrative development. The study has practical classroom application as it informs writing instruction and assessment. It offered practical guidelines for classroom instruction by providing teachers with a better understanding of first graders' narrative writing skills and knowledge of genre conventions. Understanding students' narrative writing provides teachers with more insight into the specific strategies students might use during writing and their understanding of good narrative writing. Additionally, it is important for teachers to differentiate writing instruction given the individual differences shown by our multiple writing measures. Overall, the study sheds light on beginning writers' narrative writing, indicating the complexity of early writing development.
Keywords: writing assessment, early writing, beginning writers, transcriptional skills, translational skills, primary grades, simple view of writing, writing rubrics, curriculum-based measurement
Procedia PDF Downloads 73
2037 Structural Characterization and Hot Deformation Behaviour of Al3Ni2/Al3Ni In-Situ Core-Shell Intermetallic in Al-4Cu-Ni Composite
Authors: Ganesh V., Asit Kumar Khanra
Abstract:
An in-situ powder metallurgy technique was employed to create Ni-Al3Ni/Al3Ni2 core-shell-shaped aluminum-based intermetallic reinforced composites. The impact of Ni addition on the phase composition, microstructure, and mechanical characteristics of Al-4Cu-xNi (x = 0, 2, 4, 6, 8, 10 wt.%) in relation to various sintering temperatures was investigated. Microstructure evolution was extensively examined using X-ray diffraction (XRD), scanning electron microscopy with energy-dispersive X-ray spectroscopy (SEM-EDX), and transmission electron microscopy (TEM) techniques. Initially, under sintering conditions, the formation of "Single Core-Shell" structures was observed, consisting of Ni as the core with Al3Ni2 intermetallic, whereas samples sintered at 620°C exhibited both "Single Core-Shell" and "Double Core-Shell" structures containing Al3Ni2 and Al3Ni intermetallics formed between the Al matrix and Ni reinforcements. The composite achieved a high compressive yield strength of 198.13 MPa and an ultimate strength of 410.68 MPa, with 24% total elongation for the sample containing 10 wt.% Ni. Additionally, there was a substantial increase in hardness, reaching 124.21 HV, which is 2.4 times higher than that of the base aluminum. Nanoindentation studies showed hardness values of 1.54, 4.65, 21.01, 13.16, 5.52, 6.27, and 8.39 GPa, corresponding to the α-Al matrix, Ni, Al3Ni2, the Ni/Al3Ni2 interface, Al3Ni, and their respective interfaces. Even at 200°C, the composite retained 54% of its room-temperature strength (90.51 MPa). To investigate the deformation behavior of the composite material, experiments were conducted at deformation temperatures ranging from 300°C to 500°C, with strain rates varying from 0.0001 s-1 to 0.1 s-1. A sine-hyperbolic constitutive equation was developed to characterize the flow stress of the composite, which exhibited a significantly higher hot deformation activation energy of 231.44 kJ/mol compared to that for self-diffusion in pure aluminum.
The formation of Al2Cu intermetallics at grain boundaries and Al3Ni2/Al3Ni within the matrix hindered dislocation movement, leading to an increase in activation energy, which might have an adverse effect on high-temperature applications. Two models, a strain-compensated Arrhenius model and an Artificial Neural Network (ANN) model, were developed to predict the composite's flow behavior. The ANN model outperformed the strain-compensated Arrhenius model, with a lower average absolute relative error of 2.266%, a smaller root mean square error of 1.2488 MPa, and a higher correlation coefficient of 0.9997. Processing maps revealed that the optimal hot-working conditions for the composite were in the temperature range of 420-500°C and strain rates between 0.0001 s-1 and 0.001 s-1. The changes in the composite microstructure were successfully correlated with the theory of processing maps, considering temperature and strain rate conditions. The uneven distribution in the shape and size of the core-shell/Al3Ni intermetallic compounds influenced the flow stress curves, leading to Dynamic Recrystallization (DRX), followed by partial Dynamic Recovery (DRV), and ultimately strain hardening. This composite material shows promise for applications in the automobile and aerospace industries.
Keywords: core-shell structure, hot deformation, intermetallic compounds, powder metallurgy
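The sine-hyperbolic constitutive law mentioned in the abstract above can be sketched as follows. Only the activation energy (231.44 kJ/mol) comes from the abstract; the material constants alpha, A, and n are illustrative placeholders, not the fitted values:

```python
import numpy as np

R = 8.314        # J/(mol*K), universal gas constant
Q = 231.44e3     # J/mol, hot deformation activation energy from the abstract

# Illustrative placeholder constants (assumed, not the paper's fitted values)
ALPHA = 0.012    # 1/MPa, stress multiplier
A = 1.0e16       # 1/s, structure factor
N = 5.0          # stress exponent

def flow_stress(strain_rate, temp_k):
    """Sine-hyperbolic Arrhenius flow stress:
    Z = strain_rate * exp(Q/RT) = A * sinh(ALPHA * sigma)**N,
    inverted for sigma via arcsinh."""
    z = strain_rate * np.exp(Q / (R * temp_k))      # Zener-Hollomon parameter
    return np.arcsinh((z / A) ** (1.0 / N)) / ALPHA  # flow stress in MPa

s_low = flow_stress(0.001, 773.15)   # hotter (500 C) -> lower flow stress
s_high = flow_stress(0.001, 573.15)  # cooler (300 C) -> higher flow stress
```

The sketch reproduces the qualitative behavior used in processing maps: flow stress falls with temperature and rises with strain rate.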
Procedia PDF Downloads 17
2036 A Constructionist View of Projects, Social Media and Tacit Knowledge in a College Classroom: An Exploratory Study
Authors: John Zanetich
Abstract:
Designing an educational activity that encourages inquiry and collaboration is key to engaging students in meaningful learning. Educational Information and Communications Technology (EICT) plays an important role in facilitating cooperative and collaborative learning in the classroom. EICT also facilitates students' learning and the development of the critical thinking skills needed to solve real-world problems. Projects and activities based on constructivism encourage students to embrace complexity as well as find relevance and joy in their learning. They also enhance students' capacity for creative and responsible real-world problem solving. Classroom activities based on constructivism offer students an opportunity to develop the higher-order thinking skills of defining problems and identifying solutions. Participating in a classroom project is an activity for both acquiring experiential knowledge and applying new knowledge to practical situations. It also provides an opportunity for students to integrate new knowledge into a skill set using reflection. Classroom projects can be developed around a variety of learning objects including social media, knowledge management and learning communities. The construction of meaning through project-based learning is an approach that encourages interaction and problem-solving activities. Projects require active participation, collaboration and interaction to reach the agreed-upon outcomes. Projects also serve to externalize the invisible cognitive and social processes taking place in the activity itself and in the student experience. This paper describes a classroom project designed to elicit interactions by helping students to unfreeze existing knowledge, to create new learning experiences, and then to refreeze the new knowledge. Since constructivists believe that students construct their own meaning through active engagement and participation as well as interactions with others,
knowledge management can be used to guide the exchange of both tacit and explicit knowledge in interpersonal interactions between students and to guide the construction of meaning. This paper uses an action research approach to the development of a classroom project and describes the use of technology, social media and the active use of tacit knowledge in the college classroom. In this project, a closed-group Facebook page becomes the virtual classroom where interaction is captured and measured using engagement analytics. In the virtual learning community, the principles of knowledge management are used to identify the process and components of the infrastructure of the learning process. The project identifies class member interests and measures student engagement in a learning community by analyzing regular posting on the Facebook page. These posts are used to foster and encourage interactions, reflect a student's interest and serve as reaction points from which viewers of the post convert the explicit information in the post to implicit knowledge. The data was collected over an academic year and was provided, in part, by the Google analytics reports on Facebook and self-reports of posts by members. The results support the use of active tacit knowledge activities, knowledge management and social media to enhance the student learning experience and help create the knowledge that will be used by students to construct meaning.
Keywords: constructivism, knowledge management, tacit knowledge, social media
Procedia PDF Downloads 214
2035 Automatic Method for Exudates and Hemorrhages Detection from Fundus Retinal Images
Authors: A. Biran, P. Sobhe Bidari, K. Raahemifar
Abstract:
Diabetic Retinopathy (DR) is an eye disease that leads to blindness. The earliest signs of DR are the appearance of red and yellow lesions on the retina called hemorrhages and exudates. Early diagnosis of DR prevents blindness; hence, many automated algorithms have been proposed to extract hemorrhages and exudates. In this paper, an automated algorithm is presented to extract hemorrhages and exudates separately from retinal fundus images using different image processing techniques, including the Circular Hough Transform (CHT), Contrast Limited Adaptive Histogram Equalization (CLAHE), Gabor filtering, and thresholding. Since the optic disc is the same color as the exudates, it is first localized and detected. The presented method has been tested on fundus images from the Structured Analysis of the Retina (STARE) and Digital Retinal Images for Vessel Extraction (DRIVE) databases using MATLAB code. The results show that this method is capable of detecting hard exudates and highly probable soft exudates. It is also capable of detecting hemorrhages and distinguishing them from blood vessels.
Keywords: diabetic retinopathy, fundus, CHT, exudates, hemorrhages
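A highly simplified sketch of the thresholding stage described in the abstract above (one of several techniques the paper combines; the threshold values and the synthetic patch are assumptions, and the real pipeline also uses CHT, CLAHE, and Gabor filtering before this step):

```python
import numpy as np

def segment_lesions(green_channel, bright_t=200, dark_t=60):
    """Crude intensity split on the green channel, where lesion contrast
    is highest: exudates appear bright, hemorrhages appear dark."""
    exudate_mask = green_channel >= bright_t
    hemorrhage_mask = green_channel <= dark_t
    return exudate_mask, hemorrhage_mask

# Synthetic 8-bit fundus patch: background 120, one bright and one dark blob
patch = np.full((64, 64), 120, dtype=np.uint8)
patch[10:14, 10:14] = 230   # exudate-like bright spot
patch[40:44, 40:44] = 30    # hemorrhage-like dark spot
ex, hem = segment_lesions(patch)
```

In a real pipeline the optic disc would be masked out first, since it is as bright as the exudates.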
Procedia PDF Downloads 271
2034 Monocular 3D Person Tracking via Demographic Classification and Projective Image Processing
Authors: McClain Thiel
Abstract:
Object detection and localization has historically required two or more sensors due to the loss of information in going from 3D to 2D space; however, most surveillance systems currently in use in the real world have only one sensor per location. Generally, this consists of a single low-resolution camera positioned above the area under observation (mall, jewelry store, traffic camera). This is not sufficient for robust 3D tracking for applications such as security or, of more recent relevance, contact tracing. This paper proposes a lightweight system for 3D person tracking that requires no additional hardware, based on compressed object-detection convolutional nets, facial landmark detection, and projective geometry. This approach involves classifying the target into a demographic category, making assumptions about the relative locations of facial landmarks from the demographic information, and from there using simple projective geometry and known constants to find the target's location in 3D space. Preliminary testing, although severely limited, suggests reasonable success in 3D tracking under ideal conditions.
Keywords: monocular distancing, computer vision, facial analysis, 3D localization
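The core projective-geometry idea in the abstract above can be sketched with the pinhole camera model; the focal length, the measured pixel separation, and the 63 mm interpupillary-distance prior are illustrative assumptions standing in for the demographic constants the paper derives:

```python
def estimate_depth(focal_px, real_ipd_m, pixel_ipd):
    """Pinhole-camera depth from a known landmark separation:
    Z = f * X / x, using interpupillary distance as the known X.
    focal_px: focal length in pixels; real_ipd_m: assumed real-world
    eye separation in metres; pixel_ipd: measured separation in pixels."""
    return focal_px * real_ipd_m / pixel_ipd

# Assumed demographic prior: adult mean interpupillary distance ~0.063 m
depth = estimate_depth(focal_px=1000.0, real_ipd_m=0.063, pixel_ipd=21.0)
```

With the depth recovered, the pixel coordinates of the face can be back-projected to a full 3D position.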
Procedia PDF Downloads 137
2033 Global Winners versus Local Losers: Globalization, Identity and Tradition in Spanish Club Football
Authors: Jim O'Brien
Abstract:
Contemporary global representation and consumption of La Liga across a plethora of media platforms has had significant implications for the historical, political and cultural developments which shaped the development of Spanish club football. This has established and reinforced a hierarchy of a small number of teams belonging, or aspiring to belong, to a cluster of global elite clubs seeking to imitate the blueprint of the English Premier League in respect of corporate branding and marketing in order to secure a global fan base through success and exposure in La Liga itself and through the Champions League. The synthesis between globalization, global sport and the status of high-profile clubs has created radical change within the folkloric iconography of Spanish football. The main focus of this paper is to critically evaluate the consequences of globalization on the rich tapestry at the core of the game's distinctive history in Spain. The seminal debate underpinning the study considers whether the divergent aspects of globalization have acted as a malevolent force, eroding tradition, causing financial meltdown and reducing much of the fabric of club football to the status of bystanders, or have promoted a renaissance of these traditions, securing their legacies through new fans and audiences. The study draws on extensive sources on the history, politics and culture of Spanish football, in both English and Spanish. It also uses primary and archive material derived from interviews and fieldwork undertaken with scholars, media professionals and club representatives in Spain. The paper has four main themes. Firstly, it contextualizes the key historical, political and cultural forces which shaped the landscape of Spanish football from the late nineteenth century. The seminal notions of region, locality and cultural divergence are pivotal to this discourse.
The study then considers the relationship between football, ethnicity and identity as a barometer of continuity and change, suggesting that tradition is being reinvented and re-framed to reflect the shifting demographic and societal patterns within the Spanish state. Following on from this, consideration is given to the paradoxical function of 'El Clasico' and the dominant duopoly of the FC Barcelona - Real Madrid axis in both eroding tradition in the global nexus of football's commodification and in protecting historic political rivalries. To most global consumers of La Liga, the mega-spectacle and hyperbole of 'El Clasico' is the essence of Spanish football, with cultural misrepresentation and distortion catapulting the event to the global media audience. Finally, the paper examines La Liga as a sporting phenomenon in which elite clubs, cult managers and galacticos serve as commodities on the altar of mass consumption in football's global entertainment matrix. These processes accentuate a homogenous mosaic of cultural conformity which obscures local, regional and national identities and paradoxically fuses the global with the local to maintain the distinctive hue of La Liga, as witnessed by the extraordinary successes of Atletico Madrid and FC Eibar in recent seasons.
Keywords: Spanish football, globalization, cultural identity, tradition, folklore
Procedia PDF Downloads 300
2032 Using a Robot Companion to Detect and Visualize the Indicators of Dementia Progression and Quality of Life of People Aged 65 and Older
Authors: Jeoffrey Oostrom, Robbert James Schlingmann, Hani Alers
Abstract:
This document presents research into the indicators of dementia progression, the automation of quality of life assessments, and their visualization. To do this, the Smart Teddy project was initiated to make a smart companion that both monitors the senior citizen and processes the captured data into an insightful dashboard. With around 50 million diagnoses worldwide, dementia proves again and again to be a bothersome strain on the lives of many individuals, their relatives, and society as a whole. In 2015, it was estimated that dementia care cost 818 billion U.S. dollars globally. The Smart Teddy project aims to take away a portion of the burden from caregivers by automating the collection of certain data, like movement, geolocation, and sound levels. This paper shows that the Smart Teddy has the potential to become a useful tool for caregivers, though not a complete solution. The Smart Teddy still faces some problems in terms of emotional privacy, but its non-intrusive nature, as well as its diversity in usability, can make up for this.
Keywords: dementia care, medical data visualization, quality of life, smart companion
Procedia PDF Downloads 137
2031 Development of a Wind Resource Assessment Framework Using Weather Research and Forecasting (WRF) Model, Python Scripting and Geographic Information Systems
Authors: Jerome T. Tolentino, Ma. Victoria Rejuso, Jara Kaye Villanueva, Loureal Camille Inocencio, Ma. Rosario Concepcion O. Ang
Abstract:
Wind energy is rapidly emerging as a primary source of electricity in the Philippines, although developing an accurate wind resource model is difficult. In this study, the Weather Research and Forecasting (WRF) Model, an open-source mesoscale Numerical Weather Prediction (NWP) model, was used to produce a 1-year atmospheric simulation at 4 km resolution over the Ilocos Region of the Philippines. The annual mean wind speed data were extracted from the WRF output (netCDF) using a Python-based Graphical User Interface. Lastly, the wind resource assessment was produced using GIS software. Results of the study showed that Python scripts are more flexible than other post-processing tools for dealing with netCDF files. Using the WRF Model, Python, and Geographic Information Systems, a reliable wind resource map was produced.
Keywords: wind resource assessment, weather research and forecasting (WRF) model, python, GIS software
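A minimal sketch of the post-processing step described in the abstract above, assuming WRF-style U10/V10 wind-component arrays have already been read from the netCDF output (the variable shapes and the synthetic data here are assumptions, not the study's actual output):

```python
import numpy as np

def annual_mean_wind_speed(u10, v10):
    """Temporal mean 10 m wind speed from WRF-style U10/V10 arrays
    shaped (time, south_north, west_east); returns a 2-D resource map."""
    speed = np.hypot(u10, v10)   # instantaneous wind speed per grid cell
    return speed.mean(axis=0)    # average over the time axis

# Tiny synthetic stand-in for one year of hourly output on a 2x2 grid
rng = np.random.default_rng(0)
u = rng.normal(3.0, 1.0, size=(8760, 2, 2))
v = rng.normal(0.0, 1.0, size=(8760, 2, 2))
wmap = annual_mean_wind_speed(u, v)
```

The resulting 2-D array can then be exported (e.g., as GeoTIFF) for mapping in GIS software.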
Procedia PDF Downloads 440
2030 Simulation Study of the Microwave Heating of the Hematite and Coal Mixture
Authors: Prasenjit Singha, Sunil Yadav, Soumya Ranjan Mohantry, Ajay Kumar Shukla
Abstract:
The temperature distribution in hematite ore mixed with 7.5% coal was predicted by solving a 1-D heat conduction equation using an implicit finite difference approach. In this work, a square slab of 20 cm x 20 cm was considered, with the coal assumed to be uniformly mixed with the hematite ore. The equations were solved using MATLAB 2018a software. Heat transfer effects in this 1-D slab with convective and radiative boundary conditions are also considered. The temperature distribution inside the hematite slab was obtained by considering microwave heating time, thermal conductivity, heat capacity, carbon percentage, sample dimensions, and many other factors such as the penetration depth, permittivity, and permeability of the coal and hematite ore mixtures. The resulting temperature profile can be used as a guiding tool for optimizing the microwave-assisted carbothermal reduction process. The model of the hematite slab was extended to other dimensions as well, viz., 1 cm x 1 cm, 5 cm x 5 cm, 10 cm x 10 cm, and 20 cm x 20 cm. The model predictions are in good agreement with experimental results.
Keywords: hematite ore, coal, microwave processing, heat transfer, implicit method, temperature distribution
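A minimal sketch of one backward-Euler (implicit finite difference) step for the 1-D heat conduction equation of the kind described in the abstract above; the grid, time step, diffusivity, and fixed boundary temperatures are illustrative assumptions, not the paper's values, and the real model uses convective and radiative boundaries with a microwave source term:

```python
import numpy as np

def implicit_heat_step(T, alpha, dx, dt):
    """One backward-Euler step of T_t = alpha * T_xx on a 1-D grid,
    with the two boundary temperatures held fixed (Dirichlet)."""
    n = len(T)
    r = alpha * dt / dx**2
    A = np.zeros((n, n))
    A[0, 0] = A[-1, -1] = 1.0                       # boundaries held fixed
    for i in range(1, n - 1):                       # interior tridiagonal rows
        A[i, i - 1], A[i, i], A[i, i + 1] = -r, 1 + 2 * r, -r
    return np.linalg.solve(A, T)                    # unconditionally stable

# Slab initially at 300 K, surfaces held at 1000 K (illustrative heating)
T = np.full(21, 300.0)
T[0] = T[-1] = 1000.0
for _ in range(50):
    T = implicit_heat_step(T, alpha=1e-6, dx=0.01, dt=10.0)
```

Unlike an explicit scheme, the implicit step stays stable for any time step, which is why it suits stiff heating problems.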
Procedia PDF Downloads 165
2029 Early Detection of Lymphedema in Post-Surgery Oncology Patients
Authors: Sneha Noble, Rahul Krishnan, Uma G., D. K. Vijaykumar
Abstract:
Breast cancer-related lymphedema is a major problem that affects many women. Lymphedema is the swelling that generally occurs in the arms or legs, caused by the removal of or damage to lymph nodes as a part of cancer treatment. Treating it at the earliest possible stage is the best way to manage the condition and prevent it from leading to pain, recurrent infection, reduced mobility, and impaired function. This project therefore focuses on multi-modal approaches to identify the risks of lymphedema in post-surgical oncology patients and prevent it at the earliest stage. The Kinect IR sensor is utilized to capture images of the body, and after applying image processing techniques, the region of interest is obtained. Then, performing the voxelization method provides volume measurements in the pre-operative and post-operative periods. The formation of a mathematical model will help in the comparison of these values. Clinical and pathological data of patients will be investigated to assess the factors responsible for the development of lymphedema and its risks.
Keywords: Kinect IR sensor, lymphedema, voxelization, lymph nodes
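The voxelization step mentioned in the abstract above can be sketched as occupied-voxel counting over a depth-sensor point cloud (the voxel size and the synthetic point cloud are assumptions; in practice, comparing pre- and post-operative volumes would flag swelling):

```python
import numpy as np

def limb_volume(point_cloud, voxel=0.01):
    """Estimate volume from an (N, 3) point cloud in metres by counting
    distinct occupied voxels and scaling by the voxel volume (m^3)."""
    idx = np.floor(point_cloud / voxel).astype(int)
    occupied = {tuple(v) for v in idx}       # deduplicate voxel indices
    return len(occupied) * voxel ** 3

# Synthetic "arm segment": points at the centres of a 0.1 x 0.1 x 0.3 m box
pts = np.mgrid[0:10, 0:10, 0:30].reshape(3, -1).T * 0.01 + 0.005
vol = limb_volume(pts)   # ~0.003 m^3, i.e. about 3 litres
```

A post-operative scan yielding a noticeably larger volume for the same limb would indicate early-stage swelling.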
Procedia PDF Downloads 137
2028 Getting to Know the Types of Asphalt, Its Manufacturing and Processing Methods and Its Application in Road Construction
Authors: Hamid Fallah
Abstract:
Asphalt is generally a mixture of stone materials with continuous gradation and a binder, which is usually bitumen. Asphalt is made in different forms according to its use. The most familiar type of asphalt is hot asphalt, or hot asphalt concrete. Stone materials usually make up more than 90% of the asphalt mixture; therefore, stone materials have a significant impact on the quality of the resulting asphalt. According to the method of application and mixing, asphalt is divided into three categories: hot asphalt, protective asphalt, and cold asphalt. Cold mix asphalt is a mixture of stone materials and cutback bitumen or bitumen emulsion whose raw materials are mixed at ambient temperature. In some types of cold asphalt, the bitumen may be heated as necessary, but the other materials are mixed with the bitumen without heating. Protective asphalts are used to make the roadbed impermeable, increase its abrasion and sliding resistance, and also temporarily improve existing asphalt and concrete surfaces. This type of paving is very economical compared to hot asphalt due to the speed and ease of implementation and the limited need for asphalt machines and equipment. The present article, prepared in descriptive library form, introduces asphalt, its types, its characteristics, and its applications.
Keywords: asphalt, type of asphalt, asphalt concrete, sulfur concrete, bitumen in asphalt, sulfur, stone materials
Procedia PDF Downloads 65
2027 Municipalities as Enablers of Citizen-Led Urban Initiatives: Possibilities and Constraints
Authors: Rosa Nadine Danenberg
Abstract:
In recent years, bottom-up urban development has started growing as an alternative to conventional top-down planning. In large proportions, citizens and communities initiate small-scale interventions, which suddenly seem to form a trend. As a result, more and more cities are witnessing not only the growth of but also an interest in these initiatives, as they bear the potential to reshape urban spaces. Such alternative city-making efforts cause new dynamics in urban governance, with inevitable consequences for controlled city planning and its administration. The emergence of enabling relationships between top-down and bottom-up actors signals an increasingly common urban practice. Various case studies show that an enabling relationship is possible; yet how it can be optimally realized remains rather underexamined. Therefore, the seemingly growing worldwide phenomenon of 'municipal bottom-up urban development' necessitates an adequate governance structure. As such, the aim of this research is to contribute knowledge on how municipalities can enable citizen-led urban initiatives from a governance innovation perspective. Empirical case-study research in Stockholm and Istanbul, derived from interviews with founders of four citizen-led urban initiatives and one municipal representative in each city, provided valuable insights into the possibilities and constraints of enabling practices. On the one hand, diverging outcomes emphasize the extreme oppositional features of the two cases (Stockholm and Istanbul). Firstly, the two cities' characteristics are drastically different. Secondly, the ideologies and motifs for the initiatives to emerge vary widely. Thirdly, the major constraints for citizen-led urban initiatives in relating to the municipality are considerably different. Two types of municipal organizational structures produce different underlying mechanisms which give rise to these constraints. The first municipal organizational structure is steered by bureaucracy (Stockholm).
It produces an administrative division that brings up constraints such as a lack of responsibility, transparency, and continuity by municipal representatives. The second structure is dominated by municipal politics and governmental hierarchy (Istanbul). It produces informality, a lack of transparency, and a fragmented civil society. In order to cope with the constraints produced by both types of organizational structures, the initiatives have adjusted their organization to the municipality's underlying structures. On the other hand, this paper has in fact also come to a rather unifying conclusion. Interestingly, the suggested possibilities for an enabling relationship underline converging new urban governance arrangements. This could imply that for the two varying types of municipal organizational structures there is an accurate governance structure. Namely, the combination of a neighborhood council with a municipal guide, with allowance for the initiatives to adopt a politicizing attitude, is found to coincide across both cases. Especially this combination appears key to redeeming the varying constraints. A municipal guide steers the initiatives through bureaucratic struggles, is supported by co-production methods, and balances out municipal politics. Next, a neighborhood council that is politically neutral and run by local citizens can function as an umbrella for citizen-led urban initiatives. What is crucial is that it should cater for a more entangled relationship between municipalities and initiatives, with enhanced involvement of the initiatives in decision-making processes and limited involvement of the prevailing constraints pointed out in this research.
Keywords: bottom-up urban development, governance innovation, Istanbul, Stockholm
Procedia PDF Downloads 217
2026 Diversity in Finance Literature Revealed through the Lens of Machine Learning: A Topic Modeling Approach on Academic Papers
Authors: Oumaima Lahmar
Abstract:
This paper aims to define a structured topography for finance researchers seeking to navigate the body of knowledge in their exploration of finance phenomena. To make sense of the finance literature, a probabilistic topic modeling approach is applied to 6,000 abstracts of academic articles published in three top finance journals between 1976 and 2020. This approach combines machine learning techniques and natural language processing to statistically identify the conjunctions between research articles and their shared topics, each described by relevant keywords. The topic modeling analysis reveals 35 coherent topics that depict the finance literature well and provide a comprehensive structure for ongoing research themes. Comparing the extracted topics to the Journal of Economic Literature (JEL) classification system highlights a significant similarity between the characterizing keywords. On the other hand, we identify topics that do not match the JEL classification despite being relevant in the finance literature.
Keywords: finance literature, textual analysis, topic modeling, perplexity
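The keyword comparison described above can be scored with a simple set-similarity measure. The sketch below uses Jaccard similarity and hypothetical keyword lists (the abstract does not specify the measure or the actual topic keywords):

```python
# Hypothetical keyword sets: illustrative only, not the paper's actual topics.
def jaccard(a, b):
    """Jaccard similarity between two keyword sets: |A ∩ B| / |A ∪ B|."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 0.0

# One extracted topic (characterizing keywords) vs. two JEL-style categories.
topic = ["asset", "pricing", "risk", "return", "portfolio"]
jel_g12 = ["asset", "pricing", "futures", "return"]   # G12: Asset Pricing
jel_e44 = ["money", "credit", "interest"]             # E44: Financial Markets

scores = {name: jaccard(topic, kws)
          for name, kws in [("G12", jel_g12), ("E44", jel_e44)]}
best = max(scores, key=scores.get)
print(best, round(scores[best], 3))
```

A topic whose best score stays near zero against every JEL category would be one of the "non-matching" topics the abstract mentions.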
Procedia PDF Downloads 169
2025 Artificial Intelligence for Safety Related Aviation Incident and Accident Investigation Scenarios
Authors: Bernabeo R. Alberto
Abstract:
With the tremendous improvements in the processing power of computers, artificial intelligence will increasingly be used in aviation, making autonomous flights, preventive maintenance, ATM (Air Traffic Management) optimization, and the training of pilots, cabin crew, ground staff, and airport staff possible in a cost-saving, less time-consuming, and less polluting way. Through the use of artificial intelligence, we foresee an interviewing scenario in which the interviewee interacts with an artificial intelligence tool that contextualizes the character and the necessary information in a way that aligns reasonably with the character and the scenario. We are creating simulated scenarios connected with aviation incidents or accidents to also enhance the training of future accident and incident investigators, integrating artificial intelligence and augmented reality tools. The project's goal is to improve the learning and teaching scenario through academic and professional expertise in aviation and in the field of artificial intelligence. We thus intend to contribute to the needed innovation capacity, skills development, and management of artificial intelligence, supported by appropriate regulations and attention to ethical problems.
Keywords: artificial intelligence, aviation accident, aviation incident, risk, safety
Procedia PDF Downloads 20
2024 A Review on the Adoption and Acculturation of Digital Technologies among Farmers of Haryana State
Authors: Manisha Ohlan, Manju Dahiya
Abstract:
The present study was conducted in the Karnal, Rohtak, and Jhajjar districts of Haryana state, covering 360 respondents. Results showed that 42.78 percent of the respondents had above-average knowledge at the preparation stage, 48.33 percent had high knowledge at the production stage, and 37.22 percent had average knowledge at the processing stage regarding the usage of digital technologies. Nearly half of the respondents (47.50%) agreed with the usage of digital technologies, followed by those who strongly agreed (19.45%) and those who strongly disagreed (14.45%). A significant and positive relationship was found between the independent variables and knowledge of digital technologies at the 5 percent level of significance; therefore, the null hypothesis cannot be rejected. All the dependent variables, including knowledge and attitude, showed a significant and positive relationship, with z-values between -1.96 and +1.96 at the 5 percent level of significance; the data thus falls within the acceptance region, which is why the null hypothesis is accepted.
Keywords: knowledge, attitude, digital technologies, significant, positive relationship
Procedia PDF Downloads 92
2023 Design of a Controlled BHJ Solar Cell Using Modified Organic Vapor Spray Deposition Technique
Authors: F. Stephen Joe, V. Sathya Narayanan, V. R. Sanal Kumar
Abstract:
A comprehensive review of the literature on photovoltaic cells was carried out to explore cost-efficient technologies for future solar cell applications. The literature review reveals that Bulk Heterojunction (BHJ) polymer solar cells offer special opportunities as renewable energy resources. It is evident from previous studies that a device fabricated with a TiOx layer shows better power conversion efficiency than one without. In this paper, the authors designed a controlled BHJ solar cell using a modified organic vapor spray deposition technique, facilitated by a vertically moving gun and named the 'Stephen Joe Technique', to obtain a desirable surface pattern over the substrate and thereby improve efficiency for industrial applications. We conclude that efficient processing and interface engineering of these solar cells could increase the efficiency by up to 5-10%.
Keywords: BHJ polymer solar cell, photovoltaic cell, solar cell, Stephen Joe technique
Procedia PDF Downloads 541
2022 Geographic Information System for Simulating Air Traffic By Applying Different Multi-Radar Positioning Techniques
Authors: Amara Rafik, Mostefa Belhadj Aissa
Abstract:
Radar data is one of the many data sources used by ATM (Air Traffic Management) systems. These data come from air navigation radar antennas, which intercept signals emitted by the various aircraft crossing the controlled airspace, calculate the positions of these aircraft, and retransmit them to the ATM system. For greater reliability, the radars are positioned so that their coverage areas overlap; an aircraft will therefore be detected by at least one of them. However, the position coordinates of the same aircraft sent by these different radars are not necessarily identical. The ATM system must therefore calculate a single position (the radar track), which is ultimately sent to the control position and displayed on the air traffic controller's monitor. Several techniques exist for calculating the radar track. Furthermore, the geographical nature of the problem requires the use of a Geographic Information System (GIS), i.e., a geographical database on the one hand and geographical processing on the other. The objective of this work is to propose a GIS for traffic simulation that reconstructs the evolution of aircraft positions over time from a multi-source radar data set, applying these different techniques.
Keywords: ATM, GIS, radar data, simulation
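One common track-calculation technique is to fuse the plots reported by overlapping radars into a single position with an inverse-variance weighted average. The sketch below is illustrative only; the coordinates and radar accuracies are hypothetical, and the paper compares several techniques rather than prescribing this one:

```python
# Illustrative sketch of one simple multi-radar fusion technique: a weighted
# average of plots, weighting each radar by the inverse of its variance.
# Radar accuracies and positions are hypothetical, not from the paper.
def fuse_track(plots):
    """plots: list of (x, y, variance) from radars seeing the same aircraft."""
    wsum = sum(1.0 / var for _, _, var in plots)
    x = sum(px / var for px, _, var in plots) / wsum
    y = sum(py / var for _, py, var in plots) / wsum
    return x, y

# Three overlapping radars report slightly different positions (km).
plots = [(100.0, 200.0, 1.0),   # accurate radar, low variance
         (100.4, 199.6, 4.0),   # less accurate
         (99.8, 200.2, 2.0)]
x, y = fuse_track(plots)
print(round(x, 2), round(y, 2))
```

The fused track leans toward the most accurate radar, which is the intended behavior of variance weighting.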
Procedia PDF Downloads 116
2021 Enhancing Code Security with AI-Powered Vulnerability Detection
Authors: Zzibu Mark Brian
Abstract:
As software systems become increasingly complex, ensuring code security is a growing concern. Traditional vulnerability detection methods often rely on manual code reviews or static analysis tools, which can be time-consuming and prone to errors. This paper presents a distinct approach to enhancing code security by leveraging artificial intelligence (AI) and machine learning (ML) techniques. Our proposed system uses a combination of natural language processing (NLP) and deep learning algorithms to identify and classify vulnerabilities in real-world codebases. By analyzing vast amounts of open-source code, our AI-powered tool learns to recognize patterns and anomalies indicative of security weaknesses. We evaluated the system on a dataset of over 10,000 open-source projects, achieving an accuracy of 92% in detecting known vulnerabilities. Furthermore, the tool identified previously unknown vulnerabilities in popular libraries and frameworks, demonstrating its potential for improving software security.
Keywords: AI, machine learning, code security
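For a sense of the signals such a tool learns to flag, the minimal sketch below uses hand-written regular-expression rules rather than the NLP/deep-learning model the abstract describes; the rule names, patterns, and snippet are illustrative only:

```python
import re

# A deliberately simple rule-based scan: NOT the paper's ML model, only an
# illustration of the code patterns such a classifier learns to recognize.
RULES = {
    "sql-injection": re.compile(r"execute\(.*%s.*\)|execute\(.*\+.*\)"),
    "hardcoded-secret": re.compile(r"(password|api_key)\s*=\s*['\"]\w+['\"]", re.I),
    "unsafe-eval": re.compile(r"\beval\("),
}

def scan(source):
    """Return the names of rules whose pattern occurs in the source text."""
    return sorted(name for name, rx in RULES.items() if rx.search(source))

snippet = 'api_key = "abc123"\ncursor.execute("SELECT * FROM t WHERE id=" + uid)'
print(scan(snippet))
```

An ML-based system generalizes beyond fixed rules like these, which is how it can surface the previously unknown vulnerabilities the abstract reports.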
Procedia PDF Downloads 35
2020 Genodata: The Human Genome Variation Using BigData
Authors: Surabhi Maiti, Prajakta Tamhankar, Prachi Uttam Mehta
Abstract:
Since the completion of the Human Genome Project, there has been an unparalleled escalation in the sequencing of genomic data. The project was the first major vault in the field of medical research, especially in genomics, and it won accolades by using a concept called BigData, which had earlier been used extensively to create business value. BigData involves data sets in files of terabytes, petabytes, or exabytes, whereas such data sets were traditionally managed using spreadsheets and RDBMSs. The sheer volume of data made the process tedious and time-consuming, and hence a stronger framework, Hadoop, was introduced in the genetic sciences to make data processing faster and more efficient. This paper focuses on using SPARK, which is gaining momentum with the advancement of BigData technologies. Cloud storage is an effective medium for storing the large data sets generated by genetic research and the result sets produced by SPARK analysis.
Keywords: human genome project, BigData, genomic data, SPARK, cloud storage, Hadoop
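The Spark-style processing the abstract refers to follows the classic map/reduce idiom. The toy sketch below reproduces that idiom in plain Python on fabricated variant records, standing in for the work SPARK would distribute across a cluster:

```python
from collections import Counter
from functools import reduce

# A toy map/reduce pass over variant records, mimicking the Spark idiom the
# paper applies at terabyte scale. The records are fabricated for illustration.
records = [
    "chr1:12345 A>G", "chr1:12345 A>G", "chr2:67890 C>T",
    "chr1:12345 A>G", "chr2:67890 C>T", "chrX:11111 G>A",
]

# map: each record -> (variant, 1); reduce: sum the counts per variant key.
mapped = ((rec, 1) for rec in records)
counts = reduce(lambda acc, kv: acc.update({kv[0]: kv[1]}) or acc,
                mapped, Counter())
print(counts.most_common(1))
```

In real SPARK code the same pipeline would be expressed with `map` and `reduceByKey` over a resilient distributed dataset rather than an in-memory generator.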
Procedia PDF Downloads 258
2019 Email Phishing Detection Using Natural Language Processing and Convolutional Neural Network
Abstract:
Phishing is one of the oldest and best-known scams on the Internet. It can be defined as any type of telecommunications fraud that uses social engineering tricks to obtain confidential data from its victims; it is a cybercrime aimed at stealing sensitive information. Phishing is generally carried out via email, with scammers impersonating large companies or other trusted entities to encourage victims to voluntarily provide information such as login credentials or, worse yet, credit card numbers. The COVID-19 theme has been used by cybercriminals in multiple malicious campaigns, including phishing. In this environment, message filtering solutions have become essential to protect devices that are now used outside the secure perimeter. Despite constantly updated defenses against these cyberattacks, current results are insufficient. Many researchers are looking for optimal solutions to filter phishing emails, but better results are still needed. In this work, we concentrate on the problem of detecting phishing emails using the different steps of NLP preprocessing, and we propose and train a model using a one-dimensional CNN. Our results show that the model achieves an accuracy of 99.99%, demonstrating how well it works.
Keywords: phishing, e-mail, NLP preprocessing, CNN, e-mail filtering
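The NLP preprocessing steps mentioned above can be sketched as follows. The exact pipeline (lowercasing, URL and number masking, tokenization, stop-word removal) and the tiny stop-word list are assumptions for illustration, not the authors' published configuration:

```python
import re

# A minimal sketch of NLP preprocessing before text is fed to a 1-D CNN.
# The stop-word list is a tiny illustrative sample, not a standard list.
STOP = {"the", "a", "to", "your", "is", "has", "been", "at", "within"}

def preprocess(email):
    text = email.lower()
    text = re.sub(r"https?://\S+", "<url>", text)   # mask links
    text = re.sub(r"\d+", "<num>", text)            # mask numbers
    tokens = re.findall(r"<url>|<num>|[a-z]+", text)
    return [t for t in tokens if t not in STOP]

msg = "Your account has been LOCKED. Verify at http://evil.example/login within 24 hours"
print(preprocess(msg))
```

In a full pipeline the resulting tokens would be mapped to integer indices and padded to a fixed length before being passed to the CNN's embedding layer.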
Procedia PDF Downloads 124
2018 The Grammatical Dictionary Compiler: A System for Kartvelian Languages
Authors: Liana Lortkipanidze, Nino Amirezashvili, Nino Javashvili
Abstract:
The purpose of a grammatical dictionary is to provide information on the morphological and syntactic characteristics of the base word in a dictionary entry. Electronic grammatical dictionaries are used as tools for automated morphological analysis in text processing. The Georgian Grammatical Dictionary should contain grammatical information for each word: part of speech, type of declension/conjugation, grammatical forms of the word (paradigm), and alternative variants of the base word/lemma. In this paper, we present a system for compiling the Georgian Grammatical Dictionary automatically. We propose dictionary-based methods for extending grammatical lexicons. The input lexicon contains only a small number of words with identical grammatical features. The extension is based on similarity measures between features of words; more precisely, we add to the extended lexicons words that are similar to those already in the grammatical dictionary. Our dictionaries are corpus-based, and for the compilation we introduce a method for lemmatizing unknown words, i.e., words for which neither the full form nor the lemma is in the grammatical dictionary.
Keywords: acquisition of lexicon, Georgian grammatical dictionary, lemmatization rules, morphological processor
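The similarity idea behind the lexicon extension can be illustrated with a language-neutral toy: an unknown word inherits the paradigm of the known word with which it shares the longest ending. The English-like lexicon below is hypothetical; the actual system works on Georgian morphology with richer feature comparisons:

```python
# A language-neutral sketch of similarity-based lemmatization: an unknown
# word is assigned the paradigm of the known word sharing the longest suffix.
# The toy lexicon and words are hypothetical, not Georgian data.
LEXICON = {                 # word -> (lemma, paradigm class)
    "walked": ("walk", "regular-ed"),
    "walking": ("walk", "regular-ing"),
    "boxes": ("box", "plural-es"),
}

def shared_suffix_len(a, b):
    n = 0
    while n < min(len(a), len(b)) and a[-1 - n] == b[-1 - n]:
        n += 1
    return n

def guess_paradigm(unknown):
    """Pick the lexicon entry with the longest shared suffix and reuse its split."""
    best = max(LEXICON, key=lambda w: shared_suffix_len(unknown, w))
    lemma, cls = LEXICON[best]
    cut = len(best) - len(lemma)    # how many characters the known suffix takes
    return unknown[:-cut] if cut else unknown, cls

print(guess_paradigm("jumped"))
```

Here "jumped" matches "walked" on the ending "-ed", so it is assigned the lemma "jump" and the same paradigm class.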
Procedia PDF Downloads 142
2017 Small Text Extraction from Documents and Chart Images
Authors: Rominkumar Busa, Shahira K. C., Lijiya A.
Abstract:
Text recognition is an important area of computer vision that deals with detecting and recognizing text in images. Optical Character Recognition (OCR) is a saturated area these days, with very good text recognition accuracy. However, when the same OCR methods are applied to text with small font sizes, such as the text in chart images, the recognition rate falls below 30%. This work aims to extract small text from images using a deep learning model, CRNN with CTC loss. Text recognition accuracy is found to improve when image enhancement by super-resolution is applied before the CRNN model. We further observe that the recognition rate increases by 18% with the proposed method, which combines super-resolution and character segmentation followed by CRNN with CTC loss. The effectiveness of the proposed method suggests that further pre-processing of chart-image text and other small-text images will improve accuracy further, thereby helping text extraction from chart images.
Keywords: small text extraction, OCR, scene text recognition, CRNN
Procedia PDF Downloads 122
2016 Design of Wireless Readout System for Resonant Gas Sensors
Authors: S. Mohamed Rabeek, Mi Kyoung Park, M. Annamalai Arasu
Abstract:
This paper presents the design of a wireless readout system for tracking the frequency shift of a polymer-coated piezoelectric microelectromechanical resonator due to gas absorption. The magnitude of this frequency shift indicates the percentage of a particular gas the sensor is exposed to. It is measured using an oscillator and an FPGA-based frequency counter, with the resonator employed as the frequency-determining element of the oscillator. The system consists of a Gas Sensing Wireless Readout (GSWR) and a USB Wireless Transceiver (UWT). The GSWR consists of an oscillator based on a trans-impedance sustaining amplifier, an FPGA-based frequency readout, a sub-1 GHz wireless transceiver, and a microcontroller. The UWT can be plugged into a computer via the USB port and functions as a wireless module that transfers gas sensor data from the GSWR to the computer. A GUI program running on the computer periodically polls for sensor data over the UWT-GSWR wireless link; the response from the GSWR is logged in a file for post-processing as well as displayed on screen.
Keywords: gas sensor, GSWR, micromechanical system, UWT, volatile emissions
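The readout arithmetic implied above, counting resonator cycles over a fixed gate time and mapping the downshift from baseline to a gas level, can be sketched as follows. The gate time, baseline frequency, and sensitivity figures are assumed for illustration, not taken from the paper:

```python
# A sketch of the readout arithmetic: the FPGA counter measures resonator
# frequency over a fixed gate time; the shift from the unloaded baseline maps
# to a gas level. All numeric constants below are hypothetical.
GATE_TIME_S = 0.1           # counter gate time
BASELINE_HZ = 1_000_000.0   # unloaded resonator frequency
SENSITIVITY = 25.0          # Hz of downshift per ppm of analyte (assumed)

def measure_frequency(edge_count):
    """Rising edges counted during the gate window -> frequency in Hz."""
    return edge_count / GATE_TIME_S

def gas_ppm(edge_count):
    shift = BASELINE_HZ - measure_frequency(edge_count)
    return shift / SENSITIVITY

# 99,950 edges in 0.1 s -> 999,500 Hz -> 500 Hz downshift from baseline
print(round(gas_ppm(99_950), 3))
```

In the actual system this computation would run on the FPGA/microcontroller side before the result is sent over the wireless link.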
Procedia PDF Downloads 482
2015 Multifunctional Nanofiber Based Aerogels: Bridging Electrospinning with Aerogel Fabrication
Authors: Tahira Pirzada, Zahra Ashrafi, Saad Khan
Abstract:
We present a facile and sustainable solid-templating approach to fabricate highly porous, flexible, and superhydrophobic aerogels from composite nanofibers of cellulose diacetate and silica produced through sol-gel electrospinning. Scanning electron microscopy, contact angle measurement, and attenuated total reflection Fourier transform infrared spectrometry are used to characterize the structural features of the resulting aerogels, while thermogravimetric analysis and differential scanning calorimetry demonstrate their thermal stability. These aerogels exhibit a self-supporting three-dimensional network abundant in large secondary pores surrounded by primary pores, resulting in a highly porous structure. Thermal crosslinking further stabilizes the structure and flexibility of the aerogels without compromising porosity. Ease of processing, thermal stability, high porosity, and the oleophilic nature of these aerogels make them promising candidates for a wide variety of applications, including acoustic and thermal insulation and oil-water separation.
Keywords: hybrid aerogels, sol-gel electrospinning, oil-water separation, nanofibers
Procedia PDF Downloads 156
2014 Semantic Textual Similarity on Contracts: Exploring Multiple Negative Ranking Losses for Sentence Transformers
Authors: Yogendra Sisodia
Abstract:
Researchers are becoming more interested in extracting useful information from legal documents thanks to the development of large-scale language models in natural language processing (NLP), and deep learning has accelerated the creation of powerful text mining models. Legal fields such as contracts benefit greatly from semantic text search, since it makes finding related clauses quick and easy. Once sentence embeddings have been computed, it is relatively simple to locate sentences with a comparable meaning across the entire legal corpus. The author investigated two pre-trained language models for this task, MiniLM and RoBERTa, and further fine-tuned them on legal contracts, using Multiple Negative Ranking Loss to create the sentence transformers. The fine-tuned language models and sentence transformers showed promising results.
Keywords: legal contracts, multiple negative ranking loss, natural language inference, sentence transformers, semantic textual similarity
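Once sentence embeddings exist, the clause search described above reduces to ranking by cosine similarity. The sketch below uses toy 3-dimensional vectors in place of real sentence-transformer embeddings; the clause texts and vector values are fabricated for illustration:

```python
import math

# A sketch of the retrieval step after embeddings exist: rank contract
# clauses by cosine similarity to a query embedding. The 3-d vectors are
# toy stand-ins for real sentence-transformer embeddings.
def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    return dot / (math.hypot(*u) * math.hypot(*v))

clauses = {
    "termination for convenience": [0.9, 0.1, 0.0],
    "payment within 30 days":      [0.1, 0.9, 0.2],
    "early termination fee":       [0.8, 0.2, 0.1],
}
query = [0.85, 0.15, 0.05]   # embedding of "ending the contract early"

ranked = sorted(clauses, key=lambda c: cosine(query, clauses[c]), reverse=True)
print(ranked[0])
```

Multiple Negative Ranking Loss trains the encoder so that each query lands close to its matching clause and far from the other clauses in the batch, which is exactly what makes this ranking step effective.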
Procedia PDF Downloads 105
2013 Multiobjective Optimization of Wastewater Treatment by Electrochemical Process
Authors: Malek Bendjaballah, Hacina Saidi, Sarra Hamidoud
Abstract:
The aim of this study is to model and optimize the performance of a new electrocoagulation (EC) process for the treatment of wastewater, as well as its energy consumption, in order to extrapolate it to the industrial scale. Through judicious application of a design of experiments (DOE), it was possible to evaluate the individual effects and interactions that significantly influence both objective functions (maximizing efficiency and minimizing energy consumption), using aluminum electrodes as the sacrificial anode. Preliminary experiments showed that the pH of the medium, the applied potential, and the EC treatment time are the main parameters. A 3³ factorial design was adopted to model performance and energy consumption. Under optimal conditions, the pollution reduction efficiency is 93%, combined with a minimum energy consumption of 2.60x10⁻³ kWh/mg-COD. The applied potential or current, the treatment time, and their interaction were the most influential parameters in the mathematical models obtained. The modeling results also correlated well with the experimental ones. These results offer promising opportunities for developing a clean process and inexpensive technology to eliminate or reduce wastewater pollution.
Keywords: electrocoagulation, green process, experimental design, optimization
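The 3³ factorial design mentioned above enumerates every combination of three levels for the three main parameters (pH, applied potential, treatment time), giving 27 runs. The level values in the sketch below are illustrative, not the study's actual settings:

```python
from itertools import product

# A sketch of a 3^3 full factorial design: every combination of three levels
# for three factors. Level values are hypothetical, not from the study.
levels = {
    "pH": [4, 7, 10],
    "potential_V": [5, 10, 15],
    "time_min": [10, 20, 30],
}

runs = [dict(zip(levels, combo)) for combo in product(*levels.values())]
print(len(runs), runs[0])
```

Each run would then be executed and its removal efficiency and energy consumption recorded, giving the data needed to fit the two response models.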
Procedia PDF Downloads 95
2012 A Simple and Easy-To-Use Tool for Detecting Outer Contour of Leukocytes Based on Image Processing Techniques
Authors: Retno Supriyanti, Best Leader Nababan, Yogi Ramadhani, Wahyu Siswandari
Abstract:
Blood cell morphology is an important parameter in hematology testing. Currently, in developing countries, much hematology work is done manually, either by physicians or by laboratory staff. Given the limitations of the human eye, manual examination results in lower precision and accuracy. In addition, manual hematology testing further complicates diagnosis in areas that lack competent medical personnel. This research aims to develop a simple computer-based tool for detecting blood cell morphology. In this paper, we focus on detecting the outer contour of leukocytes. The results show that the system we developed is promising for detecting blood cell morphology automatically. By implementing this method, we expect the problems of accuracy, precision, and the limited availability of medical staff to be addressed.
Keywords: morphology operation, developing countries, hematology test, limitation of medical personnel
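A minimal version of outer-contour detection can be sketched on a toy binary image: a foreground pixel is on the contour if at least one of its 4-neighbours is background. Real leukocyte images would first require thresholding/segmentation; the grid below is fabricated, and the paper's actual morphology operations may differ:

```python
# A sketch of outer-contour detection on a toy binary image: a foreground
# pixel belongs to the contour if any 4-neighbour is background (or lies
# outside the image). The grid is fabricated for illustration.
IMG = [
    [0, 0, 0, 0, 0],
    [0, 1, 1, 1, 0],
    [0, 1, 1, 1, 0],
    [0, 1, 1, 1, 0],
    [0, 0, 0, 0, 0],
]

def contour(img):
    h, w = len(img), len(img[0])
    edge = []
    for r in range(h):
        for c in range(w):
            if img[r][c] and any(
                not (0 <= r + dr < h and 0 <= c + dc < w and img[r + dr][c + dc])
                for dr, dc in [(-1, 0), (1, 0), (0, -1), (0, 1)]
            ):
                edge.append((r, c))
    return edge

pts = contour(IMG)
print(len(pts))   # every pixel of the 3x3 block except its centre
```

On a real segmented leukocyte mask, the same test yields the cell's outer boundary pixels, from which shape parameters can be measured.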
Procedia PDF Downloads 335