Search results for: prewitt edge detection algorithm
3566 Effective Corporate Image Management as a Strategy for Enhancing Profitability
Authors: Shola Haruna Adeosun, Ajoke F. Adebiyi
Abstract:
Business organizations in Nigeria have failed to realize the role of a good corporate image policy in business dealings. This is probably because they do not understand the concept of corporate image and the necessary tools for promoting it. Corporate image goes beyond attractive products, quality services, advertising, and paying good salaries: it pervades every aspect of a business concern, from the least worker’s personality to dealings within the organization and with the larger society. In the face of societal dynamics brought by technology, especially in the business world, companies face such stiff competition that maintaining a competitive edge requires aggressive strategies. One such strategy in effective corporate image management is promotion. This study investigates the strategies that could be deployed to build and promote an effective corporate image, as well as enhance the profit margins of an organization, using Phinomar Nigeria Limited, Ngwo, as a case study. The study reveals that Phinomar Nigeria Limited has a laid-down corporate image policy that is not effectively managed; that the strategies deployed to promote the corporate image are limited; and that responses to Phinomar products are fairly high. It therefore suggests that the products are profitable but require periodic improvement in employee welfare and the work environment, as well as a widening of the scope of Phinomar’s social responsibility.
Keywords: corporate image, effective, enhancing, management, profitability, strategy
Procedia PDF Downloads 316
3565 Assessment of Metal and Nano-Metal Doped TiO₂ Nanoparticles for Photocatalytic Degradation of Methylene Blue in Almeda Textile Industry, Tigray, Ethiopia
Authors: Mulugeta Gurum Gerechal
Abstract:
Nowadays, the photocatalytic mechanism of water purification using nanoparticles has gained wider acceptance. For this purpose, crystalline N-TiO₂ and Ag-TiO₂ were prepared from TiCl₄, urea, NH₄OH, and AgNO₃ by the sol-gel method and a simple solid-phase reaction, followed by calcination at a temperature of 400 °C for 4 h each. The synthesized photocatalysts were characterized using XRD, SEM, and UV-visible diffuse reflectance spectra. It was found that the absorption edge of N-TiO₂ shifted into the visible region more efficiently than that of Ag-TiO₂. XRD analysis showed that the particle size of N-TiO₂ is smaller than that of Ag-TiO₂. The effects of catalyst loading and calcination temperature on the photocatalytic efficiency of the prepared samples were tested using methylene blue as a target pollutant. The photocatalytic degradation efficiency for methylene blue increased from 57.05 to 96.02% under solar radiation as the amount of catalyst increased from 0.15 to 0.45 g for N-TiO₂. Similarly, the photocatalytic degradation of methylene blue increased from 40.32 to 81.21% as the amount of Ag-TiO₂ increased from 0.05 g to 0.1 g. In addition, the degradation efficiency for the removal of methylene blue increased from 58.00 to 98.00% and from 47.00 to 81.21% under solar radiation as the calcination temperature increased from 300 to 500 °C for N-TiO₂ and from 300 to 400 °C for Ag-TiO₂, respectively. However, a further increase in catalyst loading and calcination temperature was found to decrease the degradation efficiency.
Keywords: photocatalysis, degradation, nanoparticles, catalyst loading, calcination, methylene blue
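The removal percentages quoted above follow the standard degradation-efficiency formula, computed from the initial and residual dye concentrations (usually inferred from UV-vis absorbance, a detail the abstract does not spell out). A minimal sketch:

```python
def degradation_efficiency(c0, ct):
    """Percent of dye removed, from initial (c0) and residual (ct) concentration.

    Concentrations are typically inferred from UV-vis absorbance at the dye's
    peak wavelength (about 664 nm for methylene blue), since absorbance is
    proportional to concentration (Beer-Lambert law).
    """
    if c0 <= 0:
        raise ValueError("initial concentration must be positive")
    return (c0 - ct) / c0 * 100.0

# Example: residual dye concentration drops from 10 mg/L to 0.4 mg/L.
print(round(degradation_efficiency(10.0, 0.4), 2))  # → 96.0
```

The same formula reproduces the ranges reported above once the measured concentration pairs are substituted.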
Procedia PDF Downloads 68
3564 Leveraging Large Language Models to Build a Cutting-Edge French Word Sense Disambiguation Corpus
Authors: Mouheb Mehdoui, Amel Fraisse, Mounir Zrigui
Abstract:
With the increasing amount of data circulating over the Web, there is a growing need to develop and deploy tools aimed at unraveling semantic nuances within texts or sentences. The challenge in extracting precise meanings arises from the complexity of natural language, as words usually have multiple interpretations depending on the context. Precisely interpreting words within a given context is what the task of Word Sense Disambiguation (WSD) addresses. It is a long-standing problem within Natural Language Processing, aimed at determining the meaning a word carries in a particular context, hence increasing the correctness of applications that process language. Numerous linguistic resources are accessible online, including WordNet, thesauri, and dictionaries, enabling exploration of diverse contextual meanings. However, several limitations persist. These include the scarcity of resources for certain languages, the limited number of examples within corpora, and the difficulty of accurately detecting the topic or context covered by a text, which significantly impacts word sense disambiguation. This paper discusses the different approaches to WSD and reviews the corpora available for this task. We contrast these approaches, highlighting their limitations, which will allow us to build a French corpus targeted at WSD.
Keywords: semantic enrichment, disambiguation, context fusion, natural language processing, multilingual applications
Procedia PDF Downloads 21
3563 Cooperative Agents to Prevent and Mitigate Distributed Denial of Service Attacks of Internet of Things Devices in Transportation Systems
Authors: Borhan Marzougui
Abstract:
The Road and Transport Authority (RTA) is moving ahead with the implementation of the leader’s vision by exploring all avenues that may bring better security and safety services to the community. Smart transport means using smart technologies such as the IoT (Internet of Things). This technology continues to affirm its important role in the context of information and transportation systems. In fact, the IoT is a network of Internet-connected objects able to collect and exchange different data using embedded sensors. With the growth of the IoT, Distributed Denial of Service (DDoS) attacks are also growing exponentially. DDoS attacks are a major and very real threat to various transportation services. Current defense mechanisms are mainly passive in nature, and there is a need to develop a smart technique to handle them. In fact, new IoT devices are being recruited into botnets that DDoS attackers accumulate for their own purposes. The aim of this paper is to provide a relevant understanding of the dangerous types of DDoS attacks related to the IoT and to provide valuable guidance for future IoT security methods. Our methodology is based on the development of a distributed algorithm. This algorithm coordinates dedicated intelligent and cooperative agents to prevent and mitigate DDoS attacks. The proposed technique ensures preventive action when malicious packets start to be distributed through the connected nodes (the network of IoT devices). In addition, devices such as cameras and radio frequency identification (RFID) tags are connected within the secured network, and the data they generate are analyzed in real time by the intelligent and cooperative agents. The proposed security system is based on a multi-agent system. The results obtained show a significant reduction in the number of infected devices and enhanced capabilities of the various security devices.
Keywords: IoT, DDoS, attacks, botnet, security, agents
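The abstract does not detail the agents’ detection logic, so the following is only a hypothetical sketch of the kind of per-node monitoring such an agent could perform: counting packets per source within a sliding time window and flagging sources that exceed a rate threshold.

```python
from collections import defaultdict

class TrafficAgent:
    """Hypothetical per-node monitoring agent (illustrative design, not the
    paper's algorithm): counts packets per source in a sliding window and
    flags sources whose rate exceeds a threshold."""

    def __init__(self, window_s=1.0, max_pkts_per_window=100):
        self.window_s = window_s
        self.max_pkts = max_pkts_per_window
        self.counts = defaultdict(list)  # source -> recent packet timestamps

    def observe(self, source, timestamp):
        """Record one packet; return True if the source looks like a flooder."""
        ts = self.counts[source]
        ts.append(timestamp)
        # Drop timestamps that fell out of the sliding window.
        self.counts[source] = [t for t in ts if timestamp - t <= self.window_s]
        return len(self.counts[source]) > self.max_pkts

agent = TrafficAgent(window_s=1.0, max_pkts_per_window=5)
flags = [agent.observe("10.0.0.7", i * 0.1) for i in range(10)]
print(flags[-1])  # True: ten packets in one second exceeds the threshold
```

In a cooperative setting, each agent would additionally share flagged sources with its neighbours so that malicious traffic can be dropped before it converges on the victim.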
Procedia PDF Downloads 146
3562 An Investigation Enhancing E-Voting Application Performance
Authors: Aditya Verma
Abstract:
E-voting using blockchain provides a distributed system in which data are present on each node of the network and are reliable and secure due to the immutability property of the blockchain. This work compares various blockchain consensus algorithms previously used for e-voting applications, based on performance and node scalability, chooses the optimal one, and improves on one such previous implementation by proposing solutions for the loopholes of the optimally performing consensus algorithm in our chosen application, e-voting.
Keywords: blockchain, parallel BFT, consensus algorithms, performance
Procedia PDF Downloads 171
3561 Snapchat’s Scanning Feature
Authors: Reham Banwair, Lana Alshehri, Sara Hadrawi
Abstract:
The purpose of this project is to identify user satisfaction with the AI functions on Snapchat, in order to generate improvement proposals that allow their development within the app. To achieve this, a qualitative analysis was carried out through interviews with people who regularly use the application, revealing their satisfaction or dissatisfaction with the usefulness of the AI. In addition, the background of the company and its introduction of these algorithms were analyzed. Furthermore, the characteristics of the three main AI functions were explained: identifying songs, solving mathematical problems, and recognizing plants. As a result, it was found that 50% of respondents still do not know the characteristics of the AI, 50% believe song recognition is not always correct, 41.7% believe that math problem solving is usually accurate, and 91.7% believe the plant detection tool works properly.
Keywords: artificial intelligence, scanning, Snapchat, machine learning
Procedia PDF Downloads 139
3560 A Review on the Perception of Beşiktaş Public Square
Authors: Neslinur Hizli, Berrak Kirbaş Akyürek
Abstract:
Beşiktaş, one of the historical coastal districts of İstanbul, is on the verge of radical transformation because of the approaching ‘Beşiktaş Public Square Project’. At this juncture, due to its location, presence on the coast, population density, and distance to the other centers of the city, the decisions to be taken are critical for the whole of Istanbul, which will be majorly affected by this transformation. As the new project aims to pedestrianize the area by placing vehicular traffic underground, Beşiktaş and its square will change from top to bottom. Among these considerations, the perception of the existing conditions of Beşiktaş, with its advantages and disadvantages, plays a significant role. The motivation for this paper is the lack of clarity and consensus on how the Square is cognized. After a brief analysis of the historical transformation of the area, prominent studies on the criteria for public squares are reviewed. Through a cognitive mapping methodology, characteristics of the Square, and of public space in general, are discussed from individual viewpoints. This study aims to discuss and review Beşiktaş Public Square from the perspective, mind, and behavior of its users. A cognitive map study with thirty (30) subjects is evaluated and categorized according to the five elements that Kevin Lynch defined as the image of the city. The results obtained are digitized and represented with tables and graphs. The findings of the research underline crucial issues in the approaching change in Beşiktaş. Thus, this study may help to develop comprehensive ideas and new suggestions for the Square.
Keywords: Beşiktaş public square, cognitive map, perception, public space
Procedia PDF Downloads 271
3559 Impact Location From Instrumented Mouthguard Kinematic Data In Rugby
Authors: Jazim Sohail, Filipe Teixeira-Dias
Abstract:
Mild traumatic brain injury (mTBI) in non-helmeted contact sports is a growing concern due to the serious risk of potential injury. Extensive research into head kinematics in non-helmeted contact sports is being conducted using instrumented mouthguards, which allow researchers to record the accelerations and velocities of the head during and after an impact. They do not, however, allow the location of the impact on the head, or its magnitude and orientation, to be determined. This research proposes and validates two methods to quantify impact locations from instrumented mouthguard kinematic data: one using rigid body dynamics, the other machine learning. The rigid body dynamics technique establishes and matches moments from Euler’s and torque equations in order to find the impact location on the head. The methodology is validated with impact data collected from a lab test on a dummy head fitted with an instrumented mouthguard. Additionally, a Hybrid III dummy head finite element model was utilized to create synthetic kinematic datasets for impacts at varying locations, to validate the impact location algorithm. The algorithm calculates accurate impact locations; however, it requires preprocessing of live data, which is currently done by cross-referencing data timestamps with video footage. The machine learning technique aims to eliminate the preprocessing step by finding trends within the time-series signals from instrumented mouthguards to determine the impact location on the head. An unsupervised learning technique is used to cluster together impacts within similar regions across an entire time-series signal. The kinematic signals from the mouthguards are converted to the frequency domain before a clustering algorithm groups similar signals within a time series that may span the length of a game. Impacts are clustered into predetermined location bins.
The same Hybrid III dummy head finite element model is used to create impacts that closely replicate on-field impacts, in order to produce synthetic time-series datasets consisting of impacts at varying locations. These time-series datasets are used to validate the machine learning technique. The rigid body dynamics technique provides a good method for establishing accurate impact locations for signals that have already been labeled as true impacts and filtered out of the entire time series. The machine learning technique, by contrast, can be applied to long time-series signal data, but it provides impact locations only within predetermined regions on the head. Additionally, the machine learning technique can be used to eliminate false impacts captured by the sensors, saving additional time for data scientists working with instrumented mouthguard kinematic data, since validating true impacts against video footage would not be required.
Keywords: head impacts, impact location, instrumented mouthguard, machine learning, mTBI
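The frequency-domain clustering step described above can be sketched as follows. This is an illustrative reconstruction, not the authors’ pipeline: windowed signals are mapped to magnitude spectra and grouped by a small k-means-style algorithm, with synthetic noisy sinusoids standing in for mouthguard data (the hypothetical assumption being that impacts in different head regions excite different dominant frequencies).

```python
import numpy as np

rng = np.random.default_rng(0)

def spectrum(signal):
    """Magnitude spectrum of one kinematic window (the clustering feature)."""
    return np.abs(np.fft.rfft(signal))

def kmeans(X, k=2, iters=20):
    """Tiny k-means for the sketch; a library implementation would normally
    be used. Seeded with two far-apart windows for reproducibility."""
    centroids = X[[0, len(X) - 1]].astype(float).copy()
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - centroids[None]) ** 2).sum(-1), axis=1)
        centroids = np.array([X[labels == j].mean(0) if np.any(labels == j)
                              else centroids[j] for j in range(k)])
    return labels

# Synthetic stand-in for mouthguard windows: two impact "regions" excite
# different resonant frequencies (purely illustrative numbers).
t = np.linspace(0, 1, 256, endpoint=False)
front = [np.sin(2 * np.pi * 10 * t) + 0.1 * rng.standard_normal(256) for _ in range(5)]
side = [np.sin(2 * np.pi * 40 * t) + 0.1 * rng.standard_normal(256) for _ in range(5)]
X = np.array([spectrum(s) for s in front + side])
labels = kmeans(X)
print(labels)  # front-region and side-region windows fall into separate clusters
```

Real mouthguard data would of course need windowing, triaxial channels, and bins tied to anatomical regions rather than two synthetic classes.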
Procedia PDF Downloads 219
3558 Behavior of Cold Formed Steel in Trusses
Authors: Reinhard Hermawan Lasut, Henki Wibowo Ashadi
Abstract:
The use of materials in Indonesia’s construction sector requires engineers and practitioners to develop efficient construction technology; one of the materials used is cold-formed steel. Cold-formed steel is generally used in the construction of roof trusses for houses or factories. Failures of roof truss structures are caused by errors in the calculation analysis, such as in the cross-sectional dimensions or the frame configuration. In a roof truss structure, the vertical distance at the edge of the frame, relative to the span length, affects how the compressive load is carried. If the span is too long, local buckling will occur, which compromises the strength of the frame. The model analysis considers various roof truss shapes, span lengths, and angles, using the structural stiffness matrix method. Truss models with spans shortened by one-fifth and by one-sixth are analyzed, and the models are also reviewed with increasing angles. It can be concluded that shortening the span in the compression area reduces deflection, whereas increasing the angle does not give good results, because the higher the roof, the heavier the load it carries, so the force is not channeled properly. The shape of the truss must be calculated correctly so that the truss is able to withstand the working load without structural failure.
Keywords: cold-formed, trusses, deflection, stiffness matrix method
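The stiffness matrix method mentioned above can be illustrated on the smallest possible truss: two inclined bars meeting at a loaded apex, so that the global system reduces to the two degrees of freedom of the free node. The section and load values below are invented for the sketch and are not the paper’s models.

```python
import numpy as np

# Two-bar plane truss: members run from pinned supports at (0, 0) and
# (2, 0) m to a loaded apex at (1, 1) m. Only the apex is free.
E = 200e9        # Pa, steel (assumed)
A = 2.0e-4       # m^2, cold-formed section area (assumed)
P = 10e3         # N, downward load at the apex (assumed)

apex = np.array([1.0, 1.0])
supports = [np.array([0.0, 0.0]), np.array([2.0, 0.0])]

# Assemble the 2x2 global stiffness matrix for the apex DOFs: each bar
# contributes (EA/L) * [c, s]^T [c, s], with (c, s) its direction cosines.
K = np.zeros((2, 2))
for s in supports:
    d = apex - s
    L = np.linalg.norm(d)
    c, sn = d / L
    K += (E * A / L) * np.outer([c, sn], [c, sn])

F = np.array([0.0, -P])
u = np.linalg.solve(K, F)   # apex displacements [ux, uy] in metres
print(f"apex deflection: {u[1] * 1000:.3f} mm")  # prints -0.354 mm
```

By symmetry the horizontal displacement is zero, and the vertical deflection matches the closed-form result u_y = −P·√2/(EA) for this geometry, which is a convenient check on the assembly.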
Procedia PDF Downloads 170
3557 Solar Calculations of Modified Arch (Semi-Spherical) Type Greenhouse System for Bayburt City
Authors: Uğur Çakir, Erol Şahin, Kemal Çomakli, Ayşegül Çokgez Kuş
Abstract:
Solar energy is regarded as the origin of all energy sources on Earth, and it can be used, directly or indirectly, in many applications such as agriculture, heating, cooling, or electricity production. Greenhousing is foremost among the agricultural activities in which solar energy is used directly. Greenhouses offer easily controlled conditions suitable for plant growth, and they are built with a covering material that allows sunlight to enter the system. The covering material can be glass, fiberglass, plastic, or another transparent element. This study investigates the solar energy usability and benefit rates of a semi-spherical (modified arch) greenhouse system for different orientations and positions under the climatic conditions of Bayburt. Within the scope of this study, the best orientation and the best sizes of a semi-spherical greenhouse are sought in order to obtain the greatest benefit from the sun. To achieve this aim, a modeling study was made using MATLAB. Although this modeling study was run for specific greenhouse shapes, it can be used for differently shaped greenhouses or buildings. The basic parameters are the greenhouse azimuth angle, the ratio of the long edge to the short edge, and the seasonal solar energy gain of the greenhouse.
Keywords: greenhousing, solar energy, direct radiation, renewable energy
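The dependence of collected beam radiation on the first parameter listed, the greenhouse azimuth angle, can be sketched with the standard incidence-angle formula. This does not reproduce the paper’s MATLAB model; the sun position and irradiance values below are illustrative, and a curved cover would be treated as a set of such tilted facets.

```python
import math

def beam_on_surface(dni, sun_elev_deg, sun_az_deg, tilt_deg, surf_az_deg):
    """Direct (beam) irradiance on a tilted surface, W/m^2.

    Standard incidence-angle formula; azimuths measured from south.
    """
    el, sa = math.radians(sun_elev_deg), math.radians(sun_az_deg)
    tl, ga = math.radians(tilt_deg), math.radians(surf_az_deg)
    cos_inc = (math.cos(el) * math.sin(tl) * math.cos(sa - ga)
               + math.sin(el) * math.cos(tl))
    return dni * max(0.0, cos_inc)  # negative cosine means the sun is behind

# Sweep the surface azimuth for a noon sun at 35 degrees elevation
# (representative winter value, not Bayburt data) on a 60-degree facet.
best = max(range(-90, 91, 15),
           key=lambda az: beam_on_surface(800, 35, 0, 60, az))
print(best)  # → 0: a south-facing facet collects the most beam radiation
```

Summing such facet contributions over the day and the season gives the seasonal solar gain that the abstract uses to compare orientations.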
Procedia PDF Downloads 482
3556 Optimization of Structures with Mixed Integer Non-linear Programming (MINLP)
Authors: Stojan Kravanja, Andrej Ivanič, Tomaž Žula
Abstract:
This contribution focuses on structural optimization in civil engineering using mixed integer non-linear programming (MINLP). MINLP is a versatile method that can handle both continuous and discrete optimization variables simultaneously. Continuous variables are used to optimize parameters such as dimensions, stresses, masses, or costs, while discrete variables represent binary decisions that determine the presence or absence of structural elements within a structure, as well as the choice of discrete materials and standard sections. The optimization process is divided into three main steps. First, a mechanical superstructure is generated with a variety of topology, material, and dimensional alternatives. Next, a MINLP model is formulated to encapsulate the optimization problem. Finally, an optimal solution is sought in the direction of the defined objective function while respecting the structural constraints. The economic or mass objective function of the material and labor costs of a structure is subjected to the constraints known from structural analysis. These constraints include equations for the calculation of internal forces and deflections, as well as equations for the dimensioning of structural components (in accordance with the Eurocode standards). Given the complex, non-convex, and highly non-linear nature of optimization problems in civil engineering, the Modified Outer-Approximation/Equality-Relaxation (OA/ER) algorithm is applied. This algorithm alternately solves subproblems of non-linear programming (NLP) and main problems of mixed-integer linear programming (MILP), thereby gradually refining the solution space toward the optimal solution.
The NLP corresponds to the continuous optimization of parameters (with fixed topology, discrete materials, and standard dimensions, all determined in the previous MILP), while the MILP involves a global approximation to the superstructure of alternatives, in which a new topology, materials, and standard dimensions are determined. The optimization of a convex problem is stopped when the MILP solution becomes better than the best NLP solution; otherwise, it is terminated when the NLP solution can no longer be improved. While the OA/ER algorithm, like all other algorithms, does not guarantee global optimality in the presence of non-convex functions, various modifications, including convexity tests, are implemented in OA/ER to mitigate these difficulties. The effectiveness of the proposed MINLP approach is demonstrated by its application to various structural optimization tasks, such as the mass optimization of steel buildings, the cost optimization of timber halls, composite floor systems, etc. Special optimization models have been developed for these structures. The MINLP optimizations, facilitated by the user-friendly software package MIPSYN, provide insights into mass- or cost-optimal solutions, optimal structural topologies, and optimal material and standard cross-section choices, confirming MINLP as a valuable method for the optimization of structures in civil engineering.
Keywords: MINLP, mixed-integer non-linear programming, optimization, structures
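The interplay of discrete choices and continuous sizing that MINLP handles can be illustrated on a toy problem. This is not the OA/ER algorithm itself: the MILP master is replaced by plain enumeration to keep the sketch short, the continuous NLP is solved in closed form, and the sections, sizing rule, and loads are invented for illustration.

```python
import math

# Toy structural MINLP: pick a standard section and a span subdivision n
# (discrete) together with a plate thickness t (continuous) to minimize
# mass under a stress limit. All numbers are invented for the sketch.
SECTIONS = {"IPE200": 28.5, "IPE240": 34.3, "IPE300": 49.1}  # kg/m
SPAN, LOAD, SIGMA_ALLOW = 12.0, 5.0e3, 160e6  # m, N/m, Pa

def nlp_subproblem(kg_per_m, n):
    """Continuous sizing for fixed discrete choices (closed form here;
    a real model would call an NLP solver against Eurocode constraints)."""
    moment = LOAD * (SPAN / n) ** 2 / 8      # per-bay bending moment
    t = math.sqrt(moment / SIGMA_ALLOW) * 50  # fictitious sizing rule
    return kg_per_m * SPAN * n + 200 * t      # member mass + plating mass

# "Master problem" by enumeration: every (section, subdivision) pair.
best = min((nlp_subproblem(w, n), name, n)
           for name, w in SECTIONS.items() for n in (2, 3, 4))
print(best[1], best[2])  # → IPE200 2: cheapest discrete pair for this data
```

In the real OA/ER loop the enumeration is replaced by an MILP built from linearizations of the NLP solutions, so only a small subset of discrete alternatives is ever visited.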
Procedia PDF Downloads 51
3555 Design, Analysis and Obstacle Avoidance Control of an Electric Wheelchair with Sit-Sleep-Seat Elevation Functions
Authors: Waleed Ahmed, Huang Xiaohua, Wilayat Ali
Abstract:
Wheelchair users are generally exposed to physical and psychological health problems, e.g., pressure sores and pain in the hip joint, associated with seating posture or being inactive in a wheelchair for a long time. A reclining wheelchair with back, thigh, and leg adjustment helps in daily life activities and health preservation. The seat-elevating function of an electric wheelchair allows a user with lower limb amputation to reach different heights. An electric wheelchair is expected to ease the lives of elderly and disabled people by giving them mobility support and decreasing the percentage of accidents caused by users’ narrow sight or joystick operation errors. Thus, this paper presents the design, analysis, and obstacle avoidance control of an electric wheelchair with sit-sleep-seat elevation functions. A 3D model of the wheelchair was designed in SolidWorks and later used for multi-body dynamic (MBD) analysis and to verify the driving control system. The control system uses a fuzzy algorithm to avoid obstacles, using the distance information from the ultrasonic sensor and the user-specified direction from the joystick operation. The proposed fuzzy driving control system focuses on the direction and velocity of the wheelchair. The wheelchair model has been examined and proven in MSC Adams (Automated Dynamic Analysis of Mechanical Systems). The designed fuzzy control algorithm is implemented in the Gazebo 3D robotics simulator using Robot Operating System (ROS) middleware. The proposed wheelchair design enhances mobility and quality of life by improving the user’s functional capabilities. Simulation results verify the non-accidental behavior of the electric wheelchair.
Keywords: fuzzy logic control, joystick, multi-body dynamics, obstacle avoidance, scissor mechanism, sensor
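A minimal sketch of such a fuzzy speed rule, with invented membership functions and rules rather than the paper’s rule base: the ultrasonic distance is fuzzified into NEAR/MEDIUM/FAR, each rule scales the joystick speed, and a Sugeno-style weighted average defuzzifies the result.

```python
def tri(x, a, b, c):
    """Triangular membership function on [a, c] with peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzy_speed(distance_m, joystick_speed):
    # Fuzzify the ultrasonic distance (supports in metres, illustrative).
    near = tri(distance_m, -0.5, 0.0, 1.0)
    medium = tri(distance_m, 0.5, 1.5, 2.5)
    far = tri(distance_m, 2.0, 3.0, 5.0) if distance_m < 3.0 else 1.0
    # Rules: NEAR -> stop, MEDIUM -> half speed, FAR -> full joystick speed.
    weights = [near, medium, far]
    outputs = [0.0, 0.5 * joystick_speed, joystick_speed]
    # Sugeno-style weighted-average defuzzification.
    total = sum(weights)
    return sum(w * o for w, o in zip(weights, outputs)) / total if total else 0.0

print(fuzzy_speed(0.2, 1.0))  # → 0.0: stop next to an obstacle
print(fuzzy_speed(4.0, 1.0))  # → 1.0: full speed when the path is clear
```

A second block of the same form would handle the steering direction; the two outputs together give the velocity command sent to the wheel controllers.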
Procedia PDF Downloads 130
3554 A Benchmark System for Testing Medium Voltage Direct Current (MVDC-CB) Robustness Utilizing Real Time Digital Simulation and Hardware-In-Loop Theory
Authors: Ali Kadivar, Kaveh Niayesh
Abstract:
The integration of green energy resources is a major focus, and the role of Medium Voltage Direct Current (MVDC) systems is expanding rapidly. However, the protection of MVDC systems against DC faults is a challenge that has consequences for reliable and safe grid operation. This challenge reveals the need for MVDC circuit breakers (MVDC CBs), which are still in the infancy of their development. There is therefore a lack of MVDC CB standards, including thresholds for acceptable power losses and operation speed. To establish a baseline for comparison purposes, a benchmark system for testing future MVDC CBs is vital. The literature gives only the timing sequence of each switch, with the emphasis on topology and without in-depth study of the DC CB control algorithm, as circuit breaker control systems are not yet systematic. A digital testing benchmark is designed for proof-of-concept simulation studies using software models; it can validate studies based on real-time digital simulators and Transient Network Analyzer (TNA) models. The proposed experimental setup utilizes data acquisition from accurate sensors installed on the tested MVDC CB, through the general-purpose inputs/outputs (GPIO) of a microcontroller and a PC. Prototype studies on laboratory-based models are achieved using Hardware-in-the-Loop (HIL) equipment connected to real-time digital simulators. The improved control algorithm of the circuit breaker can reduce the peak fault current and avoid arc reignition, helping the coordination of DC CBs in relay protection. Moreover, several research gaps are identified regarding case studies and evaluation approaches.
Keywords: DC circuit breaker, hardware-in-the-loop, real time digital simulation, testing benchmark
Procedia PDF Downloads 84
3553 A Geosynchronous Orbit Synthetic Aperture Radar Simulator for Moving Ship Targets
Authors: Linjie Zhang, Baifen Ren, Xi Zhang, Genwang Liu
Abstract:
Ship detection is of great significance for both military and civilian applications. Synthetic aperture radar (SAR), with its all-day, all-weather, ultra-long-range characteristics, has been used widely. In view of the low time resolution of low-orbit SAR and the need for SAR data of high time resolution, geosynchronous orbit (GEO) SAR is attracting more and more attention. Since GEO SAR has a short revisit period and a large coverage area, it is expected to be well suited to monitoring marine ship targets. However, the height of the orbit increases the integration time by almost two orders of magnitude. For moving marine vessels, the utility and efficacy of GEO SAR are therefore still uncertain. This paper examines the feasibility of GEO SAR by presenting a GEO SAR simulator for moving ships. The presented simulator is a geometry-based radar imaging simulator, which focuses on geometrical quality rather than radiometric accuracy. Its inputs are a 3D ship model (.obj format, produced by most 3D design software, such as 3D Max), the ship’s velocity, and the parameters of the satellite orbit and SAR platform. Its outputs are simulated GEO SAR raw signal data and a SAR image. The simulation is accomplished in the following four steps. (1) Read the 3D model, including the ship rotation (pitch, yaw, and roll) and velocity (speed and direction) parameters, and extract the small primitives (triangles) that are visible from the SAR platform. (2) Compute the radar scattering from the ship with the physical optics (PO) method. In this step, the vessel is sliced into many small rectangular primitives along the azimuth, and the radiometric calculation of each primitive is carried out separately. Since this simulator focuses only on the complex structure of ships, only single-bounce and double-bounce reflections are considered. (3) Generate the raw data with GEO SAR signal modeling.
Since the normal ‘stop-and-go’ model is not valid for GEO SAR, the range model has to be reconsidered. (4) Finally, generate the GEO SAR image with an improved Range-Doppler method. Numerical simulations of a fishing boat and a cargo ship are given: GEO SAR images for different postures, velocities, satellite orbits, and SAR platforms are simulated. By analyzing these simulated results, the effectiveness of GEO SAR for the detection of moving marine vessels is evaluated.
Keywords: GEO SAR, radar, simulation, ship
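Abandoning the ‘stop-and-go’ assumption in step (3) amounts to evaluating the slant range at the true transmit time of every pulse, since both the satellite and the ship move appreciably over a long GEO aperture. The sketch below uses a heavily simplified geometry (equatorial circular arc, straight ship track, placeholder PRF and speeds), not the simulator’s orbit model.

```python
import math

GEO_RADIUS = 42_164e3         # m, geosynchronous orbit radius
EARTH_RADIUS = 6_371e3        # m
OMEGA = 2 * math.pi / 86_164  # rad/s, Earth-synchronous angular rate

def slant_range(t, ship_speed=8.0):
    """Satellite-to-ship distance at pulse time t (s), toy 2D geometry."""
    # Satellite on a circular arc in the equatorial plane.
    sx = GEO_RADIUS * math.cos(OMEGA * t)
    sy = GEO_RADIUS * math.sin(OMEGA * t)
    # Ship moving east along the surface (flat-track approximation).
    bx, by = EARTH_RADIUS, ship_speed * t
    return math.hypot(sx - bx, sy - by)

prf = 100.0  # Hz, placeholder pulse repetition frequency
ranges = [slant_range(n / prf) for n in range(1000)]
print(f"range migration over 10 s: {max(ranges) - min(ranges):.1f} m")
```

Even this crude model shows metre-level range migration over a short aperture, i.e., many range cells at typical radar resolutions, which is why the raw-data generator must track the per-pulse range history rather than freeze the geometry.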
Procedia PDF Downloads 181
3552 A Framework of Dynamic Rule Selection Method for Dynamic Flexible Job Shop Problem by Reinforcement Learning Method
Authors: Rui Wu
Abstract:
In the volatile modern manufacturing environment, new orders occur randomly at any time, and pre-emptive methods are infeasible. This calls for a real-time scheduling method that can produce a reasonably good schedule quickly. The dynamic Flexible Job Shop problem is an NP-hard scheduling problem that hybridizes the dynamic Job Shop problem with the Parallel Machine problem. A Flexible Job Shop contains different work centres, and each work centre contains parallel machines that can process certain operations. Many algorithms, such as genetic algorithms or simulated annealing, have been proposed to solve static Flexible Job Shop problems. However, the time efficiency of these methods is low, and they are not feasible for a dynamic scheduling problem. Therefore, a dynamic rule selection scheduling system based on reinforcement learning is proposed in this research, in which the dynamic Flexible Job Shop problem is divided into several parallel machine problems to decrease its complexity. Firstly, features of the jobs, machines, work centres, and flexible job shop are selected to describe the status of the dynamic Flexible Job Shop problem at each decision point in each work centre. Secondly, a reinforcement learning framework using a double-layer deep Q-learning network is applied to select proper composite dispatching rules based on the status of each work centre. Then, based on the selected composite dispatching rule, an available operation is selected from the waiting buffer and assigned to an available machine in each work centre. Finally, the proposed algorithm is compared with well-known dispatching rules on the objectives of mean tardiness, mean flow time, mean waiting time, and mean percentage of waiting time in the real-time Flexible Job Shop problem.
The simulation results show that the proposed framework has reasonable performance and time efficiency.
Keywords: dynamic scheduling problem, flexible job shop, dispatching rules, deep reinforcement learning
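The rule-selection loop can be sketched with a tabular Q-learner standing in for the paper’s double-layer deep Q-network. The coarse state bucket, the three rules, and the reward dynamics below are invented purely to exercise the epsilon-greedy selection and the Q-update; a real agent would use the job/machine/work-centre features as its state.

```python
import random

random.seed(1)

RULES = ["SPT", "EDD", "FIFO"]        # candidate composite dispatching rules
ALPHA, GAMMA, EPS = 0.1, 0.9, 0.1     # learning rate, discount, exploration
Q = {(s, a): 0.0 for s in ("short", "long") for a in RULES}

def choose(state):
    """Epsilon-greedy rule selection at a decision point."""
    if random.random() < EPS:
        return random.choice(RULES)
    return max(RULES, key=lambda a: Q[(state, a)])

def reward(state, action):
    """Toy environment: SPT pays off when the queue is long."""
    return 1.0 if (state == "long" and action == "SPT") else 0.1

for _ in range(2000):
    s = random.choice(("short", "long"))
    a = choose(s)
    r = reward(s, a)
    best_next = max(Q[(s, a2)] for a2 in RULES)   # stateless transitions here
    Q[(s, a)] += ALPHA * (r + GAMMA * best_next - Q[(s, a)])

print(max(RULES, key=lambda a: Q[("long", a)]))  # → SPT for long queues
```

The deep variant replaces the Q dictionary with a network over the continuous feature vector, but the decision-point loop (observe status, select rule, dispatch one operation, update) has the same shape.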
Procedia PDF Downloads 112
3551 Overview of Multi-Chip Alternatives for 2.5 and 3D Integrated Circuit Packagings
Authors: Ching-Feng Chen, Ching-Chih Tsai
Abstract:
With the size of the transistor gradually approaching the physical limit, the persistence of Moore’s Law is challenged, despite the development of high numerical aperture (high-NA) lithography equipment, by issues such as short-channel effects. In the context of the ever-increasing technical requirements of portable devices and high-performance computing, relying on the continuation of the law to increase chip density will no longer support the prospects of the electronics industry. Weighing a chip’s power consumption, performance, area, cost, and cycle time to market (PPACC) is the updated benchmark driving the evolution of advanced nanometer (nm) wafer nodes. The advent of two-and-a-half- and three-dimensional (2.5D and 3D) Very-Large-Scale Integration (VLSI) packaging based on Through-Silicon Via (TSV) technology has updated the traditional die assembly methods and provided a solution. This overview investigates up-to-date and cutting-edge packaging technologies for 2.5D and 3D integrated circuits (ICs) based on updated transistor structures and technology nodes. The author concludes that multi-chip solutions for 2.5D and 3D IC packaging are feasible for prolonging Moore’s Law.
Keywords: Moore’s law, high numerical aperture, power consumption-performance-area-cost-cycle time to market, 2.5 and 3D very-large-scale integration, packaging, through-silicon via
Procedia PDF Downloads 119
3550 Effects of the Slope Embankment Variation on Influence Areas That Causes the Differential Settlement around of Embankment
Authors: Safitri W. Nur, Prathisto Panuntun L. Unggul, M. Ivan Adi Perdana, R. Dary Wira Mahadika
Abstract:
In soft soil areas, a high embankment is needed as preloading to improve the bearing capacity of the soil. For sustainable development, the construction of an embankment must not disturb the area around it, so the influence area must be known before the contractor applies the embankment design. In several cases in Indonesia, the area around embankment construction consists of residential housing and other buildings. The influence area must therefore be identified to avoid differential settlement of the buildings around the embankment. Differential settlement causes buildings to crack, and each building has a limited tolerance for it: for concrete buildings, the tolerance is 0.002–0.003 m, and for steel buildings, it is 0.006–0.008 m. If the differential settlement stays within this range, cracking of the building can be avoided. In practice, however, the settlement around an embankment is often assumed to be zero, which is why so many problems occur when a high embankment is applied in a soft soil area. This research used the superposition method combined with PLAXIS analysis to determine the influence area around embankments at several locations with different soft soil characteristics. Undisturbed soil samples were taken at 55 soft soil locations in Indonesia. Based on this research, it was concluded that the gentler the slope, the greater the influence area, and vice versa. The largest influence area, for initial embankment heights of 2–6 m with slopes of 1:1, 1:2, 1:3, 1:4, 1:5, 1:6, 1:7, and 1:8, is 32 m from the edge of the embankment.
Keywords: differential settlement, embankment, influence area, slope, soft soil
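The notion of an influence distance can be sketched with the classical Boussinesq strip-load solution: the vertical stress increase caused by the embankment decays with horizontal distance from the toe, and the influence area can be read off as the distance at which that increase falls below a chosen fraction of the applied pressure. This is illustrative only; the paper’s PLAXIS-plus-superposition analysis, its 55-site data, and any slope-dependent loading are not reproduced, and the load, geometry, and 5% threshold below are assumed values.

```python
import math

def strip_stress(q, half_width, x, z):
    """Boussinesq vertical stress at (x, z) under a uniform strip load of
    width 2*half_width centred at x = 0, with surface pressure q (Pa)."""
    t1 = math.atan2(x + half_width, z)
    t2 = math.atan2(x - half_width, z)
    alpha = t1 - t2
    return q / math.pi * (alpha + math.sin(alpha) * math.cos(t1 + t2))

q = 18e3 * 4.0      # Pa: unit weight 18 kN/m^3 times a 4 m embankment
half_width = 10.0   # m, half of the embankment footprint (assumed)
z = 5.0             # m, depth of the compressible layer midpoint (assumed)

x = half_width      # start at the toe and walk outwards in 0.5 m steps
while strip_stress(q, half_width, x, z) > 0.05 * q:
    x += 0.5
print(f"influence extends about {x - half_width:.1f} m beyond the toe")
```

Superposing several such strips of decreasing width is one simple way to approximate a sloped embankment cross-section, which is where the slope dependence reported above enters.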
Procedia PDF Downloads 413
3549 Functional Analysis of Variants Implicated in Hearing Loss in a Cohort from Argentina: From Molecular Diagnosis to Pre-Clinical Research
Authors: Paula I. Buonfiglio, Carlos David Bruque, Lucia Salatino, Vanesa Lotersztein, Sebastián Menazzi, Paola Plazas, Ana Belén Elgoyhen, Viviana Dalamón
Abstract:
Hearing loss (HL) is the most prevalent sensorineural disorder, affecting about 10% of the global population, with more than half of cases due to genetic causes. About 1 in 500-1000 newborns presents congenital HL. Most of the patients are non-syndromic, with an autosomal recessive mode of inheritance. To date, more than 100 genes have been related to HL; therefore, whole-exome sequencing (WES) has become a cost-effective alternative approach for molecular diagnosis. Nevertheless, new challenges arise from the detection of novel variants, in particular missense changes, which can lead to a spectrum of genotype-phenotype correlations that is not always straightforward. In this work, we aimed to identify the genetic causes of HL in isolated and familial cases by designing a multistep approach to analyze target genes related to hearing impairment. Moreover, we performed in silico and in vivo analyses to further study the effect of some of the novel variants identified on hair cell function, using the zebrafish model. A total of 650 patients were studied by Sanger sequencing and gap-PCR in the GJB2 and GJB6 genes, respectively, diagnosing 15.5% of sporadic cases and 36% of familial ones. Overall, 50 different sequence variants were detected. Fifty of the undiagnosed patients with moderate HL were tested for deletions in the STRC gene by multiplex ligation-dependent probe amplification (MLPA), leading to diagnosis in 6%. After this initial screening, 50 families were selected to be analyzed by WES, achieving diagnosis in 44% of them. Half of the identified variants were novel. A missense variant in the MYO6 gene, detected in a family with postlingual HL, was selected for further analysis. Protein modeling with the AlphaFold2 software was performed, supporting its pathogenic effect. In order to functionally validate this novel variant, a knockdown phenotype rescue assay in zebrafish was carried out.
Injection of wild-type MYO6 mRNA into embryos rescued the phenotype, whereas the mutant MYO6 mRNA (carrying the c.2782C>A variant) had no effect. These results strongly suggest a deleterious effect of this variant on the mobility of stereocilia in zebrafish neuromasts, and hence on the auditory system. In the present work, we demonstrated that our algorithm is suitable for a sequential multigenic approach to HL in our cohort. These results highlight the importance of a combined strategy for identifying candidate variants, as well as of in silico and in vivo studies to analyze and prove their pathogenicity and achieve a better understanding of the mechanisms underlying the pathophysiology of hearing impairment.
Keywords: diagnosis, genetics, hearing loss, in silico analysis, in vivo analysis, WES, zebrafish
Procedia PDF Downloads 99
3548 Strategies for Success: Strategic Thinking’s Critical Role in Entrepreneurial
Authors: Silvia Rahmita
Abstract:
Entrepreneurial success is crucial for economic growth, competitiveness, and job creation, yet many entrepreneurs face failure due to various challenges. This paper explores the critical role of strategic thinking in mitigating entrepreneurial failure. Entrepreneurial competencies, encompassing knowledge, skills, and traits, are essential for creating and growing ventures. Despite these competencies, numerous entrepreneurs fail due to poor management, inadequate support, and ineffective policies. The paper categorizes entrepreneurial failures into financial, operational, market, product or service, strategic, leadership, legal, human capital, technological, and environmental failures. Each failure type can be addressed through strategic thinking, which involves foresight, balancing short-term and long-term goals, and hypothesis-driven processes. By integrating strategic thinking into their approach, entrepreneurs can enhance risk management, adapt to market changes, and sustain growth. This process involves setting clear goals, innovating products, and maintaining a competitive edge. Ultimately, strategic thinking provides a framework for proactive planning, adaptation, and continuous improvement, reducing the likelihood of failure and ensuring long-term success. Entrepreneurs who prioritize strategic thinking are better equipped to navigate the complexities of the business environment and achieve sustainable growth.
Keywords: entrepreneurial failure, strategic thinking, risk management, business failure
Procedia PDF Downloads 47
3547 Comparison of Monte Carlo Simulations and Experimental Results for the Measurement of Complex DNA Damage Induced by Ionizing Radiations of Different Quality
Authors: Ifigeneia V. Mavragani, Zacharenia Nikitaki, George Kalantzis, George Iliakis, Alexandros G. Georgakilas
Abstract:
Complex DNA damage, consisting of a combination of DNA lesions such as double strand breaks (DSBs) and non-DSB base lesions occurring in a small volume, is considered one of the most important biological endpoints of ionizing radiation (IR) exposure. Strong theoretical (Monte Carlo simulation) and experimental evidence suggests an increase in the complexity of DNA damage, and therefore in repair resistance, with increasing linear energy transfer (LET). Experimental detection of complex (clustered) DNA damage is often hampered by technical deficiencies limiting its measurement, especially in cellular or tissue systems. Our groups have recently made significant improvements towards identifying key parameters for the efficient detection of complex DSBs and non-DSBs in human cellular systems exposed to IR of varying quality (γ- and X-rays at 0.3-1 keV/μm, α-particles at 116 keV/μm, and 36Ar ions at 270 keV/μm). The induction and processing of DSB and non-DSB oxidative clusters were measured using adaptations of immunofluorescence (γH2AX or 53BP1 foci staining as DSB probes, and the human repair enzymes OGG1 and APE1 as probes for oxidized purines and abasic sites, respectively). In the current study, relative biological effectiveness (RBE) values for DSB and non-DSB induction have been measured in a normal human cell line (FEP18-11-T1) and cancerous cell lines (MCF7, HepG2, A549, MO59K/J). The experimental results are compared to simulation data obtained using a validated microdosimetric fast Monte Carlo DNA Damage Simulation code (MCDS). Moreover, this simulation approach is applied to two realistic clinical cases, i.e., prostate cancer treatment using X-rays generated by a linear accelerator and a pediatric osteosarcoma case using a 200.6 MeV proton pencil beam. RBE values for complex DNA damage induction are calculated for the tumor areas.
These results reveal a disparity between theory and experiment and underline the necessity of implementing highly precise and more efficient experimental and simulation approaches.
Keywords: complex DNA damage, DNA damage simulation, protons, radiotherapy
Procedia PDF Downloads 329
3546 Encoded Nanospheres for the Fast Ratiometric Detection of Cystic Fibrosis
Authors: Iván Castelló, Georgiana Stoica, Emilio Palomares, Fernando Bravo
Abstract:
We present herein two-colour-encoded silica nanospheres (2nanoSi) for the quantitative ratiometric fluorescence determination of trypsin in humans. The system proved to be a faster (minutes) method, with twice the sensitivity of state-of-the-art biomarker-based sensors for cystic fibrosis (CF), allowing the quantification of trypsin concentrations over a wide range (0-350 mg/L). Furthermore, as trypsin is directly related to the development of cystic fibrosis, different human genotypes, i.e., healthy homozygotic (> 80 mg/L), CF homozygotic (< 50 mg/L), and heterozygotic (> 50 mg/L), can be determined using our 2nanoSi nanospheres.
Keywords: cystic fibrosis, trypsin, quantum dots, biomarker, homozygote, heterozygote
Procedia PDF Downloads 489
3545 Simulation of Stress in Graphite Anode of Lithium-Ion Battery: Intra and Inter-Particle
Authors: Wenxin Mei, Jinhua Sun, Qingsong Wang
Abstract:
The volume expansion of lithium-ion batteries is mainly induced by intercalation-induced stress within the negative electrode, resulting in capacity degradation and even battery failure. Stress generation due to lithium intercalation into graphite particles is investigated in this work based on an electrochemical-mechanical model. The two-dimensional model presented is fully coupled, including the effects of both intercalation-induced stress and stress-induced intercalation, to evaluate the lithium concentration, stress generation, and displacement within and between particles. The results show that the distributions of lithium concentration and stress exhibit analogous patterns, reflecting the relation between lithium diffusion and stress. The inter-particle results indicate that the largest von Mises stress occurs where the two particles are in contact with each other, and deformation at the edge of the particles is also observed, predicting fracture. Additionally, the maximum inter-particle stress at the end of lithium intercalation is nearly ten times the intra-particle stress, and the maximum inter-particle displacement is 24% larger than in the single-particle case. Finally, the effect of graphite particle arrangement on inter-particle stress is studied; it is found that a tighter arrangement exhibits lower stress. This work can provide guidance for predicting intra- and inter-particle stress so that measures can be taken to avoid cracking of the electrode material.
Keywords: electrochemical-mechanical model, graphite particle, lithium concentration, lithium ion battery, stress
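For context, the "fully coupled" diffusion-stress formulation referred to in this abstract commonly takes the following form in the battery-mechanics literature (a generic sketch, not necessarily the exact equations of this paper): the lithium flux gains a hydrostatic-stress drift term,

```latex
J = -D\left(\nabla c \;-\; \frac{\Omega c}{R T}\,\nabla \sigma_h\right)
```

where $c$ is the lithium concentration, $D$ the diffusivity, $\Omega$ the partial molar volume of lithium in graphite, $R$ the gas constant, $T$ the temperature, and $\sigma_h$ the hydrostatic stress. The second term is what makes stress feed back into intercalation ("stress-induced intercalation"), while the concentration field in turn produces a misfit strain $\epsilon_c = \Omega c / 3$, analogous to thermal expansion, that drives the stress ("intercalation-induced stress").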
Procedia PDF Downloads 203
3544 High Responsivity of Zirconium boride/Chromium Alloy Heterostructure for Deep and Near UV Photodetector
Authors: Sanjida Akter, Ambali Alade Odebowale, Andrey E. Miroshnichenko, Haroldo T. Hattori
Abstract:
Photodetectors (PDs) play a pivotal role in optoelectronics and optical devices, serving as fundamental components that convert light signals into electrical signals. As the field progresses, the integration of advanced materials with unique optical properties has become a focal point, paving the way for novel PDs. This study explores a cutting-edge photodetector designed for deep and near ultraviolet (UV) applications. The photodetector is constructed from a composite Zirconium Boride (ZrB2) and Chromium (Cr) alloy, deposited onto a 6H nitrogen-doped silicon carbide substrate. The optimal alloy thickness is determined through finite-difference time-domain (FDTD) simulation, and the alloy is synthesized by radio frequency (RF) sputtering. Remarkably, the resulting photodetector exhibits an exceptional responsivity of 3.5 A/W under an applied voltage of -2 V, at wavelengths of 405 nm and 280 nm. This heterostructure not only exemplifies high performance but also provides a versatile platform for the development of near-UV photodetectors capable of operating effectively in challenging conditions, such as high-power and high-temperature environments. This study contributes to the expanding landscape of photodetector technology, offering a promising avenue for the advancement of optoelectronic devices in demanding applications.
Keywords: responsivity, silicon carbide, ultraviolet photodetector, zirconium boride
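As a quick plausibility check on the figures quoted above (an illustrative calculation, not part of the paper), responsivity R = I_ph / P_opt can be converted to external quantum efficiency via EQE = R·hc/(qλ). Plugging in the reported 3.5 A/W at 405 nm gives an EQE well above unity, which points to internal gain in the device:

```python
# Physical constants (CODATA exact values)
H = 6.62607015e-34   # Planck constant, J*s
C = 2.99792458e8     # speed of light, m/s
Q = 1.602176634e-19  # elementary charge, C

def responsivity(photocurrent_a, optical_power_w):
    """Responsivity R = I_ph / P_opt, in A/W."""
    return photocurrent_a / optical_power_w

def external_quantum_efficiency(r_a_per_w, wavelength_m):
    """EQE = R * h*c / (q * lambda); values above 1 imply internal gain."""
    return r_a_per_w * H * C / (Q * wavelength_m)

# Reported figures from the abstract: 3.5 A/W at 405 nm
eqe = external_quantum_efficiency(3.5, 405e-9)
```

At 280 nm the same responsivity corresponds to an even higher EQE, since for fixed R the EQE scales as 1/λ.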
Procedia PDF Downloads 70
3543 Improving Security by Using Secure Servers Communicating via Internet with Standalone Secure Software
Authors: Carlos Gonzalez
Abstract:
This paper describes the use of the Internet to enhance the security of software that is to be distributed or sold to users potentially all over the world. By placing some of the features of the secure software on a secure server, we increase the security of that software. Communication between the protected software and the secure server is carried out by a double-lock algorithm. This paper also includes an analysis of intruders and describes possible responses to detected threats.
Keywords: internet, secure software, threats, cryptography process
Procedia PDF Downloads 337
3542 Best-Performing Color Space for Land-Sea Segmentation Using Wavelet Transform Color-Texture Features and Fusion of over Segmentation
Authors: Seynabou Toure, Oumar Diop, Kidiyo Kpalma, Amadou S. Maiga
Abstract:
Color and texture are the two most determinant elements in the perception and recognition of objects in an image. For this reason, color and texture analysis finds a large field of application, for example in image classification and segmentation. However, the pioneering work in texture analysis was conducted on grayscale images, thus discarding color information. Many grey-level texture descriptors have been proposed and successfully used in numerous domains for image classification: face recognition, industrial inspection, food science, and medical imaging, among others. Taking color into account in the definition of these descriptors makes it possible to better characterize images. Color texture is thus the subject of recent work, and the analysis of color texture images is increasingly attracting interest in the scientific community. In optical remote sensing systems, sensors measure separately different parts of the electromagnetic spectrum: the visible parts and even those invisible to the human eye. The amounts of light reflected by the earth in these spectral bands are then transformed into grayscale images. The primary natural colors red (R), green (G), and blue (B) are then used in mixtures of different spectral bands in order to produce RGB images. Thus, good color texture discrimination can be achieved using RGB under controlled illumination conditions. Some previous works have investigated the effect of using different color spaces for color texture classification. However, the selection of the best-performing color space for land-sea segmentation remains an open question. Its resolution may bring considerable improvements in applications such as coastline detection, where the detection result depends strongly on the performance of the land-sea segmentation. The aim of this paper is to present the results of a study conducted on different color spaces in order to identify the best-performing color space for land-sea segmentation.
In this sense, an experimental analysis is carried out using five different color spaces (RGB, XYZ, Lab, HSV, YCbCr). For each color space, a Haar wavelet decomposition is used to extract different color texture features. These color texture features are then used for classification based on Fusion of Over-Segmentation (FOOS), which allows the land part to be segmented from the sea. Analysis of the results of this study shows that the HSV color space gives the best classification performance when using color and texture features, which is perfectly coherent with the results presented in the literature.
Keywords: classification, coastline, color, sea-land segmentation
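A rough sketch of the feature-extraction step described above (illustrative only; the paper's exact pipeline, subband labeling, and FOOS classifier are not reproduced): one level of the 2-D Haar transform applied per channel after an RGB-to-HSV conversion, with detail-subband energies serving as texture features.

```python
import colorsys

def haar_level1(patch):
    """One level of the 2-D Haar transform of a square patch (even side length).
    Low-pass halves hold pairwise averages, high-pass halves pairwise differences."""
    n = len(patch)
    # Transform rows: averages in the left half, differences in the right half
    rows = [[(r[2*j] + r[2*j+1]) / 2 for j in range(n // 2)] +
            [(r[2*j] - r[2*j+1]) / 2 for j in range(n // 2)] for r in patch]
    # Transform columns the same way (via transposition)
    cols = list(zip(*rows))
    out = [[(c[2*i] + c[2*i+1]) / 2 for i in range(n // 2)] +
           [(c[2*i] - c[2*i+1]) / 2 for i in range(n // 2)] for c in cols]
    return [list(r) for r in zip(*out)]  # back to row-major order

def subband_energies(patch):
    """Mean energy of the three detail subbands: simple texture features
    (labels follow one common convention)."""
    t = haar_level1(patch)
    h = len(t) // 2
    def energy(r0, r1, c0, c1):
        vals = [t[i][j] ** 2 for i in range(r0, r1) for j in range(c0, c1)]
        return sum(vals) / len(vals)
    return {"LH": energy(0, h, h, 2 * h),      # variation along rows
            "HL": energy(h, 2 * h, 0, h),      # variation along columns
            "HH": energy(h, 2 * h, h, 2 * h)}  # diagonal variation

# RGB pixel converted to HSV before feature extraction (stdlib colorsys)
h, s, v = colorsys.rgb_to_hsv(0.2, 0.6, 0.4)
```

In the study, such per-channel features would be computed in each of the five color spaces and fed to the FOOS-based classifier for comparison.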
Procedia PDF Downloads 253
3541 Identification of Vehicle Dynamic Parameters by Using Optimized Exciting Trajectory on 3- DOF Parallel Manipulator
Authors: Di Yao, Gunther Prokop, Kay Buttner
Abstract:
Dynamic parameters, including the center of gravity, mass, and moments of inertia of a vehicle, play an essential role in vehicle simulation, collision testing, and real-time control of vehicle active systems. To identify these important vehicle dynamic parameters, a systematic parameter identification procedure is studied in this work. In the first step of the procedure, a conceptual parallel manipulator (virtual test rig) possessing three rotational degrees of freedom is proposed. To establish the kinematic characteristics of the conceptual parallel manipulator, a kinematic analysis consisting of inverse kinematics and singularity architecture is carried out. Based on Euler's rotation equations for rigid-body dynamics, the dynamic model of the parallel manipulator and the derivation of the measurement matrix for parameter identification are presented subsequently. In order to reduce the sensitivity of the parameter identification to measurement noise and other unexpected disturbances, an optimization process searching for the optimal exciting trajectory of the parallel manipulator is conducted in the following section. For this purpose, 321-Euler angles defined by a parameterized finite Fourier series are used to describe the general exciting trajectory of the parallel manipulator. To minimize the condition number of the measurement matrix and thereby achieve better parameter identification accuracy, the unknown coefficients of the parameterized finite Fourier series are estimated by an iterative algorithm implemented in MATLAB®. Meanwhile, the iterative algorithm ensures that the parallel manipulator remains in an achievable working state during the execution of the optimal exciting trajectory.
It is shown that the proposed procedure and methods can effectively identify the vehicle dynamic parameters and could be an important application of parallel manipulators in the fields of parameter identification and test rig development.
Keywords: parameter identification, parallel manipulator, singularity architecture, dynamic modelling, exciting trajectory
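The finite-Fourier-series parameterization of the exciting trajectory can be sketched as below; this is a common form from the excitation-trajectory literature, and the coefficient values and fundamental frequency wf are placeholders (the paper estimates the coefficients iteratively in MATLAB by minimizing the condition number of the measurement matrix, which is not reproduced here).

```python
import math

def fourier_joint_trajectory(t, q0, a, b, wf):
    """One joint angle of an exciting trajectory as a finite Fourier series:
        q(t) = q0 + sum_k  a_k/(wf*k) * sin(wf*k*t)  -  b_k/(wf*k) * cos(wf*k*t)
    a and b are lists of harmonic coefficients; wf is the fundamental
    frequency in rad/s. The trajectory is periodic with period 2*pi/wf."""
    q = q0
    for k, (ak, bk) in enumerate(zip(a, b), start=1):
        q += ak / (wf * k) * math.sin(wf * k * t)
        q -= bk / (wf * k) * math.cos(wf * k * t)
    return q

# Example: 3 harmonics, fundamental frequency 0.1 Hz (illustrative values)
wf = 2 * math.pi * 0.1
q = fourier_joint_trajectory(0.5, 0.0, [0.1, 0.05, 0.02], [0.03, 0.01, 0.0], wf)
```

The periodicity (period 2π/wf) is what makes such trajectories convenient in practice: measurements can be repeated and averaged over whole periods to suppress noise.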
Procedia PDF Downloads 270
3540 Hybrid MIMO-OFDM Detection Scheme for High Performance
Authors: Young-Min Ko, Dong-Hyun Ha, Chang-Bin Ha, Hyoung-Kyu Song
Abstract:
In recent years, multi-antenna systems have been actively used to improve communication performance. A MIMO-OFDM system can provide multiplexing gain or diversity gain, and these gains grow in proportion to the number of antennas. In order to provide the optimal gain of the MIMO-OFDM system, various transmission and reception schemes have been presented. This paper proposes a hybrid scheme in which the base station provides both diversity gain and multiplexing gain at the same time.
Keywords: DFE, diversity gain, hybrid, MIMO, multiplexing gain
Procedia PDF Downloads 693
3539 Automatic Identification and Monitoring of Wildlife via Computer Vision and IoT
Authors: Bilal Arshad, Johan Barthelemy, Elliott Pilton, Pascal Perez
Abstract:
Getting reliable, informative, and up-to-date information about the location, mobility, and behavioural patterns of animals will enhance our ability to research and preserve biodiversity. The fusion of infrared sensors and camera traps offers an inexpensive way to collect wildlife data in the form of images. However, extracting useful data from these images, such as the identification and counting of animals, remains a manual, time-consuming, and costly process. In this paper, we demonstrate that such information can be retrieved automatically using state-of-the-art deep learning methods. Another major challenge that ecologists face is counting a single animal multiple times because it reappears in other images taken by the same or other camera traps. Nonetheless, such information can be extremely useful for tracking wildlife and understanding its behaviour. To tackle the multiple-count problem, we have designed a meshed network of camera traps that share the captured images along with timestamps, cumulative counts, and the dimensions of the animal. The proposed method leverages edge computing to support real-time tracking and monitoring of wildlife. The method has been validated in the field and can easily be extended to other applications focusing on wildlife monitoring and management, where the traditional way of monitoring is expensive and time-consuming.
Keywords: computer vision, ecology, internet of things, invasive species management, wildlife management
Procedia PDF Downloads 144
3538 A Hybrid-Evolutionary Optimizer for Modeling the Process of Obtaining Bricks
Authors: Marius Gavrilescu, Sabina-Adriana Floria, Florin Leon, Silvia Curteanu, Costel Anton
Abstract:
Natural sciences provide a wide range of experimental data whose related problems require study and modeling beyond the capabilities of conventional methodologies. Such problems have solution spaces whose complexity and high dimensionality require correspondingly complex regression methods for proper characterization. In this context, we propose an optimization method consisting of a hybrid dual-optimizer setup: a global optimizer based on a modified variant of the popular Imperialist Competitive Algorithm (ICA), and a local optimizer based on gradient descent. The ICA is modified so that intermediate solution populations are pruned of low-fitness individuals more quickly and efficiently, by appropriately altering the assimilation, revolution, and competition phases; combined with an initialization strategy based on low-discrepancy sampling, this allows a more effective exploration of the corresponding solution space. Subsequently, gradient-based optimization is used locally to seek the optimal solution in the neighborhoods of the solutions found through the modified ICA. We use this combined approach to find the optimal configuration and weights of a fully connected neural network, resulting in regression models that characterize the process of obtaining bricks from silicon-based materials. Installations in the raw ceramics (brick) industry are characterized by significant energy consumption and large quantities of emissions. Thus, the purpose of our approach is to determine by simulation the working conditions, including the manufacturing mix recipe with the addition of different materials, that minimize the emissions of CO and CH4.
Our approach determines regression models that perform significantly better on the aforementioned problem than those found using the traditional ICA, with better convergence and a substantially lower error.
Keywords: optimization, biologically inspired algorithm, regression models, bricks, emissions
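The global-then-local structure described above can be illustrated in miniature. This is a sketch under strong simplifications: a stratified 1-D scan stands in for the modified ICA and its low-discrepancy initialization, and plain gradient descent stands in for the local phase; all names and parameters are hypothetical, not from the paper.

```python
def hybrid_optimize(f, grad, lo, hi, n_global=200, n_local=500, lr=0.01):
    """Two-phase minimization of f on [lo, hi]:
    1) global phase: evaluate f on a stratified grid (crude stand-in for a
       population-based global optimizer seeded by low-discrepancy sampling);
    2) local phase: refine the best global candidate by gradient descent."""
    # Global phase: midpoints of n_global equal strata cover the interval evenly
    xs = [lo + (hi - lo) * (i + 0.5) / n_global for i in range(n_global)]
    x = min(xs, key=f)
    # Local phase: gradient descent from the best global candidate
    for _ in range(n_local):
        x -= lr * grad(x)
    return x

# Toy objective: f(x) = (x - 2)^2, minimized at x = 2
x_opt = hybrid_optimize(lambda x: (x - 2.0) ** 2,
                        lambda x: 2.0 * (x - 2.0),
                        0.0, 5.0)
```

The division of labor mirrors the paper's setup: the global phase only has to land in the right basin of attraction, after which local gradient descent converges quickly to the optimum.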
Procedia PDF Downloads 86
3537 Enhancing Project Management Performance in Prefabricated Building Construction under Uncertainty: A Comprehensive Approach
Authors: Niyongabo Elyse
Abstract:
Prefabricated building construction is a pioneering approach that combines design, production, and assembly to attain energy efficiency, environmental sustainability, and economic feasibility. Despite continuous development of the industry in China, the low technical maturity of standardized design, factory production, and construction assembly introduces uncertainties affecting prefabricated component production and on-site assembly processes. This research focuses on enhancing project management performance under uncertainty to help enterprises navigate these challenges and optimize project resources. The study introduces a perspective on how uncertain factors influence the implementation of prefabricated building construction projects. It proposes a theoretical model considering project process management ability, adaptability to uncertain environments, and the collaboration ability of project participants. The impact of uncertain factors is demonstrated through case studies and quantitative analysis, revealing constraints on implementation time, cost, quality, and safety. To address uncertainties in prefabricated component production scheduling, a fuzzy model is presented, expressing processing times as interval values. The model utilizes a cooperative co-evolution algorithm (CCEA) to optimize scheduling, demonstrated through a real case study showcasing reduced project duration and minimized effects of processing-time disturbances. Additionally, the research addresses on-site assembly construction scheduling, considering the relationship between task processing times and assigned resources. A multi-objective model with fuzzy activity durations is proposed, employing a hybrid cooperative co-evolution algorithm (HCCEA) to optimize project scheduling. Results from real case studies indicate improved project performance in terms of duration, cost, and resilience to processing-time delays and resource changes.
The study also introduces a multistage dynamic process control model that utilizes IoT technology for real-time monitoring during component production and construction assembly. This approach dynamically adjusts schedules when constraints arise, leading to enhanced project management performance, as demonstrated in a real prefabricated housing project. Key contributions include a fuzzy prefabricated-component production scheduling model; a multi-objective, multi-mode, resource-constrained construction project scheduling model with fuzzy activity durations; a multi-stage dynamic process control model; and a cooperative co-evolution algorithm. The integrated mathematical model addresses the complexity of prefabricated building construction project management, providing a theoretical foundation for practical decision-making in the field.
Keywords: prefabricated construction, project management performance, uncertainty, fuzzy scheduling
Procedia PDF Downloads 55