Search results for: parameter selection
1072 Chronology and Developments in Inventory Control Best Practices for FMCG Sector
Authors: Roopa Singh, Anurag Singh, Ajay
Abstract:
Agriculture contributes a major share to the national economy of India. A major portion of the Indian economy (about 70%) depends upon agriculture, as it forms the main source of income. About 43% of India's geographical area is used for agricultural activity, which involves 65-75% of the total population of India. The given work deals with the Fast Moving Consumer Goods (FMCG) industries and their inventories, which use agricultural produce as their raw material or input for their final product. Since the beginning of inventory practices, many developments have taken place, which can be categorised into three phases based on the review of various works. The first phase is related to the development and utilization of the Economic Order Quantity (EOQ) model and methods for optimizing costs and profits. The second phase deals with inventory optimization methods, with the purpose of balancing capital investment constraints and service level goals. The third and most recent phase has merged inventory control with electrical control theory. Maintenance of inventory is considered negative, as a large amount of capital is blocked, especially in mechanical and electrical industries. But the case is different for food processing and agro-based industries and their inventories, due to the cyclic variation in the cost of their raw materials, which is the reason these industries were selected for the mentioned work. The application of electrical control theory in inventory control makes decision-making highly instantaneous for FMCG industries without loss in their proposed profits, a loss which occurred earlier during the first and second phases, mainly due to late implementation of decisions. The work also replaces various inventory and work-in-progress (WIP) related errors with their monetary values, so that decision-making is fully target-oriented.
Keywords: control theory, inventory control, manufacturing sector, EOQ, feedback, FMCG sector
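Since the first phase centres on the Economic Order Quantity model, a minimal sketch of the classical EOQ computation is included below; the demand, ordering-cost, and holding-cost figures are illustrative assumptions, not values from the study.

```python
import math

def eoq(annual_demand: float, order_cost: float, holding_cost: float) -> float:
    """Classical Economic Order Quantity: sqrt(2DS/H)."""
    return math.sqrt(2 * annual_demand * order_cost / holding_cost)

# Illustrative figures only (not from the study): 12,000 units/year demand,
# 450 currency units per order, 18 currency units holding cost per unit per year.
q_star = eoq(12000, 450, 18)
print(f"Optimal order quantity: {q_star:.0f} units")
```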
Procedia PDF Downloads 354
1071 Consistent Testing for an Implication of Supermodular Dominance with an Application to Verifying the Effect of Geographic Knowledge Spillover
Authors: Chung Danbi, Linton Oliver, Whang Yoon-Jae
Abstract:
Supermodularity, or complementarity, is a popular concept in economics which can characterize many objective functions such as utility, social welfare, and production functions. Further, supermodular dominance captures a preference for greater interdependence among the inputs of those functions, and it can be applied to examine which input set would produce higher expected utility, social welfare, or production. Therefore, we propose and justify a consistent test for a useful implication of supermodular dominance. We also conduct Monte Carlo simulations to explore the finite-sample performance of our test, with critical values obtained from the recentered bootstrap method, with and without selective recentering, and from the subsampling method. Under various parameter settings, we confirm that our test has reasonably good size and power performance. Finally, we apply our test to compare geographic and distant knowledge spillovers in terms of their effects on social welfare, using the National Bureau of Economic Research (NBER) patent data. We expect localized citing to supermodularly dominate distant citing if the geographic knowledge spillover engenders greater social welfare than the distant knowledge spillover. Taking subgroups based on firm and patent characteristics, we found industry-wise and patent-subclass-wise differences in the pattern of supermodular dominance between localized and distant citing. We also compare the results from analyzing different time periods to see whether the development of Internet and communication technology has changed the pattern of the dominance. In addition, to appropriately deal with the sparse nature of the data, we apply high-dimensional methods to efficiently select relevant data.
Keywords: supermodularity, supermodular dominance, stochastic dominance, Monte Carlo simulation, bootstrap, subsampling
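As background for the dominance concept discussed above, the textbook definition of a supermodular function can be stated as follows (standard lattice notation, not the authors' test statistic):

```latex
% f is supermodular if, for all x and y,
f(x \vee y) + f(x \wedge y) \;\ge\; f(x) + f(y),
% where x \vee y and x \wedge y denote the componentwise maximum and minimum of x and y.
```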
Procedia PDF Downloads 130
1070 Non-Linear Static Analysis of Screwed Moment Connections in Cold-Formed Steel Frames
Authors: Jikhil Joseph, Satish Kumar S R.
Abstract:
Cold-formed steel frames are preferable for framed construction due to their low seismic weight, which results in low seismic forces; on the contrary, however, significant lateral deflections are expected under seismic/wind loading. The various factors affecting the lateral stiffness of steel frames are the stiffness of the connections, beams, and columns. So, increasing the stiffness of the beams and columns and making the connections rigid will enhance the lateral stiffness. The present study focuses on structural elements made of rectangular hollow sections and fastened with screwed in-plane moment connections for building frames. The self-drilling screws can be easily drilled on either side of the connection area with the help of gusset plates. The strength of screwed connections can be made 1.2 times that of the connecting elements. However, achieving high stiffness in connections is also a challenging job. Hence, in addition to the beam and column stiffness, the connection stiffness is also going to be a governing parameter in the lateral deflections of the frames. SAP 2000 non-linear static analysis has been planned to study the seismic behavior of the steel frames. The SAP model will consist of a nonlinear spring model for the connection, to account for the semi-rigid connections, and nonlinear hinges will be assigned to beam and column sections according to FEMA 273 guidelines. Reliable spring and hinge parameters will be assigned based on an experimental and analytical database. The non-linear static analysis is mainly focused on the identification of various hinge formations and the estimation of lateral deflection, and these will contribute as inputs for direct displacement-based seismic design. The research outputs from this study are the modelling techniques and suitable design guidelines for the performance-based seismic design of cold-formed steel frames.
Keywords: buckling, cold formed steel, nonlinear static analysis, screwed connections
Procedia PDF Downloads 179
1069 Simulation Study on Effects of Surfactant Properties on Surfactant Enhanced Oil Recovery from Fractured Reservoirs
Authors: Xiaoqian Cheng, Jon Kleppe, Ole Torsaeter
Abstract:
One objective of this work is to analyze the effects of surfactant properties (viscosity, concentration, and adsorption) on surfactant enhanced oil recovery at laboratory scale. The other objective is to obtain the functional relationships between surfactant properties and the ultimate oil recovery and oil recovery rate. A core is cut into two parts from the middle to imitate the matrix with a horizontal fracture. An injector and a producer are placed at the left and right sides of the fracture, respectively. The middle slice of the core is used as the model in this paper, whose size is 4 cm x 0.1 cm x 4.1 cm, and the width of the fracture in the middle is 0.1 cm. The original properties of the matrix, brine, and oil in the base case are from the Ekofisk Field. The properties of the surfactant are from the literature. Eclipse is used as the simulator. The results are as follows: 1) The viscosity of the surfactant solution has a positive linear relationship with the surfactant oil recovery time, and the relationship between viscosity and oil production rate is an inverse function. The viscosity of the surfactant solution has no obvious effect on ultimate oil recovery. Since most surfactants have no large effect on the viscosity of brine, the viscosity of the surfactant solution is not a key parameter of surfactant screening for surfactant flooding in fractured reservoirs. 2) An increase in surfactant concentration results in a decrease of the oil recovery rate and an increase of the ultimate oil recovery. However, no simple functions could describe these relationships. An economic study should be conducted because of the prices of surfactant and oil. 3) In the study of surfactant adsorption, it is assumed that the matrix wettability is changed to water-wet when the surfactant adsorption reaches its maximum in all cases, and the ratio of surfactant adsorption to surfactant concentration (Cads/Csurf) is used to estimate the functional relationship. The results show that the relationship between ultimate oil recovery and Cads/Csurf is a logarithmic function, while the oil production rate has a positive linear relationship with exp(Cads/Csurf). The work here could be used as a reference for surfactant screening in surfactant enhanced oil recovery from fractured reservoirs, and the functional relationships between surfactant properties and the oil recovery rate and ultimate oil recovery help to improve upscaling methods.
Keywords: fractured reservoirs, surfactant adsorption, surfactant concentration, surfactant EOR, surfactant viscosity
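The functional forms reported in the abstract can be summarized compactly; the coefficients a, b, c, and d below are unspecified fitting constants (assumptions for illustration), to be determined from the simulation data:

```latex
R_{\mathrm{ult}} = a\,\ln\!\left(\frac{C_{\mathrm{ads}}}{C_{\mathrm{surf}}}\right) + b,
\qquad
q_{o} = c\,\exp\!\left(\frac{C_{\mathrm{ads}}}{C_{\mathrm{surf}}}\right) + d,
% where R_ult is the ultimate oil recovery and q_o the oil production rate.
```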
Procedia PDF Downloads 174
1068 An Automated Stock Investment System Using Machine Learning Techniques: An Application in Australia
Authors: Carol Anne Hargreaves
Abstract:
A key issue in stock investment is how to select representative features for stock selection. The objective of this paper is firstly to determine whether an automated stock investment system, using machine learning techniques, may be used to identify a portfolio of growth stocks that are highly likely to provide returns better than the stock market index. The second objective is to identify the technical features that best characterize whether a stock's price is likely to go up, and to identify the most important factors and their contribution to predicting the likelihood of the stock price going up. Unsupervised machine learning techniques, such as cluster analysis, were applied to the stock data to identify a cluster of stocks that was likely to go up in price – portfolio 1. Next, the principal component analysis technique was used to select stocks that were rated high on component one and component two – portfolio 2. Thirdly, a supervised machine learning technique, the logistic regression method, was used to select stocks with a high probability of their price going up – portfolio 3. The predictive models were validated with metrics such as sensitivity (recall), specificity, and overall accuracy for all models. All accuracy measures were above 70%. All portfolios outperformed the market by more than eight times. The top three stocks were selected for each of the three stock portfolios and traded in the market for one month. After one month, the return for each stock portfolio was computed and compared with the stock market index returns. The returns were 23.87% for the principal component analysis stock portfolio, 11.65% for the logistic regression portfolio, and 8.88% for the K-means cluster portfolio, while the stock market performance was 0.38%. This study confirms that an automated stock investment system using machine learning techniques can identify top-performing stock portfolios that outperform the stock market.
Keywords: machine learning, stock market trading, logistic regression, cluster analysis, factor analysis, decision trees, neural networks, automated stock investment system
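A minimal sketch of the three portfolio-construction steps described above (clustering, principal component scoring, and logistic regression) is given below using scikit-learn; the feature matrix, labels, and thresholds are placeholders, not the study's actual data or parameters.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression

# Placeholder data: rows = stocks, columns = technical features (not the study's data).
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))           # technical features per stock
y = (rng.random(200) > 0.5).astype(int)  # 1 = price went up next period

# Portfolio 1: cluster stocks and keep the cluster with the best average outcome.
clusters = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)
best_cluster = max(range(3), key=lambda c: y[clusters == c].mean())
portfolio1 = np.where(clusters == best_cluster)[0]

# Portfolio 2: rank stocks on the first two principal component scores.
scores = PCA(n_components=2).fit_transform(X)
portfolio2 = np.argsort(scores[:, 0] + scores[:, 1])[-10:]

# Portfolio 3: keep stocks with a high predicted probability of going up.
proba = LogisticRegression(max_iter=1000).fit(X, y).predict_proba(X)[:, 1]
portfolio3 = np.where(proba > 0.7)[0]

print(len(portfolio1), len(portfolio2), len(portfolio3))
```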
Procedia PDF Downloads 158
1067 Evaluating Structural Crack Propagation Induced by Soundless Chemical Demolition Agent Using an Energy Release Rate Approach
Authors: Shyaka Eugene
Abstract:
The efficient and safe demolition of structures is a critical challenge in civil engineering and construction. This study focuses on the development of optimal demolition strategies by investigating the crack propagation behavior in beams induced by soundless cracking agents. Such agents are commonly used in controlled demolition and have gained prominence due to their non-explosive and environmentally friendly nature. This research employs a comprehensive experimental and computational approach to analyze crack initiation, propagation, and eventual failure in beams subjected to soundless cracking agents. Experimental testing involves the application of various cracking agents under controlled conditions to understand their effects on the structural integrity of beams. High-resolution imaging and strain measurements are used to capture the crack propagation process. In parallel, numerical simulations are conducted using advanced finite element analysis (FEA) techniques to model crack propagation in beams, considering various parameters such as cracking agent composition, loading conditions, and beam properties. The FEA models are validated against experimental results, ensuring their accuracy in predicting crack propagation patterns. The findings of this study provide valuable insights into optimizing demolition strategies, allowing engineers and demolition experts to make informed decisions regarding the selection of cracking agents, their application techniques, and structural reinforcement methods. Ultimately, this research contributes to enhancing the safety, efficiency, and sustainability of demolition practices in the construction industry, reducing environmental impact and ensuring the protection of adjacent structures and the surrounding environment.
Keywords: expansion pressure, energy release rate, soundless chemical demolition agent, crack propagation
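For reference, the energy release rate criterion underlying the fracture analysis can be written in its standard textbook form (not a result of this study):

```latex
G = -\frac{\partial \Pi}{\partial A},
\qquad \text{crack propagation when } G \ge G_{c},
% where \Pi is the total potential energy, A the crack area,
% and G_c the critical energy release rate of the material.
```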
Procedia PDF Downloads 63
1066 Practical Modelling of RC Structural Walls under Monotonic and Cyclic Loading
Authors: Reza E. Sedgh, Rajesh P. Dhakal
Abstract:
Shear walls have been used extensively as the main lateral force resisting systems in multi-storey buildings. Recent developments in performance-based design urge practicing engineers to conduct nonlinear static or dynamic analysis to evaluate the seismic performance of multi-storey shear wall buildings by employing distinct analytical models suggested in the literature. For practical purposes, the application of macroscopic models to simulate the global and local nonlinear behavior of structural walls outweighs that of microscopic models. The required skill level, the computational time, and limited access to specialized RC finite element packages prevent the general application of the latter method in performance-based design or assessment of multi-storey shear wall buildings in design offices. Hence, this paper is organized to verify the capability of the nonlinear shell element in a commercially available package (SAP2000) in simulating the results of several specimens under monotonic and cyclic loads with the rather oversimplified cyclic material laws available in the analytical tool. The selection of constitutive models, the determination of the related parameters of the constituent material, and an appropriate nonlinear shear model are presented in detail. Adoption of the proposed simple model demonstrated that the predicted results follow the overall trend of the experimental force-displacement curve. Although the prediction of ultimate strength and the overall shape of the hysteresis loops agreed to some extent with experiment, the prediction of ultimate displacement (the point of significant strength degradation) remains challenging in some cases.
Keywords: analytical model, nonlinear shell element, structural wall, shear behavior
Procedia PDF Downloads 407
1065 Methodology of Automation and Supervisory Control and Data Acquisition for Restructuring Industrial Systems
Authors: Lakhoua Najeh
Abstract:
Introduction: In most situations, an already existing industrial system, conditioned by its history, its culture, and its context, faces difficulty when it must restructure itself in an organizational and technological environment in perpetual evolution. This is why all restructuring operations first of all require a diagnosis based on a functional analysis. After a presentation of the functionality of a supervisory system for complex processes, we present the concepts of industrial automation and supervisory control and data acquisition (SCADA). Methods: This global analysis exploits the various available documents on the one hand and, on the other hand, takes into consideration the various testimonies gathered through investigations, interviews, or collective workshops; it also builds on observations made during site visits and even on specific operations. The exploitation of this diagnosis then enables us to elaborate the restructuring project. Starting from the system analysis for the restructuring of industrial systems, and after a technical diagnosis based on visits, an analysis of the various technical and management documents as well as targeted interviews, a focus detailing the various levels of analysis has been carried out according to a general methodology. Results: Through its participative and systemic character, and by drawing on a broad consultation of human resources as well as documentary resources, the methodology adopted to contribute to the restructuring of industrial systems has led to the proposal of various innovative actions. These actions fall within the framework of the TQM approach, requiring the quantification of applicable parameters and a treatment that leverages the relevant information. The new management environment will enable us to institute an information and communication system, with the possibility of migration toward an ERP system. Conclusion: Technological advancements in process monitoring, control, and industrial automation over the past decades have contributed greatly to improving the productivity of virtually all industrial systems throughout the world. This paper tries to identify the principal characteristics of process monitoring, control, and industrial automation in order to provide tools to help in the decision-making process.
Keywords: automation, supervision, SCADA, TQM
Procedia PDF Downloads 179
1064 Ability of Gastric Enzyme Extract of Adult Camel to Clot Bovine Milk
Authors: Boudjenah-Haroun Saliha, Isselnane Souad, Nouani Abdelwahab, Baaissa Babelhadj, Mati Abderrahmane
Abstract:
Algeria is experiencing significant development of its dairy sector, where consumption of milk and milk products increased from 2.7 million tons in 2008 to 4,400,000 tons in 2013, and cheese production reached 1640 tons in 2014 with an average consumption of 0.7 kg/person/year. Although rennet is still the most widely used coagulating enzyme in cheese making, its production has been facing a growing worldwide shortage. This shortage is primarily due to the steady increase in the production and consumption of cheese and the inability to increase the production of rennet in parallel; it has caused very large fluctuations in its price. To overcome these obstacles, much research has been undertaken to find effective and competitive substitutes that can be used industrially. For this, the selection of a locally produced rennet substitute is desirable: it would allow a permanent supply with limited dependence on imports and price fluctuations. Investigations conducted by our research team showed that coagulating extracts from the stomachs of older camels are characterized by a higher coagulating power than those from younger camels. The objective of this work is to study the possibility of substituting commercial rennet with gastric enzymes from adult camels for the coagulation of bovine milk. The crude coagulating extracts obtained from camels are characterized through their protein content and their clotting and proteolytic activities. The conditions for milk clotting by the action of these extracts were optimized. The clotting time of milk treated with the enzyme preparations under different conditions was calculated, with bovine rennet used for comparison. The results show that crude gastric extracts from adult camels can be a good substitute for bovine rennet.
Keywords: Algeria, camel, cheese, coagulation, gastric extracts, milk
Procedia PDF Downloads 441
1063 Integrated Risk Management in The Supply Chain of Essential Medicines in Zambia
Authors: Mario M. J. Musonda
Abstract:
Access to health care is a human right, which includes having timely access to affordable and quality essential medicines at the right place and in sufficient quantity. However, inefficient public sector supply chain management contributes to constant shortages of essential medicines at health facilities. The literature review involved a desktop study of published research studies and reports on risk management, the supply chain management of essential medicines, and their integration to increase the efficiency of the latter. The research was conducted on a sample population of offices under the Ministry of Health Headquarters, Lusaka Provincial and District Offices, selected health facilities in Lusaka, Medical Stores Limited, the Zambia Medicines Regulatory Authority, and Cooperating Partners. Individuals involved in the study were selected judgmentally according to their functions in the selection and quantification, regulation, procurement, storage, distribution, quality assurance, and dispensing of essential medicines. Structured interviews and discussions were held with selected experts, and self-administered questionnaires were distributed. Data were collected and analysed from the 35 returned and usable questionnaires out of the 50 distributed. The highest prioritised risks were inadequate and inconsistent fund disbursements, weak information management systems, weak quality management systems, and insufficient resources (HR and infrastructure), among others. The results of this research can be used to increase the efficiency of the public sector supply chain of essential medicines and other pharmaceuticals. The results of the study showed that there is a need for participating institutions and organisations to implement effective risk management systems in order to increase the efficiency of the entire supply chain and to avoid and/or reduce shortages of essential medicines at health facilities.
Keywords: essential medicine, risk assessment, risk management, supply chain, supply chain risk management
Procedia PDF Downloads 445
1062 A Method for Multimedia User Interface Design for Mobile Learning
Authors: Shimaa Nagro, Russell Campion
Abstract:
Mobile devices are becoming ever more widely available, with growing functionality, and are increasingly used as an enabling technology to give students access to educational material anytime and anywhere. However, the design of educational material user interfaces for mobile devices is beset by many unresolved research issues, such as those arising from emphasising the information concepts and then mapping this information to appropriate media (modelling the information, then mapping the media effectively). This report describes a multimedia user interface design method for mobile learning. The method covers the specification of user requirements and information architecture, media selection to represent the information content, design for directing attention to important information, and interaction design to enhance user engagement based on Human-Computer Interaction (HCI) design strategies. The method will be evaluated through three different case studies to prove that it is suitable for application to different areas/applications. These are: an application to teach major computer networking concepts; an application to deliver a history-based topic (after these two case studies have been completed, the method will be revised to remove deficiencies and then used to develop the third case study); and an application to teach mathematical principles. At that point, the method will again be revised into its final format. A usability evaluation will be carried out to measure the usefulness and effectiveness of the method. The investigation will combine qualitative and quantitative methods, including interviews and questionnaires for data collection and the three case studies for validating the MDMLM method. The researcher has successfully produced the method at this point, and it is now under validation and testing procedures. From this point forward in the report, the researcher will refer to the method using the abbreviation MDMLM, which stands for Multimedia Design Mobile Learning Method.
Keywords: human-computer interaction, interface design, mobile learning, education
Procedia PDF Downloads 247
1061 Transformation of Hexagonal Cells into Auxetic in Core Honeycomb Furniture Panels
Authors: Jerzy Smardzewski
Abstract:
Structures with negative Poisson's ratios are called auxetic. They are characterized by better mechanical properties than conventional structures, especially shear strength, the ability to absorb energy, and increased strength during bending, particularly in sandwich panels. Commonly used paper cores of cellular boards are made of hexagonal cells. With isotropic facings, these cells provide isotropic properties for the entire furniture board. Shelves made of such panels, with a thickness similar to standard chipboards, do not provide adequate stiffness and strength for the furniture. However, it is possible to transform the shape of the hexagonal cells into polyhedral auxetic cells that improve the mechanical properties of the core. The work aimed to transform the hexagonal cells of the paper core into auxetic cells and to determine their basic mechanical properties. Using numerical methods, it was decided to design the most favorable cell proportions, distinguished by the lowest Poisson's ratio and the highest modulus of linear elasticity. Standard cores for cellular boards, commonly used to produce 34 mm thick furniture boards, were used for the tests. Poisson's ratios, bending strength, and linear elasticity moduli were determined for such cores and boards. Then, the cells were transformed into auxetic structures, and analogous cellular boards were made, for which the mechanical properties were determined. The results of numerical simulations, in which the variable parameters were the dimensions of the cell walls, the wall inclination angles, and the relative cell density, are presented in the later part of the paper. Experimental tests and numerical simulations showed the beneficial effect of auxeticization on the mechanical quality of furniture panels and allowed for the selection of the optimal shape of the auxetic core cells.
Keywords: auxetics, honeycomb, panels, simulation, experiment
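To illustrate why the wall inclination angle controls the sign of the in-plane Poisson's ratio, the standard Gibson-Ashby cell-mechanics relation for honeycombs is recalled below; this is a textbook expression quoted here as background, not a formula taken from the paper:

```latex
\nu_{12} = \frac{\cos^{2}\theta}{\left(\dfrac{h}{l} + \sin\theta\right)\sin\theta},
% where h is the vertical wall length, l the inclined wall length and
% \theta the wall inclination angle; for re-entrant (negative \theta) cells
% the expression becomes negative, i.e. auxetic.
```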
Procedia PDF Downloads 14
1060 A 3D Numerical Environmental Modeling Approach For Assessing Transport of Spilled Oil in Porous Beach Conditions under a Meso-Scale Tank Design
Authors: J. X. Dong, C. J. An, Z. Chen, E. H. Owens, M. C. Boufadel, E. Taylor, K. Lee
Abstract:
Shorelines are vulnerable to significant environmental impacts from oil spills. Stranded oil can cause short- to long-term detrimental effects along beaches, including injuries to the ecosystem and to socio-economic and cultural resources. In this study, a three-dimensional (3D) numerical modeling approach is developed to evaluate the fate and transport of spilled oil for hypothetical oiled shoreline cases under various combinations of beach geomorphology and environmental conditions. The developed model estimates the spatial and temporal distribution of spilled oil for the various test conditions, using the finite volume method and considering the physical transport (dispersion and advection), sinks, and sorption processes. The model includes a user-friendly interface for data input on variables such as beach properties, environmental conditions, and the physical-chemical properties of the spilled oil. An experimental meso-scale tank design was used to test the developed model for dissolved petroleum hydrocarbons within shorelines. The simulated results for the effects of different sediment substrates, oil types, and shoreline features on the transport of spilled oil are comparable to those obtained with a commercially available model. Results show that the properties of the substrates and the oil removal by shoreline effects have significant impacts on oil transport in the beach area. Sensitivity analysis for the 3D model, through the application of the one-step-at-a-time (OAT) method, identified hydraulic conductivity as the most sensitive parameter. The 3D numerical model allows users to examine the behavior of oil on and within beaches, assess potential environmental impacts, and provide technical support for decisions related to shoreline clean-up operations.
Keywords: dissolved petroleum hydrocarbons, environmental multimedia model, finite volume method, sensitivity analysis, total petroleum hydrocarbons
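A minimal sketch of the one-at-a-time (OAT) sensitivity procedure mentioned above is shown below; the surrogate model function and parameter values are illustrative placeholders, not the actual 3D finite volume transport model.

```python
def oat_sensitivity(model, baseline: dict, perturbation: float = 0.10) -> dict:
    """Perturb each parameter one at a time and record the relative change in output."""
    base_out = model(**baseline)
    sensitivity = {}
    for name, value in baseline.items():
        perturbed = dict(baseline)
        perturbed[name] = value * (1.0 + perturbation)
        sensitivity[name] = abs(model(**perturbed) - base_out) / abs(base_out)
    return sensitivity

# Illustrative surrogate response (not the actual solver).
def surrogate(hydraulic_conductivity, porosity, sorption_coeff):
    return hydraulic_conductivity ** 1.5 * porosity ** 0.3 / (1.0 + sorption_coeff)

print(oat_sensitivity(surrogate,
                      {"hydraulic_conductivity": 1e-4, "porosity": 0.35, "sorption_coeff": 0.8}))
```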
Procedia PDF Downloads 218
1059 Magnetohemodynamic of Blood Flow Having Impact of Radiative Flux Due to Infrared Magnetic Hyperthermia: Spectral Relaxation Approach
Authors: Ebenezer O. Ige, Funmilayo H. Oyelami, Joshua Olutayo-Irheren, Joseph T. Okunlola
Abstract:
Hyperthermia therapy is an adjuvant procedure during which perfused body tissue is subjected to an elevated range of temperatures in a bid to achieve improved drug potency and efficacy of cancer treatment. While one class of hyperthermia techniques rests on the thermal radiation derived from a single-source electro-radiation measure, there are deliberations on conjugating dual radiation field sources in an attempt to improve the delivery of the therapy procedure. This paper numerically explores the thermal effectiveness of combined infrared hyperthermia with nanoparticle recirculation in the vicinity of an imposed magnetic field on the subcutaneous strata of a model lesion as an ablation scheme. An elaborate spectral relaxation method (SRM) was formulated to handle the equations of coupled momentum and thermal equilibrium in the blood-perfused tissue domain of a spongy fibrous tissue. Thermal diffusion regimes in the presence of an externally imposed magnetic field were described by leveraging the well-known Rosseland diffusion approximation to delineate the impact of radiative flux within the computational domain. The contribution of tissue sponginess was examined using the mechanics of pore-scale porosity over a selection of clinically informed scenarios. Our observations showed that, for a substantial depth of the spongy lesion, the magnetic field architecture constitutes the controlling regime of hemodynamics at the blood-tissue interface while facilitating thermal transport across the depth of the model lesion. This parameter-indicator could be utilized to control the dispensing of hyperthermia treatment in intravenously perfused tissue.
Keywords: spectral relaxation scheme, thermal equilibrium, Rosseland diffusion approximation, hyperthermia therapy
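The Rosseland diffusion approximation invoked in the abstract expresses the radiative heat flux in its standard textbook form (independent of the specific lesion model used here):

```latex
q_{r} = -\frac{4\sigma^{*}}{3k^{*}}\,\frac{\partial T^{4}}{\partial y},
% where \sigma^{*} is the Stefan-Boltzmann constant and k^{*} the mean absorption
% coefficient; T^{4} is often linearized about the free-stream temperature as
% T^{4} \approx 4T_{\infty}^{3}T - 3T_{\infty}^{4}.
```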
Procedia PDF Downloads 1191058 Transient Phenomena in a 100 W Hall Thrusters: Experimental Measurements of Discharge Current and Plasma Parameter Evolution
Authors: Clémence Royer, Stéphane Mazouffre
Abstract:
Nowadays, electric propulsion systems play a crucial role in space exploration missions due to their high specific impulse and long operational life. Hall thrusters are one of the most mature EP technologies: a gridless ion thruster that has proved reliable and high-performing for decades in various space missions. Operation of a Hall thruster relies on electron emission from a cathode placed outside a hollow dielectric channel that includes an anode at the back. The negatively charged particles are trapped in a magnetic field and efficiently slowed down; through collisions, the electron cloud ionizes xenon atoms. A large electric field is generated in the axial direction due to the low electron transverse mobility in the region of strong magnetic field. Positive particles are pulled out of the chamber at high velocity and are neutralized directly in the exhaust area. This phenomenon leads to the acceleration of the spacecraft system at a high specific impulse. While the architecture and operating principle of Hall thrusters are relatively simple, the physics behind the thrust is complex and still partly unknown. Current and voltage oscillations, as well as electron properties, have been captured over a 30 min period after ignition. The observed low-frequency oscillations exhibited specific frequency ranges, amplitudes, and stability patterns. Correlations between the oscillations and the plasma characteristics were analyzed. The impact of these instabilities on thruster performance, including thrust efficiency, has been evaluated as well. Moreover, strategies for mitigating and controlling these instabilities have been developed, such as filtering. In this contribution, in addition to presenting a summary of the results obtained in the transient regime, we will present and discuss recent advances in Hall thruster plasma discharge filtering and control.
Keywords: electric propulsion, Hall thruster, plasma diagnostics, low-frequency oscillations
Procedia PDF Downloads 911057 A Deluge of Disaster, Destruction, Death and Deception: Negative News and Empathy Fatigue in the Digital Age
Authors: B. N. Emenyeonu
Abstract:
Initially identified as sensationalism in the eras of yellow journalism and tabloidization, the inclusion of news which shocks or provokes strong emotional responses among readers, viewers, and browsers has not only remained a persistent feature of journalism but has also seemingly escalated in the current climate of digital and social media. Whether in the relentless revelation of scandals in high places, profiles of people displaced by sporadic wars or natural disasters, gruesome accounts of trucks plowing into pedestrians in a city centre, or the coverage of mourners paying tribute to victims of a mass shooting, mainstream and digital media are often awash with tragedy, tears, and trauma. While it may aim at inspiring sympathy, outrage, or even remedial reactions, it would appear that the deluge of grief and misery in the news merely generates in the audience a feeling that borders on hearing or seeing too much to care or act. This feeling also appears to be accentuated by the dizzying diffusion of social media news and views, much of whose authenticity is not easily verifiable. Through a survey of 400 regular consumers of news and in-depth interviews with 10 news managers in selected media organizations across the Middle East, this study therefore investigates public attitudes to the profusion of bad news in mainstream and digital media. Among other targets, it examines whether the profusion of bad news generates empathy fatigue among the audience and, if so, whether there is any association between biographic variables (profession, age, and gender) and an inclination to empathy fatigue. It also seeks to identify which categories of bad news and media are most likely to drag the audience into indifference. In conclusion, the study discusses the implications of the findings for mass-mediated advocacies such as campaigns against conflicts, corruption, nuclear threats, terrorism, gun violence, sexual crimes, and human trafficking, among other threats to humanity.
Keywords: digital media, empathy fatigue, media campaigns, news selection
Procedia PDF Downloads 61
1056 Dielectric Response Analysis Measurement for Diagnostic Oil-Paper Insulation System on Aged Inter Bus Transformer 3x10 MVA
Authors: Eki Farlen, Akas
Abstract:
Condition assessment of oil-paper-insulated power transformers, particularly of their water content, is becoming increasingly important for aged transformers. As the insulation ages, it can produce water, which reduces its dielectric strength, accelerates the cellulose ageing process, and causes gas bubbles to form at high temperatures. This paper mainly assesses the condition of the oil-paper insulation system of an Inter Bus Transformer (IBT) 30 MVA, 150/30 kV at the PT PLN Jelok Substation, which has been operating for 41 years, since 1974. Valuable information about the condition of high voltage insulation may be obtained by measuring its dielectric response. This paper describes in detail the interpretation of Dielectric Response Analysis (DIRANA) measurements, and the test results are compared to other insulation tests, such as the tan delta test, oil characteristics test, and Dissolved Gas Analysis (DGA) test, to obtain deeper diagnostic information. The paper mainly discusses the relationships between moisture content, water content, acidity, oil conductivity, and dissipation factor. The results and analysis show that phases U and W of the IBT 30 MVA Jelok have been ageing due to a high acidity level (>0.2 mg KOH/g), which causes high moisture in the cellulose/paper, in the wet category at about 4.7% and 5%, and a water content in oil of about 3.13 ppm and 3.33 ppm at a temperature of 20°C. A high acidity level can drive the oxidation process and produce water and particles in the paper, which can decrease the Interfacial Tension (IFT) below 22 mN/m (poor category) for both phases U and W. Even though the paper insulation of the transformer is in a wet condition, the dissipation factor and capacitance at the same frequency (50 Hz) from both the DIRANA and tangent delta measurements give almost the same results, 0.69% and 0.71% (<1%); this may be acceptable and need not be investigated further. The DGA results show that the TDCG is in the level one (1) condition and no key gases were found, which means that the transformers experienced no failures during operation such as arcing, partial discharge, or thermal faults in the oil or cellulose.
Keywords: diagnostic, inter-bus transformer, oil-paper insulation, moisture, dissipation factor
Procedia PDF Downloads 279
1055 3D Modeling Approach for Cultural Heritage Structures: The Case of Virgin of Loreto Chapel in Cusco, Peru
Authors: Rony Reátegui, Cesar Chácara, Benjamin Castañeda, Rafael Aguilar
Abstract:
Nowadays, heritage building information modeling (HBIM) is considered an efficient tool to represent and manage information on cultural heritage (CH). The basis of this tool is a 3D model generally obtained from a cloud-to-BIM procedure. There are different methods to create an HBIM model, ranging from manual modeling based on the point cloud to the automatic detection of shapes and the creation of objects. The selection of these methods depends on the desired level of development (LOD), level of information (LOI), and grade of generation (GOG), as well as on the availability of commercial software. This paper presents the 3D modeling of a stone masonry chapel using ReCap Pro, Revit, and the Dynamo interface, following a three-step methodology. The first step consists of the manual modeling of simple structural elements (e.g., regular walls, columns, floors, wall openings, etc.) and architectural elements (e.g., cornices, moldings, and other minor details) using the point cloud as a reference. Then, Dynamo is used for the generative modeling of complex structural elements such as vaults, infills, and domes. Finally, semantic information (e.g., materials, typology, state of conservation, etc.) and pathologies are added to the HBIM model as text parameters and generic model families, respectively. The application of this methodology allows the documentation of CH following a relatively simple process that ensures adequate LOD, LOI, and GOG levels. In addition, the easy implementation of the method, as well as the fact that only one BIM software package with its respective plugin is used for the scan-to-BIM modeling process, means that this methodology can be adopted by a larger number of users with intermediate knowledge and limited resources, since the BIM software used has a free student license.
Keywords: cloud-to-BIM, cultural heritage, generative modeling, HBIM, parametric modeling, Revit
Procedia PDF Downloads 145
1054 Body Types of Softball Players in the 39th National Games of Thailand
Authors: Nopadol Nimsuwan, Sumet Prom-in
Abstract:
The purpose of this study was to investigate the body types, size, and body composition of softball players in the 39th National Games of Thailand. The population of this study was 352 softball players who participated in the 39th National Games of Thailand, from which a sample size of 291 was determined using the Taro Yamane formula; selection was made with the stratified sampling method. The data collected were weight, height, arm length, leg length, chest circumference, mid-upper arm circumference, calf circumference, and subcutaneous fat in the upper arm area, the scapula area, the area above the pelvis, and the mid-calf area. The Keys and Brozek formula was used to calculate the fat quantity, the Kitagawa formula to calculate the muscle quantity, and the Heath and Carter method was used to determine the body type (somatotype) values. The results of the study can be concluded as follows. The average body type of the male softball players was endo-mesomorph, while the average body type of the female softball players was meso-endomorph. When considered according to playing position, it was found that the male softball players in every position had the endo-mesomorph body type, while the female softball players in every position had the meso-endomorph body type, except for the center fielder, who had the endo-ectomorph body type. The endo-mesomorph body type is suitable for male softball players, and the meso-endomorph body type is suitable for female softball players, because these body types are suitable for the five basic softball skills, which are gripping, throwing, catching, hitting, and base running. Thus, those involved in selecting softball players for sports competitions at different levels should consider factors related to the body type, size, and body composition of the players.
Keywords: body types, softball players, national games of Thailand, social sustainability
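The Taro Yamane sample-size formula referred to above takes the standard form below (the margin of error e is left unspecified, since the abstract does not report it):

```latex
n = \frac{N}{1 + N e^{2}},
% where N is the population size, n the sample size, and e the margin of error.
```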
Procedia PDF Downloads 487
1053 Transformer-Driven Multi-Category Classification for an Automated Academic Strand Recommendation Framework
Authors: Ma Cecilia Siva
Abstract:
This study introduces a Bidirectional Encoder Representations from Transformers (BERT)-based machine learning model aimed at improving educational counseling by automating the process of recommending academic strands for students. The framework is designed to streamline and enhance the strand selection process by analyzing students' profiles and suggesting suitable academic paths based on their interests, strengths, and goals. Data were gathered from a sample of 200 grade 10 students, which included personal essays and survey responses relevant to strand alignment. After thorough preprocessing, the text data were tokenized, label-encoded, and input into a fine-tuned BERT model set up for multi-label classification. The model was optimized for balanced accuracy and computational efficiency, featuring a multi-category classification layer with sigmoid activation for independent strand predictions. Performance metrics showed an F1 score of 88%, indicating a well-balanced model with precision at 80% and recall at 100%, demonstrating its effectiveness in providing reliable recommendations while reducing irrelevant strand suggestions. To facilitate practical use, the final deployment phase created a recommendation framework that processes new student data through the trained model and generates personalized academic strand suggestions. This automated recommendation system presents a scalable solution for academic guidance, potentially enhancing student satisfaction and alignment with educational objectives. The study's findings indicate that expanding the data set, integrating additional features, and refining the model iteratively could improve the framework's accuracy and broaden its applicability in various educational contexts.
Keywords: tokenized, sigmoid activation, transformer, multi category classification
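A minimal sketch of fine-tuning BERT for multi-label (multi-category) strand prediction with sigmoid outputs, as described above, could look like the following; the strand label set, model checkpoint, and threshold are assumptions for illustration, not the study's actual configuration.

```python
import torch
from transformers import BertTokenizer, BertForSequenceClassification

STRANDS = ["STEM", "ABM", "HUMSS", "GAS"]  # hypothetical label set, not from the study

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForSequenceClassification.from_pretrained(
    "bert-base-uncased",
    num_labels=len(STRANDS),
    problem_type="multi_label_classification",  # uses BCEWithLogitsLoss during training
)

texts = ["I enjoy solving math problems and building small programs."]
batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")

with torch.no_grad():
    logits = model(**batch).logits

# Independent per-strand probabilities via sigmoid activation.
probs = torch.sigmoid(logits)[0]
recommended = [s for s, p in zip(STRANDS, probs) if p > 0.5]
print({s: round(float(p), 2) for s, p in zip(STRANDS, probs)}, recommended)
```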
Procedia PDF Downloads 13
1052 Conceptualizing Clashing Values in the Field of Media Ethics
Authors: Saadia Izzeldin Malik
Abstract:
Lack of ethics is the crisis of the 21st century. Today's global world is filled with economic, political, environmental, media/communication, and social crises that are all generated by the eroding fabric of the ethics and moral values that guide human decisions in all aspects of life. Our global world is guided by liberal Western democratic principles and liberal capitalist economic principles that define and reinforce each other. In economic terms, capitalism has turned the world's economic systems into one marketplace of ideas and products controlled by big multinational corporations that not only determine the conditions and terms of commodity production and commodity exchange between countries, but also transform the political economy of media systems around the globe. The citizen (read: the consumer) today is the target of persuasion by all types of media at a time when her/his interests should be, ethically and in principle, the basic significant factor in the selection of media content. It is very important, at this juncture of clashing media values (professional and commercial) and widespread ethical lapses of media organizations and media professionals, to think of a perspective that theorizes these conflicting values within a broader framework of media ethics. Thus, the aim of this paper is to, epistemologically, bring to the center a perspective on media ethics as a basis for the reconciliation of the clashing values of the media. The paper focuses on conflicting ethical values in the current media debate, namely ownership of media vs. press freedom, the individual right to privacy vs. the public right to know, and global Western consumerism values vs. media values. The paper concludes that a framework to reconcile the conflicting values of media ethics should focus on the "individual" journalist and his/her moral development, as well as on maintaining the ethical principles of the media as an institution with a primary social responsibility to the "public" it serves.
Keywords: ethics, media, journalism, social responsibility, conflicting values, global
Procedia PDF Downloads 495
1051 Slope Stability Analysis and Evaluation of Road Cut Slope in Case of Goro to Abagada Road, Adama
Authors: Ezedin Geta Seid
Abstract:
Slope failures are among the common geo-environmental natural hazards in the hilly and mountainous terrain of the world, causing damage to human life and the destruction of infrastructure. In Ethiopia, the demand for the construction of infrastructure, especially highways and railways, has increased in order to connect developmental centers. However, the failure of roadside slopes formed in difficult geographical locations is a major obstacle to this development. Hence, this study emphasized the stability analysis and performance evaluation of slope profiles (single slope, multi-slope, and benched slope). The analysis was conducted for static and dynamic loading conditions using the limit equilibrium method (Slide software) and the finite element method (Plaxis software). The analysis results at selected critical sections show that the slope is marginally stable, with FS varying from 1.2 to 1.5 under static conditions, and unstable, with FS below 1, under dynamic conditions. From the comparison of analysis methods, the finite element method provides more valuable information about the failure surface of a slope than limit equilibrium analysis. The performance evaluation of geometric profiles shows that geometric modification provides better and more economical slope stability. Benching provides significant stability for cut slopes (i.e., the use of 2 m and 3 m benches improves the factor of safety by 7.5% and 12%, respectively, relative to a single slope profile); the method is more effective on steep slopes. Similarly, the use of a multi-slope profile improves the stability of the slope in stratified soil of varied strength, and its performance is more significant when it is used in combination with benches. The study also recommends drainage control and slope reinforcement as remedial measures for cut slopes.
Keywords: slope failure, slope profile, bench slope, multi slope
Procedia PDF Downloads 34
1050 Modelling the Impact of Installation of Heat Cost Allocators in District Heating Systems Using Machine Learning
Authors: Danica Maljkovic, Igor Balen, Bojana Dalbelo Basic
Abstract:
Following the EU Directive on Energy Efficiency, specifically Article 9, individual metering in district heating systems had to be introduced by the end of 2016. These provisions have been implemented in the member states' legal frameworks; Croatia is one of these states. The directive allows the installation of both heat metering devices and heat cost allocators. Mainly due to poor communication and PR, a false image was created among the general public that heat cost allocators are devices that save energy. Since this notion is wrong, the aim of this work is to develop a model that would precisely express the influence of the installation of heat cost allocators on potential energy savings in each unit within multifamily buildings. At the same time, in recent years the science of machine learning has gained wider application in various fields, as it has proven to give good results in cases where large amounts of data are to be processed with the aim of recognizing patterns and the correlation of each relevant parameter, as well as in cases where the problem is too complex for human intelligence to solve. A particular machine learning method, the decision tree method, has demonstrated an accuracy of over 92% in predicting general building consumption. In this paper, machine learning algorithms will be used to isolate the sole impact of the installation of heat cost allocators on a single building in multifamily houses connected to district heating systems. Special emphasis will be given to regression analysis, logistic regression, support vector machines, decision trees, and the random forest method.
Keywords: district heating, heat cost allocator, energy efficiency, machine learning, decision tree model, regression analysis, logistic regression, support vector machines, decision trees and random forest method
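A minimal sketch of the decision-tree and random-forest regression setup described above is given below with scikit-learn; the feature set and the synthetic consumption data are placeholders, not the actual district heating billing data.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeRegressor

# Placeholder features per apartment unit (not actual billing data):
# [heated area m2, mean outdoor temp degC, allocator installed (0/1), building age yr]
rng = np.random.default_rng(1)
X = np.column_stack([
    rng.uniform(30, 120, 1000),
    rng.uniform(-10, 15, 1000),
    rng.integers(0, 2, 1000),
    rng.uniform(5, 60, 1000),
])
# Synthetic annual consumption with a modest allocator effect, for illustration only.
y = 120 * X[:, 0] - 40 * X[:, 1] - 300 * X[:, 2] + rng.normal(0, 500, 1000)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
tree = DecisionTreeRegressor(max_depth=6, random_state=0).fit(X_tr, y_tr)
forest = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print("tree R2:", tree.score(X_te, y_te), "forest R2:", forest.score(X_te, y_te))
```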
Procedia PDF Downloads 252
1049 Optimization of Titanium Leaching Process Using Experimental Design
Authors: Arash Rafiei, Carroll Moore
Abstract:
The leaching process, as the first stage of hydrometallurgy, is a multidisciplinary system involving material properties, chemistry, reactor design, mechanics, and fluid dynamics. Therefore, optimizing a leaching system by purely scientific methods requires a great deal of time and expense. In this work, a mixture of two titanium ores and one titanium slag is used for extracting titanium in the leaching stage of the TiO2 pigment production process. Optimum titanium extraction can be obtained from the following strategies: i) maximizing titanium extraction without selective digestion; and ii) optimizing selective titanium extraction by balancing maximum titanium extraction against minimum impurity digestion. The main difference between the two strategies lies in the process optimization framework. In the first strategy, the most important stage of the production process is treated as the main stage, and the rest of the stages are adapted with respect to it. The second strategy optimizes the performance of more than one stage at once. The second strategy is technically more complex than the first, but it brings more economic and technical advantages to the leaching system. Obviously, each strategy has its own optimum operational zone, which is not the same as that of the other, and the best operational zone is chosen based on the complexity and the economic and practical aspects of the leaching system. The experimental design has been carried out using the Taguchi method. The most important advantages of this methodology are that it involves the different technical aspects of the leaching process; it minimizes the number of needed experiments as well as time and expense; and it accounts for the role of parameter interactions through the principles of multifactor-at-a-time optimization. Leaching tests have been done at batch scale in the laboratory with appropriate temperature control. The leaching tank geometry has been considered an important factor in providing comparable agitation conditions. Data analysis has been done using reactor design and mass balancing principles. Finally, the optimum zones for the operational parameters are determined for each leaching strategy and discussed with respect to their economic and practical aspects.
Keywords: titanium leaching, optimization, experimental design, performance analysis
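Taguchi analysis ranks factor settings by a signal-to-noise ratio; for a larger-the-better response such as titanium extraction, the standard textbook form is (not a result from this work):

```latex
S/N = -10\,\log_{10}\!\left(\frac{1}{n}\sum_{i=1}^{n}\frac{1}{y_{i}^{2}}\right),
% where y_i are the measured extractions of the n replicates of a given trial.
```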
Procedia PDF Downloads 375
1048 Hybridization of Manually Extracted and Convolutional Features for Classification of Chest X-Ray of COVID-19
Authors: M. Bilal Ishfaq, Adnan N. Qureshi
Abstract:
COVID-19 is the most infectious disease of recent times; it was first reported in Wuhan, the capital city of Hubei Province in China, and then spread rapidly throughout the whole world. On 11 March 2020, the World Health Organisation (WHO) declared it a pandemic. Since COVID-19 is highly contagious, it has affected approximately 219 million people worldwide and caused 4.55 million deaths. It has brought the importance of accurate diagnosis of respiratory diseases such as pneumonia and COVID-19 to the forefront. In this paper, we propose a hybrid approach for the automated detection of COVID-19 using medical imaging, based on the hybridization of manually extracted and convolutional features. Our approach combines Haralick texture features and convolutional features extracted from chest X-rays and CT scans. We also employ a minimum redundancy maximum relevance (MRMR) feature selection algorithm to reduce computational complexity and enhance classification performance. The proposed model is evaluated on four publicly available datasets, including Chest X-ray Pneumonia, COVID-19 Pneumonia, COVID-19 CTMaster, and VinBig data. The results demonstrate high accuracy and effectiveness, with 0.9925 on the Chest X-ray Pneumonia dataset, 0.9895 on the COVID-19, Pneumonia and Normal Chest X-ray dataset, 0.9806 on the COVID CTMaster dataset, and 0.9398 on the VinBig dataset. We further evaluate the effectiveness of the proposed model using ROC curves, where the AUC for the best-performing model reaches 0.96. Our proposed model provides a promising tool for the early detection and accurate diagnosis of COVID-19, which can assist healthcare professionals in making informed treatment decisions and improving patient outcomes. The results of the proposed model are quite plausible, and the system can be deployed in a clinical or research setting to assist in the diagnosis of COVID-19.
Keywords: COVID-19, feature engineering, artificial neural networks, radiology images
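A minimal sketch of extracting Haralick-style texture features from a gray-level co-occurrence matrix, as used on the chest images above, is shown below with scikit-image; the synthetic image and the chosen property set are illustrative and not the exact feature configuration of the paper.

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops

# Synthetic 8-bit "X-ray" patch for illustration (replace with a real image).
rng = np.random.default_rng(0)
image = rng.integers(0, 256, size=(128, 128), dtype=np.uint8)

# Co-occurrence matrix over 4 orientations at distance 1, then Haralick-style properties.
glcm = graycomatrix(image, distances=[1], angles=[0, np.pi/4, np.pi/2, 3*np.pi/4],
                    levels=256, symmetric=True, normed=True)
features = {prop: graycoprops(glcm, prop).mean()
            for prop in ("contrast", "correlation", "energy", "homogeneity")}
print(features)
```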
Procedia PDF Downloads 76
1047 Modeling of Bipolar Charge Transport through Nanocomposite Films for Energy Storage
Authors: Meng H. Lean, Wei-Ping L. Chu
Abstract:
The effects of ferroelectric nanofiller size, shape, loading, and polarization on bipolar charge injection, transport, and recombination through amorphous and semicrystalline polymers are studied. A 3D particle-in-cell model extends the classical electrical double layer representation to treat ferroelectric nanoparticles. Metal-polymer charge injection assumes Schottky emission and Fowler-Nordheim tunneling, migration through field-dependent Poole-Frenkel mobility, and recombination with Monte Carlo selection based on collision probability. A boundary integral equation method is used for the solution of the Poisson equation, coupled with a second-order predictor-corrector scheme for robust time integration of the equations of motion. The stability criterion of the explicit algorithm conforms to the Courant-Friedrichs-Lewy limit. Trajectories of charges that make it through the film are curvilinear paths that meander through the interspaces. Results indicate that charge transport behavior depends on nanoparticle polarization, with anti-parallel orientation showing the highest leakage conduction and the lowest level of charge trapping in the interaction zone. The simulation prediction of a size range of 80 to 100 nm to minimize attachment and maximize conduction is validated by theory. Attached charge fractions go from 2.2% to 97% as the nanofiller size is decreased from 150 nm to 60 nm. The computed conductivity of 0.4 x 10⁻¹⁴ S/cm is in agreement with published data for plastics. Charge attachment increases with spheroids due to the increase in surface area, and especially so for oblate spheroids, showing the influence of larger cross-sections. Charge attachment to nanofillers and nanocrystallites increases with vol.% loading or degree of crystallinity, and saturates at about 40 vol.%.
Keywords: nanocomposites, nanofillers, electrical double layer, bipolar charge transport
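For reference, the standard textbook forms of the injection and transport laws named in the abstract are given below, with symbols as usually defined; these are generic expressions, not parameter values from the simulations:

```latex
J_{\mathrm{Schottky}} = A^{*} T^{2}
  \exp\!\left(-\frac{q\left(\phi_{B} - \sqrt{qE/(4\pi\varepsilon)}\right)}{k_{B}T}\right),
\qquad
\mu_{\mathrm{PF}}(E) = \mu_{0}\exp\!\left(\frac{\beta_{\mathrm{PF}}\sqrt{E}}{k_{B}T}\right),
% where A^{*} is the effective Richardson constant, \phi_B the injection barrier,
% E the local field, \varepsilon the permittivity, and \beta_{PF} the Poole-Frenkel coefficient.
```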
Procedia PDF Downloads 355
1046 Hybrid CNN-SAR and Lee Filtering for Enhanced InSAR Phase Unwrapping and Coherence Optimization
Authors: Hadj Sahraoui Omar, Kebir Lahcen Wahib, Bennia Ahmed
Abstract:
Interferometric Synthetic Aperture Radar (InSAR) coherence is a crucial parameter for accurately monitoring ground deformation and environmental changes. However, coherence can be degraded by various factors, such as temporal decorrelation, atmospheric disturbances, and geometric misalignments, limiting the reliability of InSAR measurements (Omar Hadj-Sahraoui et al., 2019). To address this challenge, we propose an innovative hybrid approach that combines artificial intelligence (AI) with advanced filtering techniques to optimize interferometric coherence in InSAR data. Specifically, we introduce a Convolutional Neural Network (CNN) integrated with the Lee filter to enhance the performance of radar interferometry. This hybrid method leverages the strength of CNNs to automatically identify and mitigate the primary sources of decorrelation, while the Lee filter effectively reduces speckle noise, improving the overall quality of the interferograms. We develop a deep learning-based model trained on multi-temporal and multi-frequency SAR datasets, enabling it to predict coherence patterns and enhance low-coherence regions. This hybrid CNN-SAR approach with Lee filtering significantly reduces noise and phase unwrapping errors, leading to more precise deformation maps. Experimental results demonstrate that our approach improves coherence by up to 30% compared to traditional filtering techniques, making it a robust solution for challenging scenarios such as urban environments, vegetated areas, and rapidly changing landscapes. Our method has potential applications in geohazard monitoring, urban planning, and environmental studies, offering a new avenue for enhancing InSAR data reliability through AI-powered optimization combined with robust filtering techniques.
Keywords: CNN-SAR, Lee filter, hybrid optimization, coherence, InSAR phase unwrapping, speckle noise reduction
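A minimal sketch of the classic Lee speckle filter used in the hybrid pipeline is shown below; the window size and the crude global noise-variance estimate are illustrative choices, not those of the paper.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def lee_filter(image: np.ndarray, window: int = 7) -> np.ndarray:
    """Classic Lee filter: weight each pixel between the local mean and the raw value."""
    img = image.astype(float)
    local_mean = uniform_filter(img, size=window)
    local_sq_mean = uniform_filter(img ** 2, size=window)
    local_var = local_sq_mean - local_mean ** 2
    noise_var = np.mean(local_var)              # crude global noise estimate (illustrative)
    weight = local_var / (local_var + noise_var + 1e-12)
    return local_mean + weight * (img - local_mean)

# Example on a synthetic speckled amplitude image.
rng = np.random.default_rng(0)
speckled = rng.gamma(shape=4.0, scale=25.0, size=(256, 256))
filtered = lee_filter(speckled)
print(speckled.std(), filtered.std())   # the standard deviation should drop after filtering
```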
Procedia PDF Downloads 14
1045 Effect of Cryogenic Pre-stretching on the Room Temperature Tensile Behavior of AZ61 Magnesium Alloy and Dominant Grain Growth Mechanisms During Subsequent Annealing
Authors: Umer Masood Chaudry, Hafiz Muhammad Rehan Tariq, Chung-soo Kim, Tea-sung Jun
Abstract:
This study explored the influence of pre-stretching temperature on the microstructural characteristics and deformation behavior of AZ61 magnesium alloy and its implications for grain growth during subsequent annealing. AZ61 alloy was stretched to 5% plastic strain along the rolling (RD) and transverse (TD) directions at room temperature (RT) and cryogenic temperature (-150 °C, CT), followed by annealing at 320 °C for 1 h, to investigate twinning and dislocation evolution and their consequent effects on flow stress, plastic strain, and strain hardening rate. Compared to RT-stretched samples, CT-stretched samples along RD and TD showed a significant improvement in yield stress and strain hardening rate and a moderate reduction in elongation to failure. Subsequent EBSD analysis revealed an increased fraction of fine {10-12} twins and the nucleation of multiple {10-12} twin variants, caused by higher local stress concentration at the grain boundaries in CT-stretched samples, as manifested by the kernel average misorientation. This higher twin fraction and the resulting twin-twin interactions imposed strengthening by restricting the mean free path of dislocations, leading to higher flow stress and strain hardening rate. During annealing of the RT/CT-stretched samples, the residual strain energy and twin boundaries decreased due to static recovery, leading to a coarse-grained, twin-free microstructure. Strain-induced boundary migration (SIBM) was found to be the predominant mechanism governing grain growth during annealing via the movement of high-angle grain boundaries.Keywords: magnesium, twinning, twinning variant selection, EBSD, cryogenic deformation
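The strengthening argument, twin boundaries shortening the dislocation mean free path, is commonly illustrated with a Hall-Petch-type relation. The sketch below uses purely illustrative constants and is not fitted to the AZ61 data reported here.

import numpy as np

def hall_petch_type_strength(sigma0, k, mean_free_path_um):
    """Hall-Petch-type estimate: sigma_y = sigma_0 + k / sqrt(lambda),
    where lambda is the effective dislocation mean free path (grain size or
    twin-boundary spacing). Denser {10-12} twins -> smaller lambda -> higher yield."""
    lam_m = mean_free_path_um * 1e-6
    return sigma0 + k / np.sqrt(lam_m)

# Illustrative values only (sigma0 in MPa, k in MPa*m^0.5)
sigma0, k = 60.0, 0.20
for lam in (20.0, 10.0, 5.0):   # effective spacing in micrometres
    print(f"lambda = {lam:4.1f} um -> sigma_y ~ {hall_petch_type_strength(sigma0, k, lam):.0f} MPa")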
Procedia PDF Downloads 68
1044 Globally Convergent Sequential Linear Programming for Multi-Material Topology Optimization Using Ordered Solid Isotropic Material with Penalization Interpolation
Authors: Darwin Castillo Huamaní, Francisco A. M. Gomes
Abstract:
The aim of multi-material topology optimization (MTO) is to obtain the optimal topology of structures composed of several materials, according to a given set of constraints and cost criteria. In this work, we seek the optimal distribution of materials in a domain such that the flexibility of the structure is minimized, under certain boundary conditions and the action of external forces. When only one material is considered, each element of the discretized domain is represented by a binary variable whose value is 1 if the element belongs to the structure and 0 if the element is empty. A common way to avoid the high computational cost of solving integer optimization problems is to adopt the Solid Isotropic Material with Penalization (SIMP) method. This method relies on a continuous power-law interpolation function whose base variable represents a pseudo-density at each point of the domain. For suitable exponent values, the SIMP method discourages intermediate densities, since values other than 0 or 1 usually have no physical meaning for the problem. Several extensions of the SIMP method have been proposed for the multi-material case. The one we explore here is the ordered SIMP method, which has the advantage of not adding variables to represent material selection, so the computational cost is independent of the number of materials considered. Although this algorithm does not increase the number of variables, the optimization subproblems generated at each iteration cannot be solved by methods that rely on second derivatives, due to the cost of computing them. To overcome this, we apply a globally convergent version of the sequential linear programming method, which solves a sequence of linear approximations of the optimization problem.Keywords: global convergence, multi-material design, ordered SIMP, sequential linear programming, topology optimization
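The SIMP power-law interpolation that the ordered method builds on has a simple closed form. Below is a minimal single-material sketch; the penalization exponent and moduli are assumed for illustration, and the ordered multi-material mapping itself is not shown.

import numpy as np

def simp_interpolation(rho, e_solid=1.0, e_min=1e-9, p=3.0):
    """SIMP power-law interpolation of Young's modulus:
    E(rho) = E_min + rho^p * (E_solid - E_min).
    For p > 1, intermediate pseudo-densities contribute little stiffness per
    unit material, which pushes the optimizer toward near-0/1 (void/solid) designs."""
    rho = np.clip(rho, 0.0, 1.0)
    return e_min + rho**p * (e_solid - e_min)

# Stiffness-to-material ratio: intermediate densities are penalized
for rho in (0.25, 0.5, 0.75, 1.0):
    print(rho, simp_interpolation(rho) / rho)   # ratio grows toward rho = 1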
Procedia PDF Downloads 315
1043 An Invasive Lessepsian Species, Golden-Banded Goatfish, Upeneus Moluccensis Population from Iskenderun Bay, the Eastern Mediterranean Sea, Türkiye, With Some Biological Notes: The Effects of Climate Differences and Opening of Suez Canal
Authors: Hatice Torcu Koc, Zeliha Erdogan
Abstract:
This study investigated the population structure of Upeneus moluccensis in order to provide further knowledge, to compare the data with earlier studies, and thereby to help in the management of the population in İskenderun Bay. For this purpose, a total of 370 golden-banded goatfish were caught monthly by a commercial vessel at depths of 50-60 m in İskenderun Bay during 2016-2018. The von Bertalanffy growth equation, length-weight relationships, sex ratio, age, condition factor, and gonadosomatic and hepatosomatic index values of the Upeneus moluccensis specimens were determined. Lengths and weights were measured with a dial caliper (0.05 mm precision) and a sensitive balance. Total lengths were 7.2-17.5 cm in females and 7.0-17.9 cm in males, while total weights ranged from 3.91 to 64.26 g in females and from 3.69 to 60.95 g in males. The length-weight relationship for all individuals was calculated as W = 0.004·L^3.38 (R² = 0.85). Growth parameters were determined as L∞ = 20.75 cm, k = 0.33, and t₀ = -0.56. Age readings were done using the Bhattacharya method; the population comprised three age groups (1-3). The sex ratio was 1:1.42, corresponding to 41.4% males and 58.6% females, in favor of females (p < 0.05). Average condition factor and hepatosomatic index values showed a similar pattern for males (1.088, 1.104) and females (1.124, 1.177), respectively. According to the GSI values, the spawning period started in March and intensified in April, May, and September. Total (Z), natural (M), and fishing (F) mortality rates were estimated as Z = 0.94 year⁻¹, M = 0.033 year⁻¹, and F = 0.63 year⁻¹, respectively. As the exploitation rate was estimated to be E = 0.67, the golden-banded goatfish stock appears to be affected by overfishing. The findings of this study are important for characterizing the population of U. moluccensis, which penetrated the eastern Mediterranean coast of Türkiye due to global warming and the opening of the Suez Canal, and they provide baseline data for future fisheries investigations.Keywords: biology, U. moluccensis, invasive, lessepsian, İskenderun Bay
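Using the growth and length-weight parameters reported in the abstract, the fitted relationships can be evaluated directly. The sketch below takes its parameter values from the abstract and recomputes the exploitation rate from the reported F and Z.

import numpy as np

# Parameters reported in the abstract for Upeneus moluccensis
L_INF, K, T0 = 20.75, 0.33, -0.56      # von Bertalanffy growth parameters
A, B = 0.004, 3.38                      # length-weight coefficients (W = a * L^b)
F, Z = 0.63, 0.94                       # fishing and total mortality, year^-1

def length_at_age(t):
    """von Bertalanffy growth: L(t) = L_inf * (1 - exp(-k * (t - t0))), in cm."""
    return L_INF * (1.0 - np.exp(-K * (t - T0)))

def weight_from_length(length_cm):
    """Length-weight relationship W = a * L^b, in grams."""
    return A * length_cm**B

for age in (1, 2, 3):                   # the three age groups observed
    L = length_at_age(age)
    print(f"age {age}: L = {L:.1f} cm, W = {weight_from_length(L):.1f} g")

# ~0.67, interpreted in the abstract as a sign of overfishing
print(f"exploitation rate E = F / Z = {F / Z:.2f}")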
Procedia PDF Downloads 74