OWASP Top 10 – 2021

There are three new categories, four categories with naming and scoping changes, and some consolidation in the Top 10 for 2021. We’ve changed names when necessary to focus on the root cause over the symptom.

Mapping

  • A01:2021-Broken Access Control moves up from the fifth position to the category with the most serious web application security risk; the contributed data indicates that on average, 3.81% of applications tested had one or more Common Weakness Enumerations (CWEs) with more than 318k occurrences of CWEs in this risk category. The 34 CWEs mapped to Broken Access Control had more occurrences in applications than any other category.
  • A02:2021-Cryptographic Failures shifts up one position to #2, previously known as A3:2017-Sensitive Data Exposure, which was a broad symptom rather than a root cause. The renewed name focuses on failures related to cryptography, as it has implicitly been before. This category often leads to sensitive data exposure or system compromise.
  • A03:2021-Injection slides down to the third position. 94% of the applications were tested for some form of injection with a max incidence rate of 19%, an average incidence rate of 3.37%, and the 33 CWEs mapped into this category have the second most occurrences in applications with 274k occurrences. Cross-site Scripting is now part of this category in this edition.
  • A04:2021-Insecure Design is a new category for 2021, with a focus on risks related to design flaws. If we genuinely want to “move left” as an industry, we need more threat modeling, secure design patterns and principles, and reference architectures. An insecure design cannot be fixed by a perfect implementation as by definition, needed security controls were never created to defend against specific attacks.
  • A05:2021-Security Misconfiguration moves up from #6 in the previous edition; 90% of applications were tested for some form of misconfiguration, with an average incidence rate of 4.5%, and over 208k occurrences of CWEs mapped to this risk category. With more shifts into highly configurable software, it’s not surprising to see this category move up. The former category for A4:2017-XML External Entities (XXE) is now part of this risk category.
  • A06:2021-Vulnerable and Outdated Components was previously titled Using Components with Known Vulnerabilities and is #2 in the Top 10 community survey, but also had enough data to make the Top 10 via data analysis. This category moves up from #9 in 2017 and is a known issue that we struggle to test and assess risk for. It is the only category not to have any Common Vulnerabilities and Exposures (CVEs) mapped to the included CWEs, so default exploit and impact weights of 5.0 are factored into their scores.
  • A07:2021-Identification and Authentication Failures was previously Broken Authentication and is sliding down from the second position, and now includes CWEs that are more related to identification failures. This category is still an integral part of the Top 10, but the increased availability of standardized frameworks seems to be helping.
  • A08:2021-Software and Data Integrity Failures is a new category for 2021, focusing on making assumptions related to software updates, critical data, and CI/CD pipelines without verifying integrity. This category has one of the highest weighted impacts from Common Vulnerabilities and Exposures/Common Vulnerability Scoring System (CVE/CVSS) data mapped to its 10 CWEs. A8:2017-Insecure Deserialization is now a part of this larger category.
  • A09:2021-Security Logging and Monitoring Failures was previously A10:2017-Insufficient Logging & Monitoring and is added from the Top 10 community survey (#3), moving up from #10 previously. This category is expanded to include more types of failures, is challenging to test for, and isn’t well represented in the CVE/CVSS data. However, failures in this category can directly impact visibility, incident alerting, and forensics.
  • A10:2021-Server-Side Request Forgery is added from the Top 10 community survey (#1). The data shows a relatively low incidence rate with above average testing coverage, along with above-average ratings for Exploit and Impact potential. This category represents the scenario where the security community members are telling us this is important, even though it’s not illustrated in the data at this time.
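To make A03:2021-Injection concrete, here is a minimal, self-contained sketch using Python's standard-library sqlite3 (the table and values are illustrative, not from any real application): the same attacker-controlled input leaks every row when concatenated into the SQL text, and matches nothing when bound as a parameter.

```python
# A03:2021-Injection in miniature: string-built SQL vs. a parameterized query.
# Uses sqlite3 from the standard library; table and rows are illustrative only.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin'), ('bob', 'user')")

user_input = "' OR '1'='1"

# Vulnerable: attacker-controlled input becomes part of the SQL text
vulnerable = conn.execute(
    f"SELECT name FROM users WHERE name = '{user_input}'"
).fetchall()

# Safe: the driver binds the value; the input is data, never SQL
safe = conn.execute(
    "SELECT name FROM users WHERE name = ?", (user_input,)
).fetchall()

print(vulnerable)  # every row leaks
print(safe)        # no row matches the literal string
```

The fix is the same idea in any driver: keep the SQL text constant and pass values through bind parameters.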

Cisco AnyConnect client profile sample

<?xml version="1.0" encoding="UTF-8"?>
<AnyConnectProfile xmlns="http://schemas.xmlsoap.org/encoding/" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://schemas.xmlsoap.org/encoding/ AnyConnectProfile.xsd">
  <ClientInitialization>
    <UseStartBeforeLogon UserControllable="true">true</UseStartBeforeLogon>
    <AutomaticCertSelection UserControllable="false">true</AutomaticCertSelection>
    <ShowPreConnectMessage>false</ShowPreConnectMessage>
    <CertificateStore>All</CertificateStore>
    <CertificateStoreMac>All</CertificateStoreMac>
    <CertificateStoreLinux>All</CertificateStoreLinux>
    <CertificateStoreOverride>true</CertificateStoreOverride>
    <ProxySettings>Native</ProxySettings>
    <AllowLocalProxyConnections>true</AllowLocalProxyConnections>
    <AuthenticationTimeout>30</AuthenticationTimeout>
    <AutoConnectOnStart UserControllable="true">false</AutoConnectOnStart>
    <MinimizeOnConnect UserControllable="true">true</MinimizeOnConnect>
    <LocalLanAccess UserControllable="true">false</LocalLanAccess>
    <DisableCaptivePortalDetection UserControllable="true">false</DisableCaptivePortalDetection>
    <ClearSmartcardPin UserControllable="false">true</ClearSmartcardPin>
    <IPProtocolSupport>IPv4,IPv6</IPProtocolSupport>
    <AutoReconnect UserControllable="false">true
      <AutoReconnectBehavior UserControllable="false">ReconnectAfterResume</AutoReconnectBehavior>
    </AutoReconnect>
    <SuspendOnConnectedStandby>false</SuspendOnConnectedStandby>
    <AutoUpdate UserControllable="false">true</AutoUpdate>
    <RSASecurIDIntegration UserControllable="false">Automatic</RSASecurIDIntegration>
    <WindowsLogonEnforcement>SingleLocalLogon</WindowsLogonEnforcement>
    <LinuxLogonEnforcement>SingleLocalLogon</LinuxLogonEnforcement>
    <WindowsVPNEstablishment>LocalUsersOnly</WindowsVPNEstablishment>
    <LinuxVPNEstablishment>LocalUsersOnly</LinuxVPNEstablishment>
    <AutomaticVPNPolicy>false</AutomaticVPNPolicy>
    <PPPExclusion UserControllable="false">Disable
      <PPPExclusionServerIP UserControllable="false"></PPPExclusionServerIP>
    </PPPExclusion>
    <EnableScripting UserControllable="false">false</EnableScripting>
    <EnableAutomaticServerSelection UserControllable="false">false
      <AutoServerSelectionImprovement>20</AutoServerSelectionImprovement>
      <AutoServerSelectionSuspendTime>4</AutoServerSelectionSuspendTime>
    </EnableAutomaticServerSelection>
    <RetainVpnOnLogoff>false</RetainVpnOnLogoff>
    <CaptivePortalRemediationBrowserFailover>false</CaptivePortalRemediationBrowserFailover>
    <AllowManualHostInput>true</AllowManualHostInput>
  </ClientInitialization>
</AnyConnectProfile>
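Profiles like the one above are easy to break with copy/paste damage (smart quotes, a dropped closing tag). A quick sanity check before deploying is to parse the file as XML; a minimal sketch with the standard library, using an inline stub profile rather than the full one:

```python
# Parse a profile as XML to catch malformed markup before deployment.
# Minimal sketch; the inline profile here is a stub, not the full sample.
import xml.etree.ElementTree as ET

profile = b"""<?xml version="1.0" encoding="UTF-8"?>
<AnyConnectProfile xmlns="http://schemas.xmlsoap.org/encoding/">
  <ClientInitialization>
    <AutoUpdate UserControllable="false">true</AutoUpdate>
  </ClientInitialization>
</AnyConnectProfile>"""

root = ET.fromstring(profile)  # raises ParseError on malformed XML
ns = {"ac": "http://schemas.xmlsoap.org/encoding/"}
auto_update = root.find(".//ac:AutoUpdate", ns)
print(auto_update.text)
```

This only checks well-formedness, not schema validity; validating against AnyConnectProfile.xsd would need an external library such as lxml or xmlschema.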

Generating a billion records in Cassandra

Generating a billion records in Cassandra can be accomplished using various methods, including using scripts, data generation tools, or custom applications. Below are some approaches you can take to generate a large dataset for testing or benchmarking purposes.
Method 1: Using CQLSH with a Python Script
You can use a Python script to generate and insert a billion records into Cassandra. This method uses the cassandra-driver library to connect to your Cassandra cluster.
Prerequisites
1. Install the Cassandra Driver for Python:
Make sure you have the Cassandra driver installed. You can install it using pip:
pip install cassandra-driver
2. Set Up Your Cassandra Keyspace and Table:
Create a keyspace and a table in Cassandra where you will insert the records.
CREATE KEYSPACE test_keyspace WITH REPLICATION = { 'class': 'SimpleStrategy', 'replication_factor': 1 };

CREATE TABLE test_keyspace.test_table (
id UUID PRIMARY KEY,
name TEXT,
age INT
);

Python Script to Generate Records
Here’s a sample Python script that generates and inserts a billion records:
from cassandra.cluster import Cluster
import uuid
import random

# Connect to Cassandra
cluster = Cluster(['127.0.0.1'])  # Replace with your Cassandra node IP
session = cluster.connect('test_keyspace')

# Prepare the insert statement
insert_stmt = session.prepare("INSERT INTO test_table (id, name, age) VALUES (?, ?, ?)")

# Generate and insert records
for i in range(1, 1000000001):  # 1 billion records
    record_id = uuid.uuid4()
    name = f"Name_{i}"
    age = random.randint(18, 99)

    session.execute(insert_stmt, (record_id, name, age))

    if i % 100000 == 0:  # Print progress every 100,000 records
        print(f"Inserted {i} records")

# Close the session and cluster connection
session.shutdown()
cluster.shutdown()
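One synchronous execute() per row is the bottleneck at this scale. The driver's cassandra.concurrent helpers keep many requests in flight at once; a sketch of the same load reworked around execute_concurrent_with_args (chunk size and concurrency are assumptions to tune, and the schema is the test_keyspace.test_table created above):

```python
# Sketch: pipeline inserts with execute_concurrent_with_args for throughput.
# Assumes the test_keyspace.test_table schema from above; tune chunk/concurrency.
import random
import uuid

def generate_rows(start, end):
    """Yield (id, name, age) tuples for row numbers start..end-1."""
    for i in range(start, end):
        yield (uuid.uuid4(), f"Name_{i}", random.randint(18, 99))

def insert_range(session, start, end, chunk=10_000):
    # Imported lazily so generate_rows is usable without the driver installed
    from cassandra.concurrent import execute_concurrent_with_args

    stmt = session.prepare("INSERT INTO test_table (id, name, age) VALUES (?, ?, ?)")
    for off in range(start, end, chunk):
        batch = list(generate_rows(off, min(off + chunk, end)))
        # Runs up to `concurrency` requests in parallel over the driver's
        # connections; far faster than one blocking execute() per row.
        execute_concurrent_with_args(session, stmt, batch, concurrency=100)
```

For a full billion rows you would also typically shard the (start, end) ranges across several processes or machines rather than running one loop.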

Method 2: Using Apache Spark
If you have Apache Spark set up, you can use it to generate and insert a large number of records into Cassandra efficiently.
Prerequisites
1. Set Up Spark with Cassandra Connector:
Make sure you have the Spark Cassandra Connector. You can include it in your Spark job using the following Maven dependency:

<dependency>
    <groupId>com.datastax.spark</groupId>
    <artifactId>spark-cassandra-connector_2.12</artifactId>
    <version>3.1.0</version>
</dependency>

Spark Job to Generate Records
Here’s a sample Spark job in Scala to generate and insert records:

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object GenerateRecords {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("Generate Records")
      .config("spark.cassandra.connection.host", "127.0.0.1") // Replace with your Cassandra node IP
      .getOrCreate()

    // Build the rows with spark.range, which distributes the work lazily across
    // executors. Materializing a billion-element Scala collection on the driver
    // (e.g. (1 to 1000000000).map(...)) would exhaust driver memory before
    // anything reached the cluster.
    val df = spark.range(1L, 1000000001L)
      .select(
        expr("uuid()").as("id"), // random UUID per row; the connector maps it to a Cassandra uuid
        concat(lit("Name_"), col("id")).as("name"),
        (floor(rand() * 82) + 18).cast("int").as("age"))

    // Write to Cassandra
    df.write
      .format("org.apache.spark.sql.cassandra")
      .options(Map("keyspace" -> "test_keyspace", "table" -> "test_table"))
      .mode("append")
      .save()

    spark.stop()
  }
}

Method 3: Using Data Generation Tools
You can also use data generation tools like:
  • Apache JMeter: create a test plan that generates data and inserts it into Cassandra.
  • Mockaroo: a web-based tool that generates large datasets in various formats, including CSV, which you can then import into Cassandra.
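The CSV route can also be scripted end to end: generate chunked CSV files in Python, then load each with cqlsh's COPY command. A sketch (the output filename and chunk size are arbitrary choices):

```python
# Generate a CSV chunk suitable for cqlsh COPY import into test_table.
# Filename and chunk size are arbitrary; write many chunks for a full load.
import csv
import random
import uuid

def write_chunk(path, start, end):
    """Write rows [start, end) as CSV with a header row."""
    with open(path, "w", newline="") as f:
        w = csv.writer(f)
        w.writerow(["id", "name", "age"])
        for i in range(start, end):
            w.writerow([uuid.uuid4(), f"Name_{i}", random.randint(18, 99)])

write_chunk("chunk_0.csv", 1, 100_001)  # 100k rows per file keeps COPY manageable
```

Each chunk can then be imported from cqlsh with something like `COPY test_keyspace.test_table (id, name, age) FROM 'chunk_0.csv' WITH HEADER = true;` (COPY is convenient for moderate volumes; for a true billion rows a bulk loader such as DSBulk scales better).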
Conclusion
Generating a billion records in Cassandra can be done using various methods, including Python scripts, Apache Spark, or data generation tools. Choose the method that best fits your environment and requirements. Always ensure that your Cassandra cluster is properly configured to handle the load, and monitor performance during the data generation process.

Cisco FPR1140 FTD smart license issue

If you receive a notification like the following:

Product Instance Failed to Connect – The device “UDI_PID:FPR-1140; UDI_SN:xxxxxxxxxxxx; ” in the virtual account “DEFAULT” has not connected within its renewal period, and may run in a degraded state if it does not connect within the next 1 day. If the device is not going to connect, you can remove it to immediately release the licenses it is consuming.

try logging in to the system and running:

root@fw:/home/admin# pmtool restartbyid tomcat
root@fw:/home/admin# pmtool status | grep "tomcat"

Then wait; it can take quite a while for the HTTP management portal to become accessible again.


Squid configuration

acl localnet src 192.168.0.0/16 # RFC 1918 local private network (LAN)
acl localnet src fc00::/7 # RFC 4193 local private network range
acl localnet src fe80::/10 # RFC 4291 link-local (directly plugged) machines

acl SSL_ports port 443
acl Safe_ports port 80 # http
acl Safe_ports port 21 # ftp
acl Safe_ports port 443 # https
acl Safe_ports port 70 # gopher
acl Safe_ports port 210 # wais
acl Safe_ports port 1025-65535 # unregistered ports
acl Safe_ports port 280 # http-mgmt
acl Safe_ports port 488 # gss-http
acl Safe_ports port 591 # filemaker
acl Safe_ports port 777 # multiling http

http_access deny !Safe_ports

http_access deny CONNECT !SSL_ports

http_access allow localhost manager
http_access deny manager

http_access allow localhost
acl hasRequest has request
logformat custom_log %{%Y-%m-%d %H:%M:%S}tl %>a:%>p %Ss/%03>Hs:%Sh "%rm %ru HTTP/%rv" %mt %>Hs %<st %tr "%{User-Agent}>h" "%{Referer}>h"
access_log /var/log/squid/access.log custom_log hasRequest

http_access deny to_localhost

http_access deny to_linklocal

http_access allow localnet
http_access deny all
http_port 3128
maximum_object_size 1 GB
cache_dir ufs /var/spool/squid 10240 16 256
cache_mem 256 MB
maximum_object_size_in_memory 4 MB
cache_replacement_policy heap LFUDA
range_offset_limit -1
quick_abort_min -1 KB

coredump_dir /var/spool/squid

refresh_pattern -i \.7z$ 300 90% 14320 reload-into-ims
refresh_pattern -i \.x03$ 300 90% 14320 reload-into-ims
refresh_pattern -i \.m30$ 300 90% 14320 reload-into-ims
refresh_pattern -i \.m35$ 300 90% 14320 reload-into-ims
refresh_pattern -i \.zip$ 300 90% 14320 reload-into-ims
refresh_pattern -i \.irn$ 300 90% 14320 reload-into-ims
refresh_pattern ^ftp: 1440 20% 10080
refresh_pattern ^gopher: 1440 0% 1440
refresh_pattern -i symantecliveupdate.com/.* 1440 90% 43200
refresh_pattern -i symantec.com/.* 1440 90% 43200
refresh_pattern -i (/cgi-bin/|\?) 0 0% 0
refresh_pattern . 0 20% 4320
httpd_suppress_version_string on
via off
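To confirm the custom_log fields line up as intended, you can parse a sample access-log line in Python. A sketch; the regex and the sample line below are mine, written to match the logformat above, not taken from a real log:

```python
# Parse one line of the custom_log format defined above.
# The regex and sample line are hand-written to match that format.
import re

LOG_RE = re.compile(
    r'(?P<ts>\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}) '      # %{...}tl
    r'(?P<client>\S+):(?P<port>\d+) '                     # %>a:%>p
    r'(?P<status>\S+)/(?P<code>\d{3}):(?P<hier>\S+) '     # %Ss/%03>Hs:%Sh
    r'"(?P<method>\S+) (?P<url>\S+) HTTP/(?P<ver>\S+)" '  # "%rm %ru HTTP/%rv"
    r'(?P<mime>\S+) (?P<hs>\d+) (?P<bytes>\d+) (?P<ms>\d+) '  # %mt %>Hs %<st %tr
    r'"(?P<ua>[^"]*)" "(?P<referer>[^"]*)"'               # "%{User-Agent}>h" "%{Referer}>h"
)

sample = ('2024-08-07 10:15:30 192.168.1.50:52311 TCP_MISS/200:HIER_DIRECT '
          '"GET http://example.com/ HTTP/1.1" text/html 200 5120 134 '
          '"curl/8.0.1" "-"')

m = LOG_RE.match(sample)
print(m.group("client"), m.group("code"), m.group("url"))
```

Handy when tuning the refresh_pattern rules above: group lines by %Ss (TCP_MISS vs. TCP_HIT) to see what is actually being cached.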

PowerShell Apache log filter

Select-String "404" .\localhost_access_log.2024-08-07.txt | % {($_.Line.split('HTTP'))[1]} | Sort-Object | Get-Unique

gc '.\localhost_access_log.2024-08-07.txt' -Wait

Get-Process | sort {[string]$_.ID}
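The Select-String one-liner translates naturally to Python when PowerShell isn't available; a sketch mirroring the same split-on-'HTTP' trick, demoed on synthetic log lines rather than the real file:

```python
# Python mirror of: Select-String "404" | split on 'HTTP' | Sort-Object | Get-Unique
def unique_404_requests(lines):
    """Return sorted unique text after the 'HTTP' marker on lines containing 404."""
    seen = set()
    for line in lines:
        if "404" in line and "HTTP" in line:
            seen.add(line.split("HTTP", 1)[1])
    return sorted(seen)

# Synthetic sample lines; in practice pass an open file object instead
sample = [
    '127.0.0.1 - - [07/Aug/2024:10:00:00] "GET /missing HTTP/1.1" 404 196',
    '127.0.0.1 - - [07/Aug/2024:10:00:01] "GET /index.html HTTP/1.1" 200 512',
    '127.0.0.1 - - [07/Aug/2024:10:00:02] "GET /missing HTTP/1.1" 404 196',
]
print(unique_404_requests(sample))  # duplicate 404 lines collapse to one entry
```

Like the original one-liner, this also matches a 404 appearing anywhere on the line (e.g. in a byte count); filter on the status-code field explicitly if that matters.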