Static files referenced by HTML pages (images, JavaScript, CSS, etc.) can be cached by the browser by setting cache-related attributes in the HTTP response headers.
Two main types of cache headers, cache-control and expires, define the caching characteristics for your resources. Typically, cache-control is considered a more modern and flexible approach than expires, but both headers can be used simultaneously.
Cache headers are applied to resources at the server level – for example, in the .htaccess file on an Apache server, used by nearly half of all active websites – to set their caching characteristics. Caching is enabled by identifying a resource or type of resource, such as images or CSS files, and then specifying headers for the resource(s) with the desired caching options.
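On Apache, for instance, a sketch using mod_expires in .htaccess might look like this (the MIME types and lifetimes below are illustrative, not a recommendation):

```apacheconf
# Requires mod_expires to be enabled
<IfModule mod_expires.c>
  ExpiresActive On
  # Cache images for a year, CSS/JS for a month (illustrative values)
  ExpiresByType image/jpeg "access plus 1 year"
  ExpiresByType image/png  "access plus 1 year"
  ExpiresByType text/css   "access plus 1 month"
  ExpiresByType application/javascript "access plus 1 month"
</IfModule>
```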
| Stop using (HTTP 1.0) | Replaced with (HTTP 1.1, since 1999) |
| --- | --- |
| Expires: [date] | Cache-Control: max-age=[seconds] |
| Pragma: no-cache | Cache-Control: no-cache |
Setting HTTP cache in Spring framework
Setting HTTP response cache-control header in Spring framework
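Spring provides a builder for this header, e.g. `ResponseEntity.ok().cacheControl(CacheControl.maxAge(1, TimeUnit.DAYS).cachePublic()).body(resource)`. As a dependency-free sketch of the header value such a builder produces (the `CacheHeader` class and `maxAge` method here are illustrative names, not a Spring API):

```java
import java.util.concurrent.TimeUnit;

// Illustrative helper mirroring the value that a Cache-Control builder emits
public class CacheHeader {
    static String maxAge(long amount, TimeUnit unit) {
        return "max-age=" + unit.toSeconds(amount);
    }

    public static void main(String[] args) {
        // A response carrying this header is cacheable for one day
        System.out.println("Cache-Control: " + maxAge(1, TimeUnit.DAYS));
    }
}
```

Spring's `CacheControl.maxAge(1, TimeUnit.DAYS).getHeaderValue()` should produce the same `max-age=86400` string.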
Set the HTTP Cache-Control header only for responses served directly by Nginx, not for responses coming from a proxy_pass upstream. In other words, serve static files from the server's file system path. For example:
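A sketch of such a location block (the paths and lifetime are illustrative):

```nginx
# Files are served directly by Nginx from disk, so Nginx applies the cache headers
location /static/ {
    root /var/www/myapp;   # illustrative path
    # `expires` emits both an Expires header and Cache-Control: max-age
    expires 30d;
    add_header Cache-Control "public";
}
```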
Note that no-cache does not mean “don’t cache”. no-cache allows caches to store a response but requires them to revalidate it before reuse. If the sense of “don’t cache” that you want is actually “don’t store”, then no-store is the directive to use.
You can edit HTML on the fly and preview the changes by selecting any element, choosing a DOM element within the panel, and double clicking on the opening tag to edit it.
Edit CSS property
You can also change CSS in Chrome DevTools and preview the result. This is probably one of the most common uses of the tool. Simply select the element you want to edit, and under the Styles panel you can add or change any CSS property.
Change color format
You can toggle between RGBA, HSL, and hexadecimal formatting by pressing Shift + Click on the color block.
Console
Design Mode
You can freely make edits to the page as if it were a document.
Open design mode: document.designMode = "on"
Monitoring events on-page elements
monitorEvents($0, 'mouse')
Sources
Pretty print
You can easily change the formatting of your minimized code by clicking on {}.
Multiple cursors
You can easily add multiple cursors by pressing Cmd + Click (Ctrl + Click) and entering information on multiple lines at the same time.
Search source code
You can quickly search all of your source code by pressing Cmd + Opt + F (Ctrl + Shift + F).
Network
Access a Web Page Without Cache
Checked “Disable cache”
Others
Dock Position
You can also change the Chrome DevTools dock position. You can either undock into a separate window, or dock it on the left, bottom, or right side of the browser. The dock position can be changed by pressing Cmd + Shift + D (Ctrl + Shift + D) or through the menu.
Spring Boot uses a very particular PropertySource order that is designed to allow sensible overriding of values. Properties are considered in the following order (with values from lower items overriding earlier ones):
Default properties (specified by setting SpringApplication.setDefaultProperties).
@PropertySource annotations on your @Configuration classes. Please note that such property sources are not added to the Environment until the application context is being refreshed. This is too late to configure certain properties such as logging.* and spring.main.* which are read before refresh begins.
Config data (such as application.properties files).
Application properties packaged inside your jar (application.properties and YAML variants).
If you add new OS environment variables on Windows, you must restart your processes (the Java process, IntelliJ IDEA) so that they pick up the new values.
For any other Windows executable, system-level changes to the environment variables are only propagated to the process when it is restarted.
Add User variables or System variables on Linux or Windows
msg=hello
Read by System Class
System.getenv("msg")
Read by Environment object
@Autowired private Environment environment;
environment.getProperty("msg")
Injecting environment variables
@Value("${msg}") private String msg;
Setting application.properties values from environment
msg=${msg}
JSON Application Properties
Environment variables and system properties often have restrictions that mean some property names cannot be used. To help with this, Spring Boot allows you to encode a block of properties into a single JSON structure.
When your application starts, any spring.application.json or SPRING_APPLICATION_JSON properties will be parsed and added to the Environment.
For example, the SPRING_APPLICATION_JSON property can be supplied on the command line in a UN*X shell as an environment variable:
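The example from the Spring Boot reference can be sketched as (myapp.jar is a placeholder for your application jar):

```shell
SPRING_APPLICATION_JSON='{"my":{"name":"test"}}' java -jar myapp.jar
```

This makes my.name=test available in the Spring Environment.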
If you are deploying to a classic Application Server, you could also use a JNDI variable named java:comp/env/spring.application.json.
Accessing Command Line Properties
By default, SpringApplication converts any command line option arguments (that is, arguments starting with --, such as --server.port=9000) to a property and adds them to the Spring Environment. As mentioned previously, command line properties always take precedence over file-based property sources.
If you do not want command line properties to be added to the Environment, you can disable them by using SpringApplication.setAddCommandLineProperties(false).
Warning: to convert an object to a string while keeping null as null, use Objects.toString(object, null) or object != null ? object.toString() : null, but not String.valueOf() or toString(). The result of String.valueOf(null) is the string "null", not null, and if the object is null, calling toString() on it throws a NullPointerException.
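A minimal sketch of the difference, using only the standard library:

```java
import java.util.Objects;

public class NullToString {
    public static void main(String[] args) {
        Object obj = null;
        // String.valueOf on a null reference produces the 4-character string "null"
        System.out.println(String.valueOf(obj));
        // Objects.toString with a null default keeps null as null
        System.out.println(Objects.toString(obj, null));
        // obj.toString() here would throw a NullPointerException
    }
}
```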
Collect the matching objects into a separate collection, then call removeAll(). removeAll() compacts the surviving elements in place in a single pass rather than reorganizing the collection; the extra space is the collected list of objects to remove. (Collect into a Set so that each contains() check is O(1).)
T(n) = O(n), S(n) = O(n)
ISBN isbn = new ISBN("0-201-63361-2");
List<Book> found = new ArrayList<>();
for (Book book : books) {
    if (book.getIsbn().equals(isbn)) {
        found.add(book);
    }
}
books.removeAll(found);
1.2 Collect indexes and remove one by one
T(n) = O(m * n), S(n) = O(m), where m is the number of removed elements
1.3 Collect objects set and remove one by one
T(n) = O(m * n), S(n) = O(m)
Using iterator to remove in loop
An iterator uses a cursor variable to traverse the collection and removes by the cursor's current index, so if you remove an element the cursor is updated correctly. Iterating is like forEach, but the index does not simply run from 0 to size-1 of the original collection. Each remove on an ArrayList shifts the remaining elements within the same backing array (via System.arraycopy); no new collection is created.
T(n) = O(m * n), S(n) = O(1)
ListIterator<Book> iter = books.listIterator();
while (iter.hasNext()) {
    if (iter.next().getIsbn().equals(isbn)) {
        iter.remove();
    }
}
removeIf() method (JDK 8)
removeIf() shifts the surviving elements forward in a single pass, sets the new size for the collection, and nulls out the slots between the new size and the old size.
T(n) = O(n), S(n) = O(1)
ISBN other = new ISBN("0-201-63361-2");
books.removeIf(b -> b.getIsbn().equals(other));
Using filter of Stream API (JDK 8)
Creates a new collection and leaves the source list unmodified. (A parallel stream may traverse in no particular order, but collecting still preserves encounter order for a List source.)
T(n) = O(n), S(n) = O(n) guess by “A stream does not store data and, in that sense, is not a data structure. It also never modifies the underlying data source.”
ISBN other = new ISBN("0-201-63361-2");
// keep only the books whose ISBN does not match, i.e. "remove" the matching ones
List<Book> filtered = books.stream()
    .filter(b -> !b.getIsbn().equals(other))
    .collect(Collectors.toList());
Recommendation (fastest first): removeIf() > stream().filter() or parallelStream() > collecting objects into a Set and removeAll() > using an iterator, collecting indexes and removing one by one, or collecting objects and removing one by one.
Deduplication
Deduplicate values
Deduplicate values by stream distinct()
List<Integer> list = Arrays.asList(1, 2, 3, 2, 3, 4);
list = list.stream().distinct().collect(Collectors.toList());
System.out.println(list); // [1, 2, 3, 4]
Deduplicate values by creating a set
List<Integer> list = new ArrayList<>(Arrays.asList(1, 2, 3, 2, 3, 4));
Set<Integer> set = new LinkedHashSet<>(list);
list.clear(); // note: calling clear() directly on the list returned by Arrays.asList would throw UnsupportedOperationException
list.addAll(set);
System.out.println(list); // [1, 2, 3, 4]
List<Integer> list = Arrays.asList(1, 2, 3, 2, 3, 4);
list = new ArrayList<>(new LinkedHashSet<>(list));
System.out.println(list); // [1, 2, 3, 4]
Deduplicate objects by property
Deduplicate by stream
// Collectors.toMap(...).values() returns a Collection, so wrap it to get a List
List<User> deduped = new ArrayList<>(userList.stream()
    .collect(Collectors.toMap(User::getName,
        Function.identity(),
        (p, q) -> p,
        LinkedHashMap::new))
    .values());
Deduplicate objects by removing in for loop
List<User> userList = buildUserList();
System.out.println("Before: " + userList);
Iterator<User> i = userList.iterator();
while (i.hasNext()) {
    User user = i.next();
    if (user.getUserName().contains("test")) {
        i.remove();
    }
}
System.out.println("After: " + userList);
Deduplicate objects by finding duplicate objects and then removing all of it
List<User> userList = buildUserList();
System.out.println("Before: " + userList);
List<User> toRemoveList = new ArrayList<>();
for (User user : userList) {
    if (user.getUserName().contains("test")) {
        toRemoveList.add(user);
    }
}
userList.removeAll(toRemoveList);
System.out.println("After: " + userList);
Only one consecutive repeated element is retained
List<IdName> list = new ArrayList<>();
list.add(new IdName(1, "a"));
list.add(new IdName(2, "a"));
list.add(new IdName(3, "a"));
list.add(new IdName(4, "b"));
list.add(new IdName(5, "b"));
list.add(new IdName(6, "c"));
List<Integer> indexToRemove = new ArrayList<>();
for (int i = 0; i < list.size(); i++) {
    if (i < list.size() - 1 && list.get(i).getName().equals(list.get(i + 1).getName())) {
        indexToRemove.add(i);
    }
}
for (int i = indexToRemove.size() - 1; i >= 0; i--) {
    list.remove(indexToRemove.get(i).intValue());
}
System.out.println(list);
Summary: if you don't need the collection to stay sorted at all times, just use Collections.sort() to sort it when needed.
Compare object list using Collections.sort(objectList)
public class Animal implements Comparable<Animal> {
    private String name;

    @Override
    public int compareTo(Animal o) {
        return this.name.compareTo(o.name);
    }
}
List<Animal> list = new ArrayList<>();
Collections.sort(list);
Collections.sort(list, Collections.reverseOrder());
// Partition students into passing and failing
Map<Boolean, List<Student>> passingFailing = students.stream()
    .collect(Collectors.partitioningBy(s -> s.getGrade() >= PASS_THRESHOLD));
Aggregation
maxBy()
minBy()
averagingInt()
summingInt()
counting()
// Compute sum of salaries by department
Map<Department, Integer> totalByDept = employees.stream()
    .collect(Collectors.groupingBy(Employee::getDepartment,
        Collectors.summingInt(Employee::getSalary)));
// keep keys sorted when grouping, using `TreeMap::new` or `() -> new TreeMap<>()`
Map<String, Double> averageAgeByType = userList.stream()
    .collect(Collectors.groupingBy(User::getType,
        TreeMap::new,
        Collectors.averagingInt(User::getAge)));
// sort the list before grouping and keep insertion order with LinkedHashMap
Map<String, Double> userAverageAgeMap2 = userList.stream()
    .sorted(Comparator.comparing(User::getType))
    .collect(Collectors.groupingBy(User::getType,
        LinkedHashMap::new,
        Collectors.averagingInt(User::getAge)));
When I execute a SQL script from a dump of a database's structure and data, an error occurs: "The user specified as a definer 'xxx'@'%' does not exist".
Error Info
SQL Error (1449): The user specified as a definer ('xxx'@'%') does not exist
Solutions
This commonly occurs when exporting views/triggers/procedures from one database or server to another as the user that created that object no longer exists.
For example, the following is a trigger create statement:
CREATE DEFINER=`not_exist_user` TRIGGER your_trigger
BEFORE INSERT ON your_table
FOR EACH ROW
SET new.create_time = NOW();
Solution 1: Change the DEFINER
This is easiest to do when initially importing your database objects, by removing any DEFINER=some_user clauses from the dump.
Changing the definer later is a little more tricky. You can search for solutions to "How to change the definer for views/triggers/procedures".
Solution 2: Create the missing user
If you've found the following error while using a MySQL database:
The user specified as a definer ('some_user'@'%') does not exist
Then you can solve it by using the following:
CREATE USER 'some_user'@'%' IDENTIFIED BY 'complex-password';
GRANT ALL ON *.* TO 'some_user'@'%' IDENTIFIED BY 'complex-password';
/* or: GRANT ALL ON *.* TO 'some_user'@'%'; */
FLUSH PRIVILEGES;
Reasons
My exported trigger has a definer user that does not exist.
When you insert data into the table that the trigger is attached to, MySQL raises the error "The user specified as a definer xxx does not exist".
The default maximum body size of a client request (i.e., the maximum upload size) that Nginx allows is 1M. So when you try to upload something larger than 1M, you get the following error: 413 Request Entity Too Large.
When over the max upload file size
When uploading a file over max size, Nginx returns
status code: 413 Request Entity Too Large
Content-Type: text/html
response body:
<html>
<head><title>413 Request Entity Too Large</title></head>
<body>
<center><h1>413 Request Entity Too Large</h1></center>
<hr><center>nginx/1.18.0</center>
</body>
</html>
Solutions
Add the following setting to your Nginx configuration file nginx.conf:
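A sketch of the relevant directive (the 100M limit is illustrative; the directive may also be set at server or location level):

```nginx
http {
    # Raise the request-body limit from the 1m default, e.g. to 100 MB
    client_max_body_size 100M;
}
```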
Note that max-file-size and max-request-size below are Spring Boot multipart properties, not Nginx directives: max-file-size specifies the maximum size permitted for uploaded files (the default is 1MB), and max-request-size specifies the maximum size allowed for multipart/form-data requests (the default is 10MB).
When over the max upload file size
The Java web project will throw an IllegalStateException:
- UT005023: Exception handling request to /file/uploadFile
java.lang.IllegalStateException: io.undertow.server.handlers.form.MultiPartParserDefinition$FileTooLargeException: UT000054: The maximum size 1048576 for an individual file in a multipart request was exceeded
    at io.undertow.servlet.spec.HttpServletRequestImpl.parseFormData(HttpServletRequestImpl.java:847)
    at io.undertow.servlet.spec.HttpServletRequestImpl.getParameter(HttpServletRequestImpl.java:722)
    at org.springframework.web.filter.HiddenHttpMethodFilter.doFilterInternal(HiddenHttpMethodFilter.java:85)
    ...
Solutions
Add the following settings to your Spring Boot configuration file application.yml:
spring:
  servlet:
    multipart:
      # max single file size
      max-file-size: 100MB
      # max request size
      max-request-size: 200MB
When calling the backend API, the response status code is 500, but the backend did not throw any exception. The HTTP response message is "Proxy error: Could not proxy request".
Error Info
Proxy error: Could not proxy request /captchaImage from localhost:8070 to http://10.0.0.74:8090 (ECONNREFUSED).
Solutions
Make sure the config devServer.proxy.target is correct.
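For example, with a Vue CLI / webpack-dev-server project the proxy target lives in vue.config.js; a sketch (the path and target below are illustrative):

```javascript
// vue.config.js (illustrative)
module.exports = {
  devServer: {
    port: 8070,
    proxy: {
      '/captchaImage': {
        target: 'http://10.0.0.74:8090', // must point at a reachable backend
        changeOrigin: true
      }
    }
  }
}
```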
JUnit 5 leverages features from Java 8 or later, such as lambda functions, making tests more powerful and easier to maintain.
JUnit 5 has added some very useful new features for describing, organizing, and executing tests. For instance, tests get better display names and can be organized hierarchically.
JUnit 5 is organized into multiple libraries, so only the features you need are imported into your project. With build systems such as Maven and Gradle, including the right libraries is easy.
JUnit 5 can use more than one extension at a time, which JUnit 4 could not (only one runner could be used at a time). This means you can easily combine the Spring extension with other extensions (such as your own custom extension).
JUnit 5 assertions are now in org.junit.jupiter.api.Assertions. Most of the common assertions, such as assertEquals() and assertNotNull(), look the same as before, but there are a few differences:
The error message is now the last argument, for example: assertEquals("my message", 1, 2) is now assertEquals(1, 2, "my message").
Most assertions now accept a lambda that constructs the error message, which is called only when the assertion fails.
assertTimeout() and assertTimeoutPreemptively() have replaced the timeout attribute of @Test in JUnit 4 (JUnit 5 also has an @Timeout annotation, which works differently than the JUnit 4 mechanism).
There are several new assertions, described below.
Note that you can continue to use assertions from JUnit 4 in a JUnit 5 test if you prefer.
Assumptions
Executes the supplied Executable, but only if the supplied assumption is valid.
JUnit 4
assumeThat("alwaysPasses", 1, is(1)); // passes
foo(); // will execute
assumeThat("alwaysFails", 0, is(1)); // assumption failure! test halts
int x = 1 / 0; // will never execute
// SpringRunner is an alias for the SpringJUnit4ClassRunner
@RunWith(SpringRunner.class)
//@RunWith(SpringJUnit4ClassRunner.class)
public class MyControllerTest {
    // ...
}
To convert an existing JUnit 4 test to JUnit 5, use the following steps, which should work for most tests:
Update imports to remove JUnit 4 and add JUnit 5. For instance, update the package name for the @Test annotation, and update both the package and class name for assertions (from Assert to Assertions). Don't worry yet if there are compilation errors, because completing the following steps should resolve them.
Globally replace old annotations and class names with new ones. For example, replace all @Before with @BeforeEach and all Assert with Assertions.
Update assertions; any assertions that provide a message need to have the message argument moved to the end (pay special attention when all three arguments are strings!). Also, update timeouts and expected exceptions (see above for examples).
Update assumptions if you are using them.
Replace any instances of @RunWith, @Rule, or @ClassRule with the appropriate @ExtendWith annotations. You may need to find updated documentation online for the extensions you’re using for examples.
New Features
Display Names
You can add the @DisplayName annotation to classes and methods. The name is used when generating reports, which makes it easier to describe the purpose of tests and track down failures.
assertAll() groups multiple assertions together. Asserts that all supplied executables do not throw exceptions. The added benefit is that all assertions are performed, even if individual assertions fail.
void assertAll(Executable... executables)
assertThrows() and assertDoesNotThrow() have replaced the expected property in the @Test annotation.
<T extends Throwable> T assertThrows(Class<T> expectedType, Executable executable)
void assertDoesNotThrow(Executable executable)
Nested tests
Test suites in JUnit 4 were useful, but nested tests in JUnit 5 are easier to set up and maintain, and they better describe the relationships between test groups.
Parameterized tests
Test parameterization existed in JUnit 4, with the built-in Parameterized runner or third-party libraries such as JUnitParams. In JUnit 5, parameterized tests are completely built in and adopt some of the best features from Parameterized and JUnitParams.
JUnit 5 provides the ExecutionCondition extension API to enable or disable a test or container (test class) conditionally. This is like using @Disabled on a test but it can define custom conditions. There are multiple built-in conditions, such as these:
@EnabledOnOs and @DisabledOnOs: Enables or disables a test only on specified operating systems
@EnabledOnJre and @DisabledOnJre: Specifies the test should be enabled or disabled for particular versions of Java
@EnabledIfSystemProperty: Enables a test based on the value of a JVM system property
@EnabledIf: Uses scripted logic to enable a test if scripted conditions are met
Test templates
Test templates are not regular tests; they define a set of steps to perform, which can then be executed elsewhere using a specific invocation context. This means that you can define a test template once, and then build a list of invocation contexts at runtime to run that test with. For details and examples, see the documentation.
Dynamic tests
Dynamic tests are like test templates; the tests to run are generated at runtime. However, while test templates are defined with a specific set of steps and run multiple times, dynamic tests use the same invocation context but can execute different logic. One use for dynamic tests would be to stream a list of abstract objects and perform a separate set of assertions for each based on their concrete types. There are good examples in the documentation.
Spring Boot Test With JUnit
Spring Boot Test With JUnit 4
<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-test</artifactId>
    <scope>test</scope>
</dependency>
<!-- Starting with Spring Boot 2.4, JUnit 5's vintage engine has been removed from
     spring-boot-starter-test. If we still want to write tests using JUnit 4, we need
     to add the following Maven dependency. -->
<dependency>
    <groupId>org.junit.vintage</groupId>
    <artifactId>junit-vintage-engine</artifactId>
    <scope>test</scope>
    <exclusions>
        <exclusion>
            <groupId>org.hamcrest</groupId>
            <artifactId>hamcrest-core</artifactId>
        </exclusion>
    </exclusions>
</dependency>
Although you probably won’t need to convert your old JUnit 4 tests to JUnit 5 unless you want to use new JUnit 5 features, there are compelling reasons to switch to JUnit 5.
Given two integers dividend and divisor, divide two integers without using multiplication, division, and mod operator.
The integer division should truncate toward zero, which means losing its fractional part. For example, 8.345 would be truncated to 8, and -2.7335 would be truncated to -2.
Return the quotient after dividing dividend by divisor.
Note: Assume we are dealing with an environment that could only store integers within the 32-bit signed integer range: [−2^31, 2^31 − 1]. For this problem, if the quotient is strictly greater than 2^31 - 1, then return 2^31 - 1, and if the quotient is strictly less than -2^31, then return -2^31.
Example 1:
Input: dividend = 10, divisor = 3 Output: 3 Explanation: 10/3 = 3.33333.. which is truncated to 3.
Example 2:
Input: dividend = 7, divisor = -3 Output: -2 Explanation: 7/-3 = -2.33333.. which is truncated to -2.
Constraints:
-2^31 <= dividend, divisor <= 2^31 - 1
divisor != 0
Related Topics
Math
Bit Manipulation
Analysis
set quotient = 0

when divisor * 2^n <= dividend < divisor * 2^(n+1), n ∈ N:
    quotient = quotient + 2^n
    dividend = dividend - divisor * 2^n

when divisor <= dividend < divisor * 2:
    quotient = quotient + 1
    dividend = dividend - divisor

when dividend < divisor:
    return quotient
Solution
public int divide(int dividend, int divisor) {
    // Corner case: -2^31 divided by -1 gives 2^31, which does not fit in an int
    // (overflow), so return Integer.MAX_VALUE
    if (dividend == Integer.MIN_VALUE && divisor == -1) return Integer.MAX_VALUE;

    // Logical XOR: the result is negative only if exactly one operand is negative
    boolean negative = dividend < 0 ^ divisor < 0;
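The doubling scheme from the Analysis section can be sketched as a complete, self-contained method (a sketch using long intermediates and bit shifts, so no multiplication, division, or mod is needed):

```java
public class Divide {
    // Repeatedly subtract the largest divisor * 2^n that fits into the remainder
    public static int divide(int dividend, int divisor) {
        // Corner case: -2^31 / -1 overflows int, so clamp to Integer.MAX_VALUE
        if (dividend == Integer.MIN_VALUE && divisor == -1) return Integer.MAX_VALUE;
        // Result is negative only if exactly one operand is negative
        boolean negative = (dividend < 0) ^ (divisor < 0);
        // Work with non-negative longs to avoid overflow on Integer.MIN_VALUE
        long a = Math.abs((long) dividend);
        long b = Math.abs((long) divisor);
        long quotient = 0;
        while (a >= b) {
            long d = b, multiple = 1;
            // Double d until the next doubling would exceed the remainder
            while (a >= (d << 1)) {
                d <<= 1;
                multiple <<= 1;
            }
            a -= d;
            quotient += multiple;
        }
        return (int) (negative ? -quotient : quotient);
    }

    public static void main(String[] args) {
        System.out.println(divide(10, 3)); // 3
        System.out.println(divide(7, -3)); // -2
    }
}
```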